Apr 21 10:03:42.039445 ip-10-0-140-231 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 10:03:42.488536 ip-10-0-140-231 kubenswrapper[2567]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 10:03:42.488936 ip-10-0-140-231 kubenswrapper[2567]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 10:03:42.488936 ip-10-0-140-231 kubenswrapper[2567]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 10:03:42.488936 ip-10-0-140-231 kubenswrapper[2567]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 10:03:42.488936 ip-10-0-140-231 kubenswrapper[2567]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 10:03:42.489481 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.489412 2567 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 10:03:42.493286 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493272 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 10:03:42.493325 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493287 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 10:03:42.493325 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493292 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 10:03:42.493325 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493296 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 10:03:42.493325 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493299 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 10:03:42.493325 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493302 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 10:03:42.493325 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493305 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 10:03:42.493325 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493308 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 10:03:42.493325 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493310 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 10:03:42.493325 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493313 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 10:03:42.493325 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493315 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 10:03:42.493325 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493318 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 10:03:42.493325 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493322 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 10:03:42.493325 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493324 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 10:03:42.493325 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493327 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 10:03:42.493325 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493331 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 10:03:42.493686 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493335 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 10:03:42.493686 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493338 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 10:03:42.493686 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493340 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 10:03:42.493686 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493343 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 10:03:42.493686 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493346 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 10:03:42.493686 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493348 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 10:03:42.493686 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493352 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 10:03:42.493686 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493354 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 10:03:42.493686 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493357 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 10:03:42.493686 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493359 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 10:03:42.493686 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493362 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 10:03:42.493686 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493367 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 10:03:42.493686 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493370 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 10:03:42.493686 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493373 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 10:03:42.493686 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493375 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 10:03:42.493686 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493378 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 10:03:42.493686 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493380 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 10:03:42.493686 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493383 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 10:03:42.493686 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493386 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 10:03:42.493686 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493389 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 10:03:42.494204 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493391 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 10:03:42.494204 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493394 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 10:03:42.494204 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493396 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 10:03:42.494204 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493399 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 10:03:42.494204 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493401 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 10:03:42.494204 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493404 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 10:03:42.494204 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493406 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 10:03:42.494204 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493409 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 10:03:42.494204 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493411 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 10:03:42.494204 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493413 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 10:03:42.494204 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493416 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 10:03:42.494204 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493418 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 10:03:42.494204 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493421 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 10:03:42.494204 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493424 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 10:03:42.494204 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493427 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 10:03:42.494204 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493431 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 10:03:42.494204 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493434 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 10:03:42.494204 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493436 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 10:03:42.494204 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493439 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 10:03:42.494204 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493442 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 10:03:42.494694 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493444 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 10:03:42.494694 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493447 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 10:03:42.494694 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493449 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 10:03:42.494694 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493452 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 10:03:42.494694 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493454 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 10:03:42.494694 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493457 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 10:03:42.494694 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493459 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 10:03:42.494694 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493462 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 10:03:42.494694 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493464 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 10:03:42.494694 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493467 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 10:03:42.494694 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493469 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 10:03:42.494694 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493472 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 10:03:42.494694 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493474 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 10:03:42.494694 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493477 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 10:03:42.494694 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493479 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 10:03:42.494694 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493482 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 10:03:42.494694 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493484 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 10:03:42.494694 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493487 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 10:03:42.494694 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493489 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 10:03:42.495188 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493500 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 10:03:42.495188 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493503 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 10:03:42.495188 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493505 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 10:03:42.495188 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493507 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 10:03:42.495188 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493510 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 10:03:42.495188 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493513 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 10:03:42.495188 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493516 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 10:03:42.495188 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493519 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 10:03:42.495188 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493521 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 10:03:42.495188 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493524 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 10:03:42.495188 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493527 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 10:03:42.495188 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493945 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 10:03:42.495188 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493950 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 10:03:42.495188 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493953 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 10:03:42.495188 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493956 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 10:03:42.495188 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493961 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 10:03:42.495188 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493965 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 10:03:42.495188 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493969 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 10:03:42.495188 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493972 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 10:03:42.495658 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493975 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 10:03:42.495658 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493978 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 10:03:42.495658 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493981 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 10:03:42.495658 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493983 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 10:03:42.495658 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493986 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 10:03:42.495658 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493989 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 10:03:42.495658 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493992 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 10:03:42.495658 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493994 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 10:03:42.495658 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493996 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 10:03:42.495658 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.493999 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 10:03:42.495658 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494001 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 10:03:42.495658 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494004 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 10:03:42.495658 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494006 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 10:03:42.495658 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494015 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 10:03:42.495658 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494017 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 10:03:42.495658 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494020 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 10:03:42.495658 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494023 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 10:03:42.495658 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494026 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 10:03:42.495658 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494028 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 10:03:42.495658 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494031 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 10:03:42.496179 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494034 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 10:03:42.496179 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494037 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 10:03:42.496179 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494039 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 10:03:42.496179 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494041 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 10:03:42.496179 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494044 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 10:03:42.496179 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494046 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 10:03:42.496179 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494049 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 10:03:42.496179 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494052 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 10:03:42.496179 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494055 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 10:03:42.496179 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494058 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 10:03:42.496179 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494060 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 10:03:42.496179 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494062 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 10:03:42.496179 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494065 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 10:03:42.496179 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494067 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 10:03:42.496179 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494070 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 10:03:42.496179 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494072 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 10:03:42.496179 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494075 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 10:03:42.496179 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494077 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 10:03:42.496179 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494080 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 10:03:42.496179 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494082 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 10:03:42.496678 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494085 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 10:03:42.496678 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494087 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 10:03:42.496678 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494089 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 10:03:42.496678 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494092 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 10:03:42.496678 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494094 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 10:03:42.496678 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494097 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 10:03:42.496678 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494105 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 10:03:42.496678 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494107 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 10:03:42.496678 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494110 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 10:03:42.496678 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494112 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 10:03:42.496678 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494116 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 10:03:42.496678 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494118 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 10:03:42.496678 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494121 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 10:03:42.496678 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494124 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 10:03:42.496678 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494126 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 10:03:42.496678 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494129 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 10:03:42.496678 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494132 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 10:03:42.496678 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494135 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 10:03:42.496678 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494137 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 10:03:42.496678 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494140 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 10:03:42.497175 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494142 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 10:03:42.497175 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494145 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 10:03:42.497175 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494148 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 10:03:42.497175 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494150 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 10:03:42.497175 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494153 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 10:03:42.497175 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494155 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 10:03:42.497175 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494157 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 10:03:42.497175 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494160 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 10:03:42.497175 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494162 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 10:03:42.497175 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494188 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 10:03:42.497175 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494191 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 10:03:42.497175 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494193 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 10:03:42.497175 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494196 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 10:03:42.497175 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494199 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 10:03:42.497175 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494202 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 10:03:42.497175 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494205 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 10:03:42.497175 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494207 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 10:03:42.497175 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494210 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 10:03:42.497175 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494277 2567 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 10:03:42.497175 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494296 2567 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 10:03:42.497668 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494309 2567 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 10:03:42.497668 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494314 2567 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 10:03:42.497668 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494318 2567 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 10:03:42.497668 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494322 2567 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 10:03:42.497668 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494326 2567 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 10:03:42.497668 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494330 2567 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 10:03:42.497668 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494333 2567 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 10:03:42.497668 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494336 2567 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 10:03:42.497668 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494340 2567 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 10:03:42.497668 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494343 2567 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 10:03:42.497668 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494346 2567 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 10:03:42.497668 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494349 2567 flags.go:64] FLAG: --cgroup-root=""
Apr 21 10:03:42.497668 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494352 2567 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 10:03:42.497668 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494355 2567 flags.go:64] FLAG: --client-ca-file=""
Apr 21 10:03:42.497668 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494357 2567 flags.go:64] FLAG: --cloud-config=""
Apr 21 10:03:42.497668 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494360 2567 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 10:03:42.497668 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494363 2567 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 10:03:42.497668 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494370 2567 flags.go:64] FLAG: --cluster-domain=""
Apr 21 10:03:42.497668 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494373 2567 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 10:03:42.497668 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494376 2567 flags.go:64] FLAG: --config-dir=""
Apr 21 10:03:42.497668 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494378 2567 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 10:03:42.497668 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494382 2567 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 10:03:42.497668 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494386 2567 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 10:03:42.497668 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494388 2567 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 10:03:42.498280 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494391 2567 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 10:03:42.498280 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494395 2567 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 10:03:42.498280 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494398 2567 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 10:03:42.498280 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494401 2567 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 10:03:42.498280 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494404 2567 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 10:03:42.498280 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494406 2567 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 10:03:42.498280 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494409 2567 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 10:03:42.498280 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494414 2567 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 10:03:42.498280 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494422 2567 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 10:03:42.498280 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494426 2567 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 10:03:42.498280 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494432 2567 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 10:03:42.498280 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494435 2567 flags.go:64] FLAG: --enable-server="true"
Apr 21 10:03:42.498280 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494438 2567 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr
21 10:03:42.498280 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494457 2567 flags.go:64] FLAG: --event-burst="100" Apr 21 10:03:42.498280 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494461 2567 flags.go:64] FLAG: --event-qps="50" Apr 21 10:03:42.498280 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494465 2567 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 21 10:03:42.498280 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494468 2567 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 21 10:03:42.498280 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494471 2567 flags.go:64] FLAG: --eviction-hard="" Apr 21 10:03:42.498280 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494474 2567 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 21 10:03:42.498280 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494477 2567 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 21 10:03:42.498280 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494480 2567 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 21 10:03:42.498280 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494483 2567 flags.go:64] FLAG: --eviction-soft="" Apr 21 10:03:42.498280 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494487 2567 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 21 10:03:42.498280 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494489 2567 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 21 10:03:42.498280 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494492 2567 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 21 10:03:42.498860 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494495 2567 flags.go:64] FLAG: --experimental-mounter-path="" Apr 21 10:03:42.498860 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494498 2567 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 21 10:03:42.498860 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494501 
2567 flags.go:64] FLAG: --fail-swap-on="true" Apr 21 10:03:42.498860 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494504 2567 flags.go:64] FLAG: --feature-gates="" Apr 21 10:03:42.498860 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494507 2567 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 10:03:42.498860 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494510 2567 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 10:03:42.498860 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494513 2567 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 10:03:42.498860 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494516 2567 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 10:03:42.498860 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494519 2567 flags.go:64] FLAG: --healthz-port="10248" Apr 21 10:03:42.498860 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494522 2567 flags.go:64] FLAG: --help="false" Apr 21 10:03:42.498860 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494525 2567 flags.go:64] FLAG: --hostname-override="ip-10-0-140-231.ec2.internal" Apr 21 10:03:42.498860 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494528 2567 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 10:03:42.498860 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494531 2567 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 10:03:42.498860 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494533 2567 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 10:03:42.498860 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494537 2567 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 10:03:42.498860 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494540 2567 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 10:03:42.498860 ip-10-0-140-231 
kubenswrapper[2567]: I0421 10:03:42.494548 2567 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 10:03:42.498860 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494553 2567 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 10:03:42.498860 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494556 2567 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 10:03:42.498860 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494559 2567 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 10:03:42.498860 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494561 2567 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 10:03:42.498860 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494564 2567 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 10:03:42.498860 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494567 2567 flags.go:64] FLAG: --kube-reserved="" Apr 21 10:03:42.498860 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494570 2567 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 10:03:42.499456 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494573 2567 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 10:03:42.499456 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494577 2567 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 10:03:42.499456 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494582 2567 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 10:03:42.499456 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494587 2567 flags.go:64] FLAG: --lock-file="" Apr 21 10:03:42.499456 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494592 2567 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 10:03:42.499456 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494596 2567 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 10:03:42.499456 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494600 2567 flags.go:64] FLAG: 
--log-json-info-buffer-size="0" Apr 21 10:03:42.499456 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494609 2567 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 10:03:42.499456 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494613 2567 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 10:03:42.499456 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494617 2567 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 10:03:42.499456 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494620 2567 flags.go:64] FLAG: --logging-format="text" Apr 21 10:03:42.499456 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494622 2567 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 10:03:42.499456 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494626 2567 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 10:03:42.499456 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494628 2567 flags.go:64] FLAG: --manifest-url="" Apr 21 10:03:42.499456 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494631 2567 flags.go:64] FLAG: --manifest-url-header="" Apr 21 10:03:42.499456 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494636 2567 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 10:03:42.499456 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494639 2567 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 10:03:42.499456 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494643 2567 flags.go:64] FLAG: --max-pods="110" Apr 21 10:03:42.499456 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494646 2567 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 10:03:42.499456 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494649 2567 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 10:03:42.499456 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494652 2567 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 10:03:42.499456 ip-10-0-140-231 
kubenswrapper[2567]: I0421 10:03:42.494655 2567 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 10:03:42.499456 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494658 2567 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 10:03:42.499456 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494660 2567 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 10:03:42.499456 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494663 2567 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 10:03:42.500057 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494678 2567 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 10:03:42.500057 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494681 2567 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 10:03:42.500057 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494685 2567 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 10:03:42.500057 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494688 2567 flags.go:64] FLAG: --pod-cidr="" Apr 21 10:03:42.500057 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494691 2567 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 10:03:42.500057 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494696 2567 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 10:03:42.500057 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494699 2567 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 10:03:42.500057 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494702 2567 flags.go:64] FLAG: --pods-per-core="0" Apr 21 10:03:42.500057 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494705 2567 flags.go:64] FLAG: --port="10250" Apr 21 10:03:42.500057 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494708 2567 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 
10:03:42.500057 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494711 2567 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0645ccd9262c15355" Apr 21 10:03:42.500057 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494715 2567 flags.go:64] FLAG: --qos-reserved="" Apr 21 10:03:42.500057 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494718 2567 flags.go:64] FLAG: --read-only-port="10255" Apr 21 10:03:42.500057 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494721 2567 flags.go:64] FLAG: --register-node="true" Apr 21 10:03:42.500057 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494724 2567 flags.go:64] FLAG: --register-schedulable="true" Apr 21 10:03:42.500057 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494726 2567 flags.go:64] FLAG: --register-with-taints="" Apr 21 10:03:42.500057 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494730 2567 flags.go:64] FLAG: --registry-burst="10" Apr 21 10:03:42.500057 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494733 2567 flags.go:64] FLAG: --registry-qps="5" Apr 21 10:03:42.500057 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494736 2567 flags.go:64] FLAG: --reserved-cpus="" Apr 21 10:03:42.500057 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494738 2567 flags.go:64] FLAG: --reserved-memory="" Apr 21 10:03:42.500057 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494742 2567 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 10:03:42.500057 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494745 2567 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 10:03:42.500057 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494748 2567 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 10:03:42.500057 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494751 2567 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 10:03:42.500057 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494754 2567 flags.go:64] FLAG: --runonce="false" Apr 21 
10:03:42.500697 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494756 2567 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 10:03:42.500697 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494759 2567 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 10:03:42.500697 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494762 2567 flags.go:64] FLAG: --seccomp-default="false" Apr 21 10:03:42.500697 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494765 2567 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 10:03:42.500697 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494767 2567 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 10:03:42.500697 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494770 2567 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 10:03:42.500697 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494773 2567 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 10:03:42.500697 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494776 2567 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 10:03:42.500697 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494786 2567 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 10:03:42.500697 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494789 2567 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 10:03:42.500697 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494792 2567 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 10:03:42.500697 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494795 2567 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 10:03:42.500697 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494797 2567 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 10:03:42.500697 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494800 2567 flags.go:64] FLAG: --system-cgroups="" Apr 21 10:03:42.500697 ip-10-0-140-231 kubenswrapper[2567]: 
I0421 10:03:42.494803 2567 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 10:03:42.500697 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494808 2567 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 10:03:42.500697 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494815 2567 flags.go:64] FLAG: --tls-cert-file="" Apr 21 10:03:42.500697 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494818 2567 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 10:03:42.500697 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494824 2567 flags.go:64] FLAG: --tls-min-version="" Apr 21 10:03:42.500697 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494827 2567 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 10:03:42.500697 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494830 2567 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 10:03:42.500697 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494832 2567 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 10:03:42.500697 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494835 2567 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 10:03:42.500697 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494838 2567 flags.go:64] FLAG: --v="2" Apr 21 10:03:42.500697 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494842 2567 flags.go:64] FLAG: --version="false" Apr 21 10:03:42.501742 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494847 2567 flags.go:64] FLAG: --vmodule="" Apr 21 10:03:42.501742 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494851 2567 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 10:03:42.501742 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.494854 2567 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 10:03:42.501742 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494965 2567 feature_gate.go:328] unrecognized feature gate: 
AzureClusterHostedDNSInstall Apr 21 10:03:42.501742 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494969 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 10:03:42.501742 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494972 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 10:03:42.501742 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494975 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 10:03:42.501742 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494978 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 10:03:42.501742 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494981 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 10:03:42.501742 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494984 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 10:03:42.501742 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494988 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 10:03:42.501742 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494991 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 10:03:42.501742 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494994 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 10:03:42.501742 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494997 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 10:03:42.501742 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.494999 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 10:03:42.501742 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495003 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 10:03:42.501742 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495011 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 10:03:42.501742 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495014 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 10:03:42.501742 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495017 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 10:03:42.501742 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495019 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 10:03:42.502719 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495021 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 10:03:42.502719 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495024 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 10:03:42.502719 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495026 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 10:03:42.502719 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495030 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 10:03:42.502719 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495033 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 10:03:42.502719 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495035 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 10:03:42.502719 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495037 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 10:03:42.502719 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495040 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 10:03:42.502719 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495042 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 10:03:42.502719 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495045 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 10:03:42.502719 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495047 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 10:03:42.502719 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495050 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 10:03:42.502719 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495052 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 10:03:42.502719 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495055 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 10:03:42.502719 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495057 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 10:03:42.502719 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495061 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 10:03:42.502719 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495065 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 10:03:42.502719 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495068 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 10:03:42.502719 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495071 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 10:03:42.503234 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495073 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 10:03:42.503234 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495076 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 10:03:42.503234 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495079 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 10:03:42.503234 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495082 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 10:03:42.503234 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495084 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 10:03:42.503234 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495087 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 10:03:42.503234 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495089 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 10:03:42.503234 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495092 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 10:03:42.503234 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495095 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 10:03:42.503234 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495098 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 10:03:42.503234 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495106 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 10:03:42.503234 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495109 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 10:03:42.503234 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495111 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 10:03:42.503234 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495113 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 10:03:42.503234 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495116 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 10:03:42.503234 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495118 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 10:03:42.503234 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495122 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 10:03:42.503234 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495125 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 10:03:42.503234 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495128 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 10:03:42.503234 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495130 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 10:03:42.503723 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495132 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 10:03:42.503723 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495135 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 10:03:42.503723 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495137 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 10:03:42.503723 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495140 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 10:03:42.503723 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495142 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 10:03:42.503723 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495145 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 10:03:42.503723 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495148 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 10:03:42.503723 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495150 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 10:03:42.503723 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495153 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 10:03:42.503723 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495155 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 10:03:42.503723 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495157 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 10:03:42.503723 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495160 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 10:03:42.503723 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495163 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 10:03:42.503723 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495179 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 10:03:42.503723 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495181 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 10:03:42.503723 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495184 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 10:03:42.503723 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495187 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 10:03:42.503723 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495190 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 10:03:42.503723 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495193 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 10:03:42.503723 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495196 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 10:03:42.504481 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495199 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 10:03:42.504481 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495202 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 10:03:42.504481 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495205 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 10:03:42.504481 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495214 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 10:03:42.504481 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495217 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 10:03:42.504481 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495219 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 10:03:42.504481 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495222 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 10:03:42.504481 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495225 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 10:03:42.504481 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495228 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 10:03:42.504481 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.495231 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 10:03:42.504481 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.496079 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 10:03:42.504481 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.503901 2567 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 10:03:42.504481 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.503923 2567 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 10:03:42.504481 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.503997 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 10:03:42.504481 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504005 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 10:03:42.505142 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504012 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 10:03:42.505142 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504021 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 10:03:42.505142 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504026 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 10:03:42.505142 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504032 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 10:03:42.505142 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504037 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 10:03:42.505142 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504041 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 10:03:42.505142 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504046 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 10:03:42.505142 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504051 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 10:03:42.505142 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504055 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 10:03:42.505142 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504059 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 10:03:42.505142 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504063 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 10:03:42.505142 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504067 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 10:03:42.505142 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504072 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 10:03:42.505142 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504076 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 10:03:42.505142 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504080 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 10:03:42.505142 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504084 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 10:03:42.505142 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504088 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 10:03:42.505142 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504094 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 10:03:42.505142 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504098 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 10:03:42.505142 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504102 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 10:03:42.505656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504106 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 10:03:42.505656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504110 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 10:03:42.505656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504115 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 10:03:42.505656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504119 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 10:03:42.505656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504123 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 10:03:42.505656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504127 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 10:03:42.505656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504131 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 10:03:42.505656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504136 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 10:03:42.505656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504140 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 10:03:42.505656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504146 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 10:03:42.505656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504150 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 10:03:42.505656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504155 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 10:03:42.505656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504159 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 10:03:42.505656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504163 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 10:03:42.505656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504185 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 10:03:42.505656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504190 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 10:03:42.505656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504195 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 10:03:42.505656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504199 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 10:03:42.505656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504204 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 10:03:42.505656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504208 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 10:03:42.506250 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504212 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 10:03:42.506250 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504216 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 10:03:42.506250 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504220 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 10:03:42.506250 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504224 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 10:03:42.506250 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504228 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 10:03:42.506250 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504232 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 10:03:42.506250 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504237 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 10:03:42.506250 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504241 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 10:03:42.506250 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504245 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 10:03:42.506250 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504250 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 10:03:42.506250 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504254 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 10:03:42.506250 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504258 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 10:03:42.506250 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504262 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 10:03:42.506250 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504266 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 10:03:42.506250 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504270 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 10:03:42.506250 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504274 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 10:03:42.506250 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504281 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 10:03:42.506250 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504287 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 10:03:42.506250 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504293 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 10:03:42.506250 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504297 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 10:03:42.506990 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504301 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 10:03:42.506990 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504306 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 10:03:42.506990 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504311 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 10:03:42.506990 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504315 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 10:03:42.506990 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504320 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 10:03:42.506990 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504324 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 10:03:42.506990 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504329 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 10:03:42.506990 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504333 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 10:03:42.506990 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504337 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 10:03:42.506990 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504341 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 10:03:42.506990 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504345 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 10:03:42.506990 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504349 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 10:03:42.506990 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504353 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 10:03:42.506990 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504358 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 10:03:42.506990 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504362 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 10:03:42.506990 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504366 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 10:03:42.506990 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504370 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 10:03:42.506990 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504374 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 10:03:42.506990 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504378 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 10:03:42.507479 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504382 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 10:03:42.507479 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504386 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 10:03:42.507479 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504391 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 10:03:42.507479 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504395 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 10:03:42.507479 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504399 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 10:03:42.507479 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.504407 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 10:03:42.507479 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504570 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 10:03:42.507479 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504580 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 10:03:42.507479 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504585 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 10:03:42.507479 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504590 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 10:03:42.507479 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504594 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 10:03:42.507479 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504599 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 10:03:42.507479 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504604 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 10:03:42.507479 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504609 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 10:03:42.507479 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504614 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 10:03:42.508077 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504620 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 10:03:42.508077 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504636 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 10:03:42.508077 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504641 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 10:03:42.508077 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504645 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 10:03:42.508077 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504650 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 10:03:42.508077 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504655 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 10:03:42.508077 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504659 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 10:03:42.508077 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504664 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 10:03:42.508077 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504668 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 10:03:42.508077 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504673 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 10:03:42.508077 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504677 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 10:03:42.508077 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504681 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 10:03:42.508077 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504685 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 10:03:42.508077 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504690 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 10:03:42.508077 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504694 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 10:03:42.508077 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504698 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 10:03:42.508077 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504702 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 10:03:42.508077 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504706 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 10:03:42.508077 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504711 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 10:03:42.508077 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504715 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 10:03:42.508656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504719 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 10:03:42.508656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504723 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 10:03:42.508656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504727 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 10:03:42.508656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504731 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 10:03:42.508656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504736 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 10:03:42.508656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504740 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 10:03:42.508656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504744 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 10:03:42.508656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504748 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 10:03:42.508656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504752 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 10:03:42.508656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504756 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 10:03:42.508656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504761 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 10:03:42.508656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504765 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 10:03:42.508656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504769 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 10:03:42.508656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504773 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 10:03:42.508656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504779 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 10:03:42.508656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504783 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 10:03:42.508656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504787 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 10:03:42.508656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504791 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 10:03:42.508656 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504798 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 10:03:42.509112 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504802 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 10:03:42.509112 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504806 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 10:03:42.509112 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504811 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 10:03:42.509112 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504815 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 10:03:42.509112 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504820 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 10:03:42.509112 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504824 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 10:03:42.509112 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504828 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 10:03:42.509112 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504832 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 10:03:42.509112 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504836 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 10:03:42.509112 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504840 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 10:03:42.509112 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504844 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 10:03:42.509112 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504848 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 10:03:42.509112 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504852 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 10:03:42.509112 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504856 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 10:03:42.509112 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504860 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 10:03:42.509112 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504864 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 10:03:42.509112 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504869 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 10:03:42.509112 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504873 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 10:03:42.509112 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504877 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 10:03:42.509112 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504881 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 10:03:42.509790 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504885 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 10:03:42.509790 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504889 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 10:03:42.509790 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504893 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 10:03:42.509790 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504897 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 10:03:42.509790 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504902 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 10:03:42.509790 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504906 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 10:03:42.509790 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504910 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 10:03:42.509790 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504914 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 10:03:42.509790 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504920 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 10:03:42.509790 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504924 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 10:03:42.509790 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504928 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 10:03:42.509790 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504932 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 10:03:42.509790 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504936 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 10:03:42.509790 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504939 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 10:03:42.509790 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504943 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 10:03:42.509790 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504947 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 10:03:42.509790 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504951 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 10:03:42.509790 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:42.504955 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 10:03:42.510374 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.504963 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 10:03:42.510374 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.505691 2567 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 10:03:42.510374 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.510066 2567 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 10:03:42.511070 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.511058 2567 server.go:1019] "Starting client certificate rotation"
Apr 21 10:03:42.511185 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.511154 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 10:03:42.511225 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.511209 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 10:03:42.542891 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.542870 2567 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 10:03:42.548326 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.548086 2567 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 10:03:42.564953 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.564937 2567 log.go:25] "Validated CRI v1 runtime API"
Apr 21 10:03:42.572074 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.572059 2567 log.go:25] "Validated CRI v1 image API"
Apr 21 10:03:42.573248 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.573235 2567 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 10:03:42.576749 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.576729 2567 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 ad532456-408b-42a1-8880-2729e3cc153b:/dev/nvme0n1p4 f0480994-74b0-4a93-9d24-b2e1fb1cbb46:/dev/nvme0n1p3]
Apr 21 10:03:42.576813 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.576748 2567 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 10:03:42.580623 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.580603 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 10:03:42.582953 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.582845 2567 manager.go:217] Machine: {Timestamp:2026-04-21 10:03:42.580446211 +0000 UTC m=+0.425641707 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3096815 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec226abc839152e53dfd40ac548af390 SystemUUID:ec226abc-8391-52e5-3dfd-40ac548af390 BootID:a4089522-9878-4aec-8356-926b1969617f Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:19:c8:34:f9:3d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:19:c8:34:f9:3d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:96:07:aa:c7:a6:88 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 10:03:42.582953 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.582951 2567 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 10:03:42.583061 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.583022 2567 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 10:03:42.585851 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.585827 2567 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 10:03:42.586002 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.585853 2567 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-231.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"L
essThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 21 10:03:42.586050 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.586012 2567 topology_manager.go:138] "Creating topology manager with none policy" Apr 21 10:03:42.586050 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.586021 2567 container_manager_linux.go:306] "Creating device plugin manager" Apr 21 10:03:42.586050 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.586040 2567 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 10:03:42.587091 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.587081 2567 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 10:03:42.587952 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.587943 2567 state_mem.go:36] "Initialized new in-memory state store" Apr 21 10:03:42.588047 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.588038 2567 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 10:03:42.590591 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.590581 2567 kubelet.go:491] "Attempting to sync node with API server" Apr 21 10:03:42.590633 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.590594 2567 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 21 10:03:42.590633 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.590607 2567 
file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 10:03:42.590633 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.590619 2567 kubelet.go:397] "Adding apiserver pod source" Apr 21 10:03:42.590633 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.590628 2567 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 21 10:03:42.591905 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.591892 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 10:03:42.591958 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.591911 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 10:03:42.594975 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.594961 2567 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 10:03:42.596814 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.596800 2567 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 10:03:42.598725 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.598712 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 10:03:42.598790 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.598730 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 10:03:42.598790 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.598737 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 10:03:42.598790 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.598742 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 10:03:42.598790 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.598748 2567 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/nfs" Apr 21 10:03:42.598790 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.598755 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 10:03:42.598790 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.598761 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 10:03:42.598790 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.598766 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 10:03:42.598790 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.598773 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 10:03:42.598790 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.598779 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 10:03:42.598790 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.598794 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 10:03:42.599038 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.598804 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 10:03:42.600203 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.600193 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 10:03:42.600203 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.600203 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 10:03:42.604993 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.604968 2567 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 10:03:42.605086 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.605018 2567 server.go:1295] "Started kubelet" Apr 21 10:03:42.605316 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.605275 2567 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 10:03:42.605539 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.605500 2567 ratelimit.go:55] 
"Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 10:03:42.605638 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.605557 2567 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 10:03:42.605753 ip-10-0-140-231 systemd[1]: Started Kubernetes Kubelet. Apr 21 10:03:42.607130 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.607107 2567 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 10:03:42.607230 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.607194 2567 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-231.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 10:03:42.607318 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:42.607289 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-231.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 10:03:42.607509 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:42.607484 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 10:03:42.609207 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.609189 2567 server.go:317] "Adding debug handlers to kubelet server" Apr 21 10:03:42.614544 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.614524 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 10:03:42.615278 
ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.615264 2567 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 10:03:42.616355 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:42.616336 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-231.ec2.internal\" not found" Apr 21 10:03:42.616448 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.616384 2567 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 10:03:42.616448 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.616389 2567 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 10:03:42.616448 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.616392 2567 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 10:03:42.616448 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.616403 2567 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 10:03:42.616448 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.616406 2567 factory.go:55] Registering systemd factory Apr 21 10:03:42.616448 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.616415 2567 factory.go:223] Registration of the systemd container factory successfully Apr 21 10:03:42.616701 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.616482 2567 reconstruct.go:97] "Volume reconstruction finished" Apr 21 10:03:42.616701 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.616491 2567 reconciler.go:26] "Reconciler: start to sync state" Apr 21 10:03:42.616701 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:42.616636 2567 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 10:03:42.616701 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.616688 2567 factory.go:153] Registering CRI-O factory Apr 21 10:03:42.616701 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.616702 2567 factory.go:223] Registration of the crio container factory successfully Apr 21 10:03:42.616859 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.616724 2567 factory.go:103] Registering Raw factory Apr 21 10:03:42.616859 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.616735 2567 manager.go:1196] Started watching for new ooms in manager Apr 21 10:03:42.617122 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.617107 2567 manager.go:319] Starting recovery of all containers Apr 21 10:03:42.621115 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:42.621089 2567 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-140-231.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 21 10:03:42.621485 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:42.621460 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 21 10:03:42.622610 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:42.621219 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-231.ec2.internal.18a8571a196cc77f default 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-231.ec2.internal,UID:ip-10-0-140-231.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-140-231.ec2.internal,},FirstTimestamp:2026-04-21 10:03:42.604986239 +0000 UTC m=+0.450181739,LastTimestamp:2026-04-21 10:03:42.604986239 +0000 UTC m=+0.450181739,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-231.ec2.internal,}" Apr 21 10:03:42.626801 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.626785 2567 manager.go:324] Recovery completed Apr 21 10:03:42.630976 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.630964 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 10:03:42.633597 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.633574 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-231.ec2.internal" event="NodeHasSufficientMemory" Apr 21 10:03:42.633680 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.633609 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-231.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 10:03:42.633680 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.633619 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-231.ec2.internal" event="NodeHasSufficientPID" Apr 21 10:03:42.634074 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.634059 2567 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 10:03:42.634074 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.634072 2567 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 10:03:42.634157 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.634087 2567 state_mem.go:36] "Initialized new in-memory state store" Apr 21 
10:03:42.636281 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:42.636223 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-231.ec2.internal.18a8571a1b2153a4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-231.ec2.internal,UID:ip-10-0-140-231.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-140-231.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-140-231.ec2.internal,},FirstTimestamp:2026-04-21 10:03:42.633595812 +0000 UTC m=+0.478791309,LastTimestamp:2026-04-21 10:03:42.633595812 +0000 UTC m=+0.478791309,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-231.ec2.internal,}" Apr 21 10:03:42.636361 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.636333 2567 policy_none.go:49] "None policy: Start" Apr 21 10:03:42.636361 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.636346 2567 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 10:03:42.636361 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.636354 2567 state_mem.go:35] "Initializing new in-memory state store" Apr 21 10:03:42.644858 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:42.644803 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-231.ec2.internal.18a8571a1b219976 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-231.ec2.internal,UID:ip-10-0-140-231.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-140-231.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-140-231.ec2.internal,},FirstTimestamp:2026-04-21 10:03:42.633613686 +0000 UTC m=+0.478809183,LastTimestamp:2026-04-21 10:03:42.633613686 +0000 UTC m=+0.478809183,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-231.ec2.internal,}" Apr 21 10:03:42.653228 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.653209 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rq2c4" Apr 21 10:03:42.654208 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:42.654138 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-231.ec2.internal.18a8571a1b21bddb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-231.ec2.internal,UID:ip-10-0-140-231.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-140-231.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-140-231.ec2.internal,},FirstTimestamp:2026-04-21 10:03:42.633623003 +0000 UTC m=+0.478818501,LastTimestamp:2026-04-21 10:03:42.633623003 +0000 UTC m=+0.478818501,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-231.ec2.internal,}" Apr 21 10:03:42.663772 ip-10-0-140-231 
kubenswrapper[2567]: I0421 10:03:42.663754 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rq2c4" Apr 21 10:03:42.675610 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.672731 2567 manager.go:341] "Starting Device Plugin manager" Apr 21 10:03:42.675610 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:42.672755 2567 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 10:03:42.675610 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.672764 2567 server.go:85] "Starting device plugin registration server" Apr 21 10:03:42.675610 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.672958 2567 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 10:03:42.675610 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.672966 2567 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 10:03:42.675610 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.673125 2567 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 10:03:42.675610 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.673223 2567 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 10:03:42.675610 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.673230 2567 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 10:03:42.675610 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:42.673586 2567 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 21 10:03:42.675610 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:42.673612 2567 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-231.ec2.internal\" not found" Apr 21 10:03:42.708436 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.708414 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 10:03:42.709466 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.709450 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 10:03:42.709552 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.709474 2567 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 10:03:42.709552 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.709493 2567 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 21 10:03:42.709552 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.709499 2567 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 10:03:42.709552 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:42.709528 2567 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 10:03:42.712656 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.712640 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 10:03:42.773852 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.773806 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 10:03:42.774567 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.774546 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-231.ec2.internal" event="NodeHasSufficientMemory" Apr 21 10:03:42.774620 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.774579 2567 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-231.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 10:03:42.774620 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.774589 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-231.ec2.internal" event="NodeHasSufficientPID" Apr 21 10:03:42.774620 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.774608 2567 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-231.ec2.internal" Apr 21 10:03:42.784047 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.784020 2567 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-231.ec2.internal" Apr 21 10:03:42.784047 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:42.784042 2567 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-231.ec2.internal\": node \"ip-10-0-140-231.ec2.internal\" not found" Apr 21 10:03:42.803423 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:42.803402 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-231.ec2.internal\" not found" Apr 21 10:03:42.810489 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.810471 2567 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-140-231.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-231.ec2.internal"] Apr 21 10:03:42.810541 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.810518 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 10:03:42.811801 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.811787 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-231.ec2.internal" event="NodeHasSufficientMemory" Apr 21 10:03:42.811863 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.811813 2567 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-140-231.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 10:03:42.811863 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.811827 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-231.ec2.internal" event="NodeHasSufficientPID" Apr 21 10:03:42.812778 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.812766 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 10:03:42.813436 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.813416 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-231.ec2.internal" event="NodeHasSufficientMemory" Apr 21 10:03:42.813510 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.813447 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-231.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 10:03:42.813510 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.813460 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-231.ec2.internal" event="NodeHasSufficientPID" Apr 21 10:03:42.813510 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.813477 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-231.ec2.internal" Apr 21 10:03:42.813510 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.813503 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 10:03:42.814138 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.814124 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-231.ec2.internal" event="NodeHasSufficientMemory" Apr 21 10:03:42.814230 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.814184 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-231.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 10:03:42.814230 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.814199 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-231.ec2.internal" event="NodeHasSufficientPID" Apr 21 10:03:42.814511 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.814498 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-231.ec2.internal" Apr 21 10:03:42.814567 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.814522 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 10:03:42.815117 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.815102 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-231.ec2.internal" event="NodeHasSufficientMemory" Apr 21 10:03:42.815211 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.815128 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-231.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 10:03:42.815211 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.815138 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-231.ec2.internal" event="NodeHasSufficientPID" Apr 21 10:03:42.816973 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.816954 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3d4599cdcd53cd71a013b6e6939ddb90-config\") pod \"kube-apiserver-proxy-ip-10-0-140-231.ec2.internal\" (UID: \"3d4599cdcd53cd71a013b6e6939ddb90\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-231.ec2.internal" Apr 21 10:03:42.834697 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:42.834674 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-231.ec2.internal\" not found" node="ip-10-0-140-231.ec2.internal" Apr 21 10:03:42.838348 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:42.838331 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-231.ec2.internal\" not found" node="ip-10-0-140-231.ec2.internal" Apr 21 10:03:42.904463 ip-10-0-140-231 
kubenswrapper[2567]: E0421 10:03:42.904439 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-231.ec2.internal\" not found"
Apr 21 10:03:42.917730 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.917709 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c0bc0d1d980f249ac792c6594624c04d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-231.ec2.internal\" (UID: \"c0bc0d1d980f249ac792c6594624c04d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-231.ec2.internal"
Apr 21 10:03:42.917813 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.917760 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3d4599cdcd53cd71a013b6e6939ddb90-config\") pod \"kube-apiserver-proxy-ip-10-0-140-231.ec2.internal\" (UID: \"3d4599cdcd53cd71a013b6e6939ddb90\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-231.ec2.internal"
Apr 21 10:03:42.917813 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.917787 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c0bc0d1d980f249ac792c6594624c04d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-231.ec2.internal\" (UID: \"c0bc0d1d980f249ac792c6594624c04d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-231.ec2.internal"
Apr 21 10:03:42.917889 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:42.917840 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3d4599cdcd53cd71a013b6e6939ddb90-config\") pod \"kube-apiserver-proxy-ip-10-0-140-231.ec2.internal\" (UID: \"3d4599cdcd53cd71a013b6e6939ddb90\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-231.ec2.internal"
Apr 21 10:03:43.004801 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:43.004762 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-231.ec2.internal\" not found"
Apr 21 10:03:43.018086 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:43.018065 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c0bc0d1d980f249ac792c6594624c04d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-231.ec2.internal\" (UID: \"c0bc0d1d980f249ac792c6594624c04d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-231.ec2.internal"
Apr 21 10:03:43.018140 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:43.018126 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c0bc0d1d980f249ac792c6594624c04d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-231.ec2.internal\" (UID: \"c0bc0d1d980f249ac792c6594624c04d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-231.ec2.internal"
Apr 21 10:03:43.018199 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:43.018147 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c0bc0d1d980f249ac792c6594624c04d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-231.ec2.internal\" (UID: \"c0bc0d1d980f249ac792c6594624c04d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-231.ec2.internal"
Apr 21 10:03:43.018199 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:43.018160 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c0bc0d1d980f249ac792c6594624c04d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-231.ec2.internal\" (UID: \"c0bc0d1d980f249ac792c6594624c04d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-231.ec2.internal"
Apr 21 10:03:43.105497 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:43.105446 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-231.ec2.internal\" not found"
Apr 21 10:03:43.136906 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:43.136886 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-231.ec2.internal"
Apr 21 10:03:43.141285 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:43.141265 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-231.ec2.internal"
Apr 21 10:03:43.205850 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:43.205823 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-231.ec2.internal\" not found"
Apr 21 10:03:43.306392 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:43.306366 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-231.ec2.internal\" not found"
Apr 21 10:03:43.406859 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:43.406836 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-231.ec2.internal\" not found"
Apr 21 10:03:43.504955 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:43.504935 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 10:03:43.507621 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:43.507602 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-231.ec2.internal\" not found"
Apr 21 10:03:43.510761 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:43.510744 2567 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 10:03:43.510871 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:43.510855 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 10:03:43.510930 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:43.510907 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 10:03:43.603119 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:43.603088 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0bc0d1d980f249ac792c6594624c04d.slice/crio-635c481d7cd3fe056d4f0c6edf18349f75a0c7d25ed1b9377dfaf3e2c861b64f WatchSource:0}: Error finding container 635c481d7cd3fe056d4f0c6edf18349f75a0c7d25ed1b9377dfaf3e2c861b64f: Status 404 returned error can't find the container with id 635c481d7cd3fe056d4f0c6edf18349f75a0c7d25ed1b9377dfaf3e2c861b64f
Apr 21 10:03:43.603353 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:43.603337 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d4599cdcd53cd71a013b6e6939ddb90.slice/crio-4afb88d5e5d5ca6c2dd9e97dcc7959e236d21300d44effbd7e9862b9ada43267 WatchSource:0}: Error finding container 4afb88d5e5d5ca6c2dd9e97dcc7959e236d21300d44effbd7e9862b9ada43267: Status 404 returned error can't find the container with id 4afb88d5e5d5ca6c2dd9e97dcc7959e236d21300d44effbd7e9862b9ada43267
Apr 21 10:03:43.607230 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:43.607211 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 10:03:43.607655 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:43.607636 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-231.ec2.internal\" not found"
Apr 21 10:03:43.615317 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:43.615302 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 21 10:03:43.625515 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:43.625497 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 10:03:43.646921 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:43.646903 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-hkj6f"
Apr 21 10:03:43.654770 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:43.654754 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-hkj6f"
Apr 21 10:03:43.666181 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:43.666110 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 09:58:42 +0000 UTC" deadline="2027-10-01 00:09:16.075091182 +0000 UTC"
Apr 21 10:03:43.666181 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:43.666148 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12662h5m32.408945057s"
Apr 21 10:03:43.708405 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:43.708387 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-231.ec2.internal\" not found"
Apr 21 10:03:43.712115 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:43.712079 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-231.ec2.internal" event={"ID":"c0bc0d1d980f249ac792c6594624c04d","Type":"ContainerStarted","Data":"635c481d7cd3fe056d4f0c6edf18349f75a0c7d25ed1b9377dfaf3e2c861b64f"}
Apr 21 10:03:43.712972 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:43.712942 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-231.ec2.internal" event={"ID":"3d4599cdcd53cd71a013b6e6939ddb90","Type":"ContainerStarted","Data":"4afb88d5e5d5ca6c2dd9e97dcc7959e236d21300d44effbd7e9862b9ada43267"}
Apr 21 10:03:43.808543 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:43.808510 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-231.ec2.internal\" not found"
Apr 21 10:03:43.859868 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:43.859846 2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 10:03:43.908748 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:43.908718 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-231.ec2.internal\" not found"
Apr 21 10:03:43.947860 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:43.947809 2567 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 10:03:44.015675 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.015646 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-231.ec2.internal"
Apr 21 10:03:44.024431 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.024410 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 10:03:44.025365 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.025340 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-231.ec2.internal"
Apr 21 10:03:44.043870 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.043850 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 10:03:44.367505 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.367441 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 10:03:44.591902 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.591870 2567 apiserver.go:52] "Watching apiserver"
Apr 21 10:03:44.599463 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.599437 2567 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 21 10:03:44.599786 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.599763 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-s97qn","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-231.ec2.internal","openshift-multus/multus-bnl68","openshift-network-operator/iptables-alerter-jzz4r","kube-system/kube-apiserver-proxy-ip-10-0-140-231.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w672n","openshift-cluster-node-tuning-operator/tuned-jj5fc","openshift-image-registry/node-ca-wgkxz","openshift-multus/multus-additional-cni-plugins-rnvsp","openshift-multus/network-metrics-daemon-nwsw4","openshift-network-diagnostics/network-check-target-jzrcz","openshift-ovn-kubernetes/ovnkube-node-tqctk","kube-system/konnectivity-agent-hk9xb"]
Apr 21 10:03:44.604743 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.604714 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wgkxz"
Apr 21 10:03:44.606795 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.606768 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-hk9xb"
Apr 21 10:03:44.607253 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.607229 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 21 10:03:44.607586 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.607564 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-27kw6\""
Apr 21 10:03:44.607683 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.607620 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 21 10:03:44.607683 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.607650 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 21 10:03:44.609038 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.608937 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzrcz"
Apr 21 10:03:44.609038 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.608947 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 21 10:03:44.609038 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.608952 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-gqrt7\""
Apr 21 10:03:44.609038 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.609033 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 21 10:03:44.609313 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:44.609026 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jzrcz" podUID="5f39fe1c-70f8-4445-8cfd-646cb496d498"
Apr 21 10:03:44.611213 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.611192 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-s97qn"
Apr 21 10:03:44.613266 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.613247 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 21 10:03:44.613370 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.613307 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 21 10:03:44.613428 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.613380 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qrxk4\""
Apr 21 10:03:44.613428 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.613385 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rnvsp"
Apr 21 10:03:44.615513 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.615492 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 21 10:03:44.615604 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.615518 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 21 10:03:44.615604 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.615492 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 21 10:03:44.615604 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.615498 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wwkdk\""
Apr 21 10:03:44.615763 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.615680 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w672n"
Apr 21 10:03:44.615816 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.615775 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 21 10:03:44.615816 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.615794 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jj5fc"
Apr 21 10:03:44.615920 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.615902 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 21 10:03:44.619201 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.617943 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 21 10:03:44.619201 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.618685 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 21 10:03:44.619201 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.618905 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-5274v\""
Apr 21 10:03:44.619520 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.619210 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 21 10:03:44.619520 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.619232 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 21 10:03:44.619520 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.619473 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 21 10:03:44.619699 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.619527 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-jzz4r"
Apr 21 10:03:44.619699 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.619475 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-7n9xf\""
Apr 21 10:03:44.621656 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.621638 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 21 10:03:44.621899 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.621877 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 21 10:03:44.621996 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.621977 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 21 10:03:44.622057 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.621990 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-tgn75\""
Apr 21 10:03:44.623553 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.623536 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwsw4"
Apr 21 10:03:44.623658 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:44.623593 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nwsw4" podUID="dbb00fc1-1258-4254-a360-3c350554925b"
Apr 21 10:03:44.624746 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.624713 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1ba26a6f-298a-4d59-9fa5-4f65cc1729c9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rnvsp\" (UID: \"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9\") " pod="openshift-multus/multus-additional-cni-plugins-rnvsp"
Apr 21 10:03:44.624746 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.624744 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-etc-systemd\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc"
Apr 21 10:03:44.624870 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.624760 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-host\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc"
Apr 21 10:03:44.624870 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.624776 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzfrd\" (UniqueName: \"kubernetes.io/projected/76f0dead-e62b-424c-a40e-6f82d52ba722-kube-api-access-rzfrd\") pod \"iptables-alerter-jzz4r\" (UID: \"76f0dead-e62b-424c-a40e-6f82d52ba722\") " pod="openshift-network-operator/iptables-alerter-jzz4r"
Apr 21 10:03:44.624870 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.624798 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a892f931-ea13-4698-b14e-4a1b739f586c-konnectivity-ca\") pod \"konnectivity-agent-hk9xb\" (UID: \"a892f931-ea13-4698-b14e-4a1b739f586c\") " pod="kube-system/konnectivity-agent-hk9xb"
Apr 21 10:03:44.624870 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.624828 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pvxt\" (UniqueName: \"kubernetes.io/projected/f5719a9a-0eff-48f5-b634-e4d0a7216828-kube-api-access-5pvxt\") pod \"node-ca-wgkxz\" (UID: \"f5719a9a-0eff-48f5-b634-e4d0a7216828\") " pod="openshift-image-registry/node-ca-wgkxz"
Apr 21 10:03:44.624870 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.624843 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-etc-sysconfig\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc"
Apr 21 10:03:44.624870 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.624859 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/22eb8c40-ac2f-42ed-897c-2c7e11b8588c-hosts-file\") pod \"node-resolver-s97qn\" (UID: \"22eb8c40-ac2f-42ed-897c-2c7e11b8588c\") " pod="openshift-dns/node-resolver-s97qn"
Apr 21 10:03:44.624870 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.624872 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/22eb8c40-ac2f-42ed-897c-2c7e11b8588c-tmp-dir\") pod \"node-resolver-s97qn\" (UID: \"22eb8c40-ac2f-42ed-897c-2c7e11b8588c\") " pod="openshift-dns/node-resolver-s97qn"
Apr 21 10:03:44.625100 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.624889 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/082858cf-9465-402e-85f9-8bab81da87b1-registration-dir\") pod \"aws-ebs-csi-driver-node-w672n\" (UID: \"082858cf-9465-402e-85f9-8bab81da87b1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w672n"
Apr 21 10:03:44.625100 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.624912 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/082858cf-9465-402e-85f9-8bab81da87b1-sys-fs\") pod \"aws-ebs-csi-driver-node-w672n\" (UID: \"082858cf-9465-402e-85f9-8bab81da87b1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w672n"
Apr 21 10:03:44.625100 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.624970 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1ba26a6f-298a-4d59-9fa5-4f65cc1729c9-cnibin\") pod \"multus-additional-cni-plugins-rnvsp\" (UID: \"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9\") " pod="openshift-multus/multus-additional-cni-plugins-rnvsp"
Apr 21 10:03:44.625100 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.625000 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-tmp\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc"
Apr 21 10:03:44.625100 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.625022 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/76f0dead-e62b-424c-a40e-6f82d52ba722-host-slash\") pod \"iptables-alerter-jzz4r\" (UID: \"76f0dead-e62b-424c-a40e-6f82d52ba722\") " pod="openshift-network-operator/iptables-alerter-jzz4r"
Apr 21 10:03:44.625100 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.625043 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f5719a9a-0eff-48f5-b634-e4d0a7216828-serviceca\") pod \"node-ca-wgkxz\" (UID: \"f5719a9a-0eff-48f5-b634-e4d0a7216828\") " pod="openshift-image-registry/node-ca-wgkxz"
Apr 21 10:03:44.625100 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.625071 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1ba26a6f-298a-4d59-9fa5-4f65cc1729c9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rnvsp\" (UID: \"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9\") " pod="openshift-multus/multus-additional-cni-plugins-rnvsp"
Apr 21 10:03:44.625100 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.625092 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1ba26a6f-298a-4d59-9fa5-4f65cc1729c9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rnvsp\" (UID: \"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9\") " pod="openshift-multus/multus-additional-cni-plugins-rnvsp"
Apr 21 10:03:44.625409 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.625107 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-var-lib-kubelet\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc"
Apr 21 10:03:44.625409 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.625137 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-etc-tuned\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc"
Apr 21 10:03:44.625409 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.625157 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/76f0dead-e62b-424c-a40e-6f82d52ba722-iptables-alerter-script\") pod \"iptables-alerter-jzz4r\" (UID: \"76f0dead-e62b-424c-a40e-6f82d52ba722\") " pod="openshift-network-operator/iptables-alerter-jzz4r"
Apr 21 10:03:44.625409 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.625191 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/082858cf-9465-402e-85f9-8bab81da87b1-socket-dir\") pod \"aws-ebs-csi-driver-node-w672n\" (UID: \"082858cf-9465-402e-85f9-8bab81da87b1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w672n"
Apr 21 10:03:44.625409 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.625214 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mznx\" (UniqueName: \"kubernetes.io/projected/1ba26a6f-298a-4d59-9fa5-4f65cc1729c9-kube-api-access-2mznx\") pod \"multus-additional-cni-plugins-rnvsp\" (UID: \"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9\") " pod="openshift-multus/multus-additional-cni-plugins-rnvsp"
Apr 21 10:03:44.625409 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.625231 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-etc-modprobe-d\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc"
Apr 21 10:03:44.625409 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.625252 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-etc-sysctl-conf\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc"
Apr 21 10:03:44.625409 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.625270 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-lib-modules\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc"
Apr 21 10:03:44.625409 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.625284 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwvnb\" (UniqueName: \"kubernetes.io/projected/082858cf-9465-402e-85f9-8bab81da87b1-kube-api-access-nwvnb\") pod \"aws-ebs-csi-driver-node-w672n\" (UID: \"082858cf-9465-402e-85f9-8bab81da87b1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w672n"
Apr 21 10:03:44.625409 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.625297 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ba26a6f-298a-4d59-9fa5-4f65cc1729c9-system-cni-dir\") pod \"multus-additional-cni-plugins-rnvsp\" (UID: \"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9\") " pod="openshift-multus/multus-additional-cni-plugins-rnvsp"
Apr 21 10:03:44.625409 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.625318 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-etc-sysctl-d\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc"
Apr 21 10:03:44.625409 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.625339 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-sys\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc"
Apr 21 10:03:44.625409 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.625359 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdzqg\" (UniqueName: \"kubernetes.io/projected/22eb8c40-ac2f-42ed-897c-2c7e11b8588c-kube-api-access-rdzqg\") pod \"node-resolver-s97qn\" (UID: \"22eb8c40-ac2f-42ed-897c-2c7e11b8588c\") " pod="openshift-dns/node-resolver-s97qn"
Apr 21 10:03:44.625409 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.625372 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1ba26a6f-298a-4d59-9fa5-4f65cc1729c9-cni-binary-copy\") pod \"multus-additional-cni-plugins-rnvsp\" (UID: \"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9\") " pod="openshift-multus/multus-additional-cni-plugins-rnvsp"
Apr 21 10:03:44.625409 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.625385 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-etc-kubernetes\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc"
Apr 21 10:03:44.625912 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.625423 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bphsl\" (UniqueName: \"kubernetes.io/projected/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-kube-api-access-bphsl\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc"
Apr 21 10:03:44.625912 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.625459 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/082858cf-9465-402e-85f9-8bab81da87b1-device-dir\") pod \"aws-ebs-csi-driver-node-w672n\" (UID: \"082858cf-9465-402e-85f9-8bab81da87b1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w672n"
Apr 21 10:03:44.625912 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.625486 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/082858cf-9465-402e-85f9-8bab81da87b1-etc-selinux\") pod \"aws-ebs-csi-driver-node-w672n\" (UID: \"082858cf-9465-402e-85f9-8bab81da87b1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w672n"
Apr 21 10:03:44.625912 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.625503 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dncx\" (UniqueName: \"kubernetes.io/projected/5f39fe1c-70f8-4445-8cfd-646cb496d498-kube-api-access-6dncx\") pod \"network-check-target-jzrcz\" (UID: \"5f39fe1c-70f8-4445-8cfd-646cb496d498\") " pod="openshift-network-diagnostics/network-check-target-jzrcz"
Apr 21 10:03:44.625912 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.625524 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-run\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc"
Apr 21 10:03:44.625912 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.625540 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a892f931-ea13-4698-b14e-4a1b739f586c-agent-certs\") pod \"konnectivity-agent-hk9xb\" (UID: \"a892f931-ea13-4698-b14e-4a1b739f586c\") " pod="kube-system/konnectivity-agent-hk9xb"
Apr 21 10:03:44.625912 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.625554 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/082858cf-9465-402e-85f9-8bab81da87b1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-w672n\" (UID: \"082858cf-9465-402e-85f9-8bab81da87b1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w672n"
Apr 21 10:03:44.625912 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.625569 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5719a9a-0eff-48f5-b634-e4d0a7216828-host\") pod \"node-ca-wgkxz\" (UID: \"f5719a9a-0eff-48f5-b634-e4d0a7216828\") " pod="openshift-image-registry/node-ca-wgkxz"
Apr 21 10:03:44.625912 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.625586 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1ba26a6f-298a-4d59-9fa5-4f65cc1729c9-os-release\") pod \"multus-additional-cni-plugins-rnvsp\" (UID: \"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9\") " pod="openshift-multus/multus-additional-cni-plugins-rnvsp"
Apr 21 10:03:44.627052 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.627036 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bnl68"
Apr 21 10:03:44.629102 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.629084 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 21 10:03:44.629184 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.629087 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-dbqqd\""
Apr 21 10:03:44.629496 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.629476 2567 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.631620 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.631600 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 21 10:03:44.631707 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.631663 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 10:03:44.632159 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.631838 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-988zs\"" Apr 21 10:03:44.632159 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.631883 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 10:03:44.632159 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.631915 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 10:03:44.632356 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.632194 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 10:03:44.632356 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.632242 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 10:03:44.655319 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.655293 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 09:58:43 +0000 UTC" deadline="2027-12-16 02:53:38.438826653 +0000 UTC" Apr 21 10:03:44.655415 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.655397 2567 
certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14488h49m53.783433636s" Apr 21 10:03:44.717067 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.717043 2567 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 10:03:44.725839 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.725819 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkk8p\" (UniqueName: \"kubernetes.io/projected/dbb00fc1-1258-4254-a360-3c350554925b-kube-api-access-nkk8p\") pod \"network-metrics-daemon-nwsw4\" (UID: \"dbb00fc1-1258-4254-a360-3c350554925b\") " pod="openshift-multus/network-metrics-daemon-nwsw4" Apr 21 10:03:44.725941 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.725857 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-etc-sysctl-d\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc" Apr 21 10:03:44.725941 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.725884 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-multus-cni-dir\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68" Apr 21 10:03:44.725941 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.725909 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-etc-kubernetes\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc" Apr 21 10:03:44.725941 
ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.725933 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-system-cni-dir\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68" Apr 21 10:03:44.726145 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.725957 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-multus-socket-dir-parent\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68" Apr 21 10:03:44.726145 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.725981 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-host-run-ovn-kubernetes\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.726145 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726006 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-run\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc" Apr 21 10:03:44.726145 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726031 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a892f931-ea13-4698-b14e-4a1b739f586c-agent-certs\") pod \"konnectivity-agent-hk9xb\" (UID: \"a892f931-ea13-4698-b14e-4a1b739f586c\") " 
pod="kube-system/konnectivity-agent-hk9xb" Apr 21 10:03:44.726145 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726054 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/082858cf-9465-402e-85f9-8bab81da87b1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-w672n\" (UID: \"082858cf-9465-402e-85f9-8bab81da87b1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w672n" Apr 21 10:03:44.726145 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726080 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmljj\" (UniqueName: \"kubernetes.io/projected/f531049d-f18b-4c01-9df7-a6c394430f98-kube-api-access-hmljj\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68" Apr 21 10:03:44.726145 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726103 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-node-log\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.726145 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726129 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5719a9a-0eff-48f5-b634-e4d0a7216828-host\") pod \"node-ca-wgkxz\" (UID: \"f5719a9a-0eff-48f5-b634-e4d0a7216828\") " pod="openshift-image-registry/node-ca-wgkxz" Apr 21 10:03:44.726533 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726153 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1ba26a6f-298a-4d59-9fa5-4f65cc1729c9-os-release\") pod \"multus-additional-cni-plugins-rnvsp\" (UID: 
\"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9\") " pod="openshift-multus/multus-additional-cni-plugins-rnvsp" Apr 21 10:03:44.726533 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726196 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1ba26a6f-298a-4d59-9fa5-4f65cc1729c9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rnvsp\" (UID: \"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9\") " pod="openshift-multus/multus-additional-cni-plugins-rnvsp" Apr 21 10:03:44.726533 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726219 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-host\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc" Apr 21 10:03:44.726533 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726243 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/082858cf-9465-402e-85f9-8bab81da87b1-sys-fs\") pod \"aws-ebs-csi-driver-node-w672n\" (UID: \"082858cf-9465-402e-85f9-8bab81da87b1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w672n" Apr 21 10:03:44.726533 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726271 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-hostroot\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68" Apr 21 10:03:44.726533 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726296 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-multus-conf-dir\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68" Apr 21 10:03:44.726533 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726321 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-etc-sysconfig\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc" Apr 21 10:03:44.726533 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726345 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-cnibin\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68" Apr 21 10:03:44.726533 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726371 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-os-release\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68" Apr 21 10:03:44.726533 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726424 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-systemd-units\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.726533 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726441 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/082858cf-9465-402e-85f9-8bab81da87b1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-w672n\" (UID: \"082858cf-9465-402e-85f9-8bab81da87b1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w672n" Apr 21 10:03:44.726533 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726457 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-host\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc" Apr 21 10:03:44.726533 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726463 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-run-systemd\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.726533 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726468 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-etc-sysconfig\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc" Apr 21 10:03:44.726533 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726511 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-etc-kubernetes\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc" Apr 21 10:03:44.726533 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726515 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/082858cf-9465-402e-85f9-8bab81da87b1-sys-fs\") pod \"aws-ebs-csi-driver-node-w672n\" (UID: \"082858cf-9465-402e-85f9-8bab81da87b1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w672n" Apr 21 10:03:44.726533 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726505 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-run\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc" Apr 21 10:03:44.727245 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726548 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2bdbce72-3342-4c5d-9e2f-6757d506d268-ovnkube-config\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.727245 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726562 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5719a9a-0eff-48f5-b634-e4d0a7216828-host\") pod \"node-ca-wgkxz\" (UID: \"f5719a9a-0eff-48f5-b634-e4d0a7216828\") " pod="openshift-image-registry/node-ca-wgkxz" Apr 21 10:03:44.727245 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726566 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1ba26a6f-298a-4d59-9fa5-4f65cc1729c9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rnvsp\" (UID: \"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9\") " pod="openshift-multus/multus-additional-cni-plugins-rnvsp" Apr 21 10:03:44.727245 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726566 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/1ba26a6f-298a-4d59-9fa5-4f65cc1729c9-os-release\") pod \"multus-additional-cni-plugins-rnvsp\" (UID: \"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9\") " pod="openshift-multus/multus-additional-cni-plugins-rnvsp" Apr 21 10:03:44.727245 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726576 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/76f0dead-e62b-424c-a40e-6f82d52ba722-host-slash\") pod \"iptables-alerter-jzz4r\" (UID: \"76f0dead-e62b-424c-a40e-6f82d52ba722\") " pod="openshift-network-operator/iptables-alerter-jzz4r" Apr 21 10:03:44.727245 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726658 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/76f0dead-e62b-424c-a40e-6f82d52ba722-host-slash\") pod \"iptables-alerter-jzz4r\" (UID: \"76f0dead-e62b-424c-a40e-6f82d52ba722\") " pod="openshift-network-operator/iptables-alerter-jzz4r" Apr 21 10:03:44.727245 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726680 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-etc-sysctl-d\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc" Apr 21 10:03:44.727245 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726684 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7f6x\" (UniqueName: \"kubernetes.io/projected/2bdbce72-3342-4c5d-9e2f-6757d506d268-kube-api-access-c7f6x\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.727245 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726741 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f5719a9a-0eff-48f5-b634-e4d0a7216828-serviceca\") pod \"node-ca-wgkxz\" (UID: \"f5719a9a-0eff-48f5-b634-e4d0a7216828\") " pod="openshift-image-registry/node-ca-wgkxz" Apr 21 10:03:44.727245 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726802 2567 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 10:03:44.727245 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726811 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1ba26a6f-298a-4d59-9fa5-4f65cc1729c9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rnvsp\" (UID: \"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9\") " pod="openshift-multus/multus-additional-cni-plugins-rnvsp" Apr 21 10:03:44.727245 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726842 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/082858cf-9465-402e-85f9-8bab81da87b1-socket-dir\") pod \"aws-ebs-csi-driver-node-w672n\" (UID: \"082858cf-9465-402e-85f9-8bab81da87b1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w672n" Apr 21 10:03:44.727245 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726895 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-run-ovn\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.727245 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726952 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-etc-sysctl-conf\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc" Apr 21 10:03:44.727245 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.726990 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-lib-modules\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc" Apr 21 10:03:44.727245 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.727002 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/082858cf-9465-402e-85f9-8bab81da87b1-socket-dir\") pod \"aws-ebs-csi-driver-node-w672n\" (UID: \"082858cf-9465-402e-85f9-8bab81da87b1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w672n" Apr 21 10:03:44.727245 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.727017 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-etc-kubernetes\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68" Apr 21 10:03:44.728040 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.727050 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-var-lib-openvswitch\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.728040 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.727075 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2bdbce72-3342-4c5d-9e2f-6757d506d268-env-overrides\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.728040 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.727107 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ba26a6f-298a-4d59-9fa5-4f65cc1729c9-system-cni-dir\") pod \"multus-additional-cni-plugins-rnvsp\" (UID: \"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9\") " pod="openshift-multus/multus-additional-cni-plugins-rnvsp" Apr 21 10:03:44.728040 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.727131 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-etc-sysctl-conf\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc" Apr 21 10:03:44.728040 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.727151 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-sys\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc" Apr 21 10:03:44.728040 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.727159 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f5719a9a-0eff-48f5-b634-e4d0a7216828-serviceca\") pod \"node-ca-wgkxz\" (UID: \"f5719a9a-0eff-48f5-b634-e4d0a7216828\") " pod="openshift-image-registry/node-ca-wgkxz" Apr 21 10:03:44.728040 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.727202 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rdzqg\" (UniqueName: \"kubernetes.io/projected/22eb8c40-ac2f-42ed-897c-2c7e11b8588c-kube-api-access-rdzqg\") pod \"node-resolver-s97qn\" (UID: \"22eb8c40-ac2f-42ed-897c-2c7e11b8588c\") " pod="openshift-dns/node-resolver-s97qn" Apr 21 10:03:44.728040 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.727215 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-lib-modules\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc" Apr 21 10:03:44.728040 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.727182 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ba26a6f-298a-4d59-9fa5-4f65cc1729c9-system-cni-dir\") pod \"multus-additional-cni-plugins-rnvsp\" (UID: \"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9\") " pod="openshift-multus/multus-additional-cni-plugins-rnvsp" Apr 21 10:03:44.728040 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.727262 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-sys\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc" Apr 21 10:03:44.728040 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.727266 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1ba26a6f-298a-4d59-9fa5-4f65cc1729c9-cni-binary-copy\") pod \"multus-additional-cni-plugins-rnvsp\" (UID: \"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9\") " pod="openshift-multus/multus-additional-cni-plugins-rnvsp" Apr 21 10:03:44.728040 ip-10-0-140-231 kubenswrapper[2567]: 
I0421 10:03:44.727373 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bphsl\" (UniqueName: \"kubernetes.io/projected/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-kube-api-access-bphsl\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc"
Apr 21 10:03:44.728040 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.727425 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/082858cf-9465-402e-85f9-8bab81da87b1-device-dir\") pod \"aws-ebs-csi-driver-node-w672n\" (UID: \"082858cf-9465-402e-85f9-8bab81da87b1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w672n"
Apr 21 10:03:44.728040 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.727470 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/082858cf-9465-402e-85f9-8bab81da87b1-etc-selinux\") pod \"aws-ebs-csi-driver-node-w672n\" (UID: \"082858cf-9465-402e-85f9-8bab81da87b1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w672n"
Apr 21 10:03:44.728040 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.727548 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/082858cf-9465-402e-85f9-8bab81da87b1-etc-selinux\") pod \"aws-ebs-csi-driver-node-w672n\" (UID: \"082858cf-9465-402e-85f9-8bab81da87b1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w672n"
Apr 21 10:03:44.728040 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.727562 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/082858cf-9465-402e-85f9-8bab81da87b1-device-dir\") pod \"aws-ebs-csi-driver-node-w672n\" (UID: \"082858cf-9465-402e-85f9-8bab81da87b1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w672n"
Apr 21 10:03:44.728040 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.727613 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-host-run-netns\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68"
Apr 21 10:03:44.728851 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.727643 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-host-cni-bin\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:03:44.728851 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.727671 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dncx\" (UniqueName: \"kubernetes.io/projected/5f39fe1c-70f8-4445-8cfd-646cb496d498-kube-api-access-6dncx\") pod \"network-check-target-jzrcz\" (UID: \"5f39fe1c-70f8-4445-8cfd-646cb496d498\") " pod="openshift-network-diagnostics/network-check-target-jzrcz"
Apr 21 10:03:44.728851 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.727701 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f531049d-f18b-4c01-9df7-a6c394430f98-cni-binary-copy\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68"
Apr 21 10:03:44.728851 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.727799 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-host-var-lib-kubelet\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68"
Apr 21 10:03:44.728851 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.727824 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-etc-systemd\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc"
Apr 21 10:03:44.728851 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.727844 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1ba26a6f-298a-4d59-9fa5-4f65cc1729c9-cni-binary-copy\") pod \"multus-additional-cni-plugins-rnvsp\" (UID: \"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9\") " pod="openshift-multus/multus-additional-cni-plugins-rnvsp"
Apr 21 10:03:44.728851 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.727853 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzfrd\" (UniqueName: \"kubernetes.io/projected/76f0dead-e62b-424c-a40e-6f82d52ba722-kube-api-access-rzfrd\") pod \"iptables-alerter-jzz4r\" (UID: \"76f0dead-e62b-424c-a40e-6f82d52ba722\") " pod="openshift-network-operator/iptables-alerter-jzz4r"
Apr 21 10:03:44.728851 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.727885 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1ba26a6f-298a-4d59-9fa5-4f65cc1729c9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rnvsp\" (UID: \"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9\") " pod="openshift-multus/multus-additional-cni-plugins-rnvsp"
Apr 21 10:03:44.728851 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.727897 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a892f931-ea13-4698-b14e-4a1b739f586c-konnectivity-ca\") pod \"konnectivity-agent-hk9xb\" (UID: \"a892f931-ea13-4698-b14e-4a1b739f586c\") " pod="kube-system/konnectivity-agent-hk9xb"
Apr 21 10:03:44.728851 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.727940 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-etc-systemd\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc"
Apr 21 10:03:44.728851 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728053 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-host-slash\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:03:44.728851 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728109 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5pvxt\" (UniqueName: \"kubernetes.io/projected/f5719a9a-0eff-48f5-b634-e4d0a7216828-kube-api-access-5pvxt\") pod \"node-ca-wgkxz\" (UID: \"f5719a9a-0eff-48f5-b634-e4d0a7216828\") " pod="openshift-image-registry/node-ca-wgkxz"
Apr 21 10:03:44.728851 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728152 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/22eb8c40-ac2f-42ed-897c-2c7e11b8588c-hosts-file\") pod \"node-resolver-s97qn\" (UID: \"22eb8c40-ac2f-42ed-897c-2c7e11b8588c\") " pod="openshift-dns/node-resolver-s97qn"
Apr 21 10:03:44.728851 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728197 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/22eb8c40-ac2f-42ed-897c-2c7e11b8588c-tmp-dir\") pod \"node-resolver-s97qn\" (UID: \"22eb8c40-ac2f-42ed-897c-2c7e11b8588c\") " pod="openshift-dns/node-resolver-s97qn"
Apr 21 10:03:44.728851 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728221 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/082858cf-9465-402e-85f9-8bab81da87b1-registration-dir\") pod \"aws-ebs-csi-driver-node-w672n\" (UID: \"082858cf-9465-402e-85f9-8bab81da87b1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w672n"
Apr 21 10:03:44.728851 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728234 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/22eb8c40-ac2f-42ed-897c-2c7e11b8588c-hosts-file\") pod \"node-resolver-s97qn\" (UID: \"22eb8c40-ac2f-42ed-897c-2c7e11b8588c\") " pod="openshift-dns/node-resolver-s97qn"
Apr 21 10:03:44.728851 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728249 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-host-var-lib-cni-bin\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68"
Apr 21 10:03:44.729638 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728277 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-host-var-lib-cni-multus\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68"
Apr 21 10:03:44.729638 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728301 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-run-openvswitch\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:03:44.729638 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728323 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-host-cni-netd\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:03:44.729638 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728351 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1ba26a6f-298a-4d59-9fa5-4f65cc1729c9-cnibin\") pod \"multus-additional-cni-plugins-rnvsp\" (UID: \"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9\") " pod="openshift-multus/multus-additional-cni-plugins-rnvsp"
Apr 21 10:03:44.729638 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728381 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-tmp\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc"
Apr 21 10:03:44.729638 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728433 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/082858cf-9465-402e-85f9-8bab81da87b1-registration-dir\") pod \"aws-ebs-csi-driver-node-w672n\" (UID: \"082858cf-9465-402e-85f9-8bab81da87b1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w672n"
Apr 21 10:03:44.729638 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728470 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-host-run-k8s-cni-cncf-io\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68"
Apr 21 10:03:44.729638 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728490 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/22eb8c40-ac2f-42ed-897c-2c7e11b8588c-tmp-dir\") pod \"node-resolver-s97qn\" (UID: \"22eb8c40-ac2f-42ed-897c-2c7e11b8588c\") " pod="openshift-dns/node-resolver-s97qn"
Apr 21 10:03:44.729638 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728514 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f531049d-f18b-4c01-9df7-a6c394430f98-multus-daemon-config\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68"
Apr 21 10:03:44.729638 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728529 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1ba26a6f-298a-4d59-9fa5-4f65cc1729c9-cnibin\") pod \"multus-additional-cni-plugins-rnvsp\" (UID: \"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9\") " pod="openshift-multus/multus-additional-cni-plugins-rnvsp"
Apr 21 10:03:44.729638 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728565 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-etc-openvswitch\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:03:44.729638 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728591 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-log-socket\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:03:44.729638 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728616 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2bdbce72-3342-4c5d-9e2f-6757d506d268-ovnkube-script-lib\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:03:44.729638 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728645 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1ba26a6f-298a-4d59-9fa5-4f65cc1729c9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rnvsp\" (UID: \"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9\") " pod="openshift-multus/multus-additional-cni-plugins-rnvsp"
Apr 21 10:03:44.729638 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728703 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-var-lib-kubelet\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc"
Apr 21 10:03:44.729638 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728728 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-etc-tuned\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc"
Apr 21 10:03:44.729638 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728753 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/76f0dead-e62b-424c-a40e-6f82d52ba722-iptables-alerter-script\") pod \"iptables-alerter-jzz4r\" (UID: \"76f0dead-e62b-424c-a40e-6f82d52ba722\") " pod="openshift-network-operator/iptables-alerter-jzz4r"
Apr 21 10:03:44.730268 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728767 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-var-lib-kubelet\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc"
Apr 21 10:03:44.730268 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728781 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-host-run-multus-certs\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68"
Apr 21 10:03:44.730268 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728805 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-host-kubelet\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:03:44.730268 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728830 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-host-run-netns\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:03:44.730268 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728871 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mznx\" (UniqueName: \"kubernetes.io/projected/1ba26a6f-298a-4d59-9fa5-4f65cc1729c9-kube-api-access-2mznx\") pod \"multus-additional-cni-plugins-rnvsp\" (UID: \"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9\") " pod="openshift-multus/multus-additional-cni-plugins-rnvsp"
Apr 21 10:03:44.730268 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728896 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-etc-modprobe-d\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc"
Apr 21 10:03:44.730268 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728920 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwvnb\" (UniqueName: \"kubernetes.io/projected/082858cf-9465-402e-85f9-8bab81da87b1-kube-api-access-nwvnb\") pod \"aws-ebs-csi-driver-node-w672n\" (UID: \"082858cf-9465-402e-85f9-8bab81da87b1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w672n"
Apr 21 10:03:44.730268 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728965 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:03:44.730268 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.728991 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2bdbce72-3342-4c5d-9e2f-6757d506d268-ovn-node-metrics-cert\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:03:44.730268 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.729027 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a892f931-ea13-4698-b14e-4a1b739f586c-konnectivity-ca\") pod \"konnectivity-agent-hk9xb\" (UID: \"a892f931-ea13-4698-b14e-4a1b739f586c\") " pod="kube-system/konnectivity-agent-hk9xb"
Apr 21 10:03:44.730268 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.729032 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs\") pod \"network-metrics-daemon-nwsw4\" (UID: \"dbb00fc1-1258-4254-a360-3c350554925b\") " pod="openshift-multus/network-metrics-daemon-nwsw4"
Apr 21 10:03:44.730268 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.729131 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1ba26a6f-298a-4d59-9fa5-4f65cc1729c9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rnvsp\" (UID: \"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9\") " pod="openshift-multus/multus-additional-cni-plugins-rnvsp"
Apr 21 10:03:44.730268 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.729270 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-etc-modprobe-d\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc"
Apr 21 10:03:44.730268 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.729605 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/76f0dead-e62b-424c-a40e-6f82d52ba722-iptables-alerter-script\") pod \"iptables-alerter-jzz4r\" (UID: \"76f0dead-e62b-424c-a40e-6f82d52ba722\") " pod="openshift-network-operator/iptables-alerter-jzz4r"
Apr 21 10:03:44.730833 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.730639 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a892f931-ea13-4698-b14e-4a1b739f586c-agent-certs\") pod \"konnectivity-agent-hk9xb\" (UID: \"a892f931-ea13-4698-b14e-4a1b739f586c\") " pod="kube-system/konnectivity-agent-hk9xb"
Apr 21 10:03:44.731268 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.731246 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-tmp\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc"
Apr 21 10:03:44.731468 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.731446 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-etc-tuned\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc"
Apr 21 10:03:44.739557 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:44.739414 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 10:03:44.739557 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:44.739438 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 10:03:44.739557 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:44.739451 2567 projected.go:194] Error preparing data for projected volume kube-api-access-6dncx for pod openshift-network-diagnostics/network-check-target-jzrcz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 10:03:44.739557 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:44.739506 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f39fe1c-70f8-4445-8cfd-646cb496d498-kube-api-access-6dncx podName:5f39fe1c-70f8-4445-8cfd-646cb496d498 nodeName:}" failed. No retries permitted until 2026-04-21 10:03:45.239487561 +0000 UTC m=+3.084683050 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-6dncx" (UniqueName: "kubernetes.io/projected/5f39fe1c-70f8-4445-8cfd-646cb496d498-kube-api-access-6dncx") pod "network-check-target-jzrcz" (UID: "5f39fe1c-70f8-4445-8cfd-646cb496d498") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 10:03:44.740858 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.740800 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzfrd\" (UniqueName: \"kubernetes.io/projected/76f0dead-e62b-424c-a40e-6f82d52ba722-kube-api-access-rzfrd\") pod \"iptables-alerter-jzz4r\" (UID: \"76f0dead-e62b-424c-a40e-6f82d52ba722\") " pod="openshift-network-operator/iptables-alerter-jzz4r"
Apr 21 10:03:44.741735 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.741712 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pvxt\" (UniqueName: \"kubernetes.io/projected/f5719a9a-0eff-48f5-b634-e4d0a7216828-kube-api-access-5pvxt\") pod \"node-ca-wgkxz\" (UID: \"f5719a9a-0eff-48f5-b634-e4d0a7216828\") " pod="openshift-image-registry/node-ca-wgkxz"
Apr 21 10:03:44.741951 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.741930 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bphsl\" (UniqueName: \"kubernetes.io/projected/7db2250c-0db7-4d5b-8890-c0dcb9a1171d-kube-api-access-bphsl\") pod \"tuned-jj5fc\" (UID: \"7db2250c-0db7-4d5b-8890-c0dcb9a1171d\") " pod="openshift-cluster-node-tuning-operator/tuned-jj5fc"
Apr 21 10:03:44.742406 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.742388 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdzqg\" (UniqueName: \"kubernetes.io/projected/22eb8c40-ac2f-42ed-897c-2c7e11b8588c-kube-api-access-rdzqg\") pod \"node-resolver-s97qn\" (UID: \"22eb8c40-ac2f-42ed-897c-2c7e11b8588c\") " pod="openshift-dns/node-resolver-s97qn"
Apr 21 10:03:44.742520 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.742498 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mznx\" (UniqueName: \"kubernetes.io/projected/1ba26a6f-298a-4d59-9fa5-4f65cc1729c9-kube-api-access-2mznx\") pod \"multus-additional-cni-plugins-rnvsp\" (UID: \"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9\") " pod="openshift-multus/multus-additional-cni-plugins-rnvsp"
Apr 21 10:03:44.742754 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.742731 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwvnb\" (UniqueName: \"kubernetes.io/projected/082858cf-9465-402e-85f9-8bab81da87b1-kube-api-access-nwvnb\") pod \"aws-ebs-csi-driver-node-w672n\" (UID: \"082858cf-9465-402e-85f9-8bab81da87b1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w672n"
Apr 21 10:03:44.829572 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.829546 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hmljj\" (UniqueName: \"kubernetes.io/projected/f531049d-f18b-4c01-9df7-a6c394430f98-kube-api-access-hmljj\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68"
Apr 21 10:03:44.829572 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.829575 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-node-log\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:03:44.829783 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.829591 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-hostroot\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68"
Apr 21 10:03:44.829783 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.829605 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-multus-conf-dir\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68"
Apr 21 10:03:44.829783 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.829621 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-cnibin\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68"
Apr 21 10:03:44.829783 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.829641 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-os-release\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68"
Apr 21 10:03:44.829783 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.829666 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-hostroot\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68"
Apr 21 10:03:44.829783 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.829679 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-systemd-units\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:03:44.829783 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.829688 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-multus-conf-dir\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68"
Apr 21 10:03:44.829783 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.829697 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-cnibin\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68"
Apr 21 10:03:44.829783 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.829713 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-run-systemd\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:03:44.829783 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.829716 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-systemd-units\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:03:44.829783 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.829728 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-os-release\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68"
Apr 21 10:03:44.829783 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.829742 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2bdbce72-3342-4c5d-9e2f-6757d506d268-ovnkube-config\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:03:44.829783 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.829746 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-run-systemd\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:03:44.829783 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.829784 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7f6x\" (UniqueName: \"kubernetes.io/projected/2bdbce72-3342-4c5d-9e2f-6757d506d268-kube-api-access-c7f6x\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:03:44.830410 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.829731 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-node-log\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:03:44.830410 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.829816 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-run-ovn\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:03:44.830410 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.829843 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-etc-kubernetes\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68"
Apr 21 10:03:44.830410 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.829858 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-var-lib-openvswitch\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:03:44.830410 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.829867 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-run-ovn\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:03:44.830410 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.829874 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2bdbce72-3342-4c5d-9e2f-6757d506d268-env-overrides\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:03:44.830410 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.829916 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-host-run-netns\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68"
Apr 21 10:03:44.830410 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.829937 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-var-lib-openvswitch\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:03:44.830410 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.829918 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-etc-kubernetes\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68"
Apr 21 10:03:44.830410 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.829942 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-host-cni-bin\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:03:44.830410 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.829960 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-host-run-netns\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68"
Apr 21 10:03:44.830410 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.829993 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f531049d-f18b-4c01-9df7-a6c394430f98-cni-binary-copy\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68"
Apr 21 10:03:44.830410 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830036 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-host-var-lib-kubelet\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68"
Apr 21 10:03:44.830410 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.829990 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-host-cni-bin\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:03:44.830410 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830078 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-host-slash\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:03:44.830410 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830102 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-host-var-lib-kubelet\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68"
Apr 21 10:03:44.830410 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830108 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-host-var-lib-cni-bin\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68"
Apr 21 10:03:44.830410 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830142 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName:
\"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-host-slash\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.831147 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830151 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-host-var-lib-cni-multus\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68" Apr 21 10:03:44.831147 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830192 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-run-openvswitch\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.831147 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830196 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-host-var-lib-cni-bin\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68" Apr 21 10:03:44.831147 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830233 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-host-var-lib-cni-multus\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68" Apr 21 10:03:44.831147 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830255 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-run-openvswitch\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.831147 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830263 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-host-cni-netd\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.831147 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830269 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2bdbce72-3342-4c5d-9e2f-6757d506d268-env-overrides\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.831147 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830294 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-host-cni-netd\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.831147 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830295 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-host-run-k8s-cni-cncf-io\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68" Apr 21 10:03:44.831147 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830323 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-host-run-k8s-cni-cncf-io\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68" Apr 21 10:03:44.831147 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830328 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f531049d-f18b-4c01-9df7-a6c394430f98-multus-daemon-config\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68" Apr 21 10:03:44.831147 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830352 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-etc-openvswitch\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.831147 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830382 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-log-socket\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.831147 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830385 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-etc-openvswitch\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.831147 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830387 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/f531049d-f18b-4c01-9df7-a6c394430f98-cni-binary-copy\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68" Apr 21 10:03:44.831147 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830407 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2bdbce72-3342-4c5d-9e2f-6757d506d268-ovnkube-script-lib\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.831147 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830433 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-host-run-multus-certs\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68" Apr 21 10:03:44.831147 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830436 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-log-socket\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.831970 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830460 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-host-kubelet\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.831970 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830490 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-host-kubelet\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.831970 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830464 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-host-run-multus-certs\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68" Apr 21 10:03:44.831970 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830512 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-host-run-netns\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.831970 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830542 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.831970 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830558 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-host-run-netns\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.831970 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830572 2567 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2bdbce72-3342-4c5d-9e2f-6757d506d268-ovn-node-metrics-cert\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.831970 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830596 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs\") pod \"network-metrics-daemon-nwsw4\" (UID: \"dbb00fc1-1258-4254-a360-3c350554925b\") " pod="openshift-multus/network-metrics-daemon-nwsw4" Apr 21 10:03:44.831970 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830598 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.831970 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:44.830672 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:44.831970 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830710 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkk8p\" (UniqueName: \"kubernetes.io/projected/dbb00fc1-1258-4254-a360-3c350554925b-kube-api-access-nkk8p\") pod \"network-metrics-daemon-nwsw4\" (UID: \"dbb00fc1-1258-4254-a360-3c350554925b\") " pod="openshift-multus/network-metrics-daemon-nwsw4" Apr 21 10:03:44.831970 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830718 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/f531049d-f18b-4c01-9df7-a6c394430f98-multus-daemon-config\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68" Apr 21 10:03:44.831970 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:44.830737 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs podName:dbb00fc1-1258-4254-a360-3c350554925b nodeName:}" failed. No retries permitted until 2026-04-21 10:03:45.330720162 +0000 UTC m=+3.175915648 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs") pod "network-metrics-daemon-nwsw4" (UID: "dbb00fc1-1258-4254-a360-3c350554925b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:44.831970 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830765 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-multus-cni-dir\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68" Apr 21 10:03:44.831970 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830792 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-system-cni-dir\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68" Apr 21 10:03:44.831970 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830816 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-multus-socket-dir-parent\") pod \"multus-bnl68\" (UID: 
\"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68" Apr 21 10:03:44.831970 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830842 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-host-run-ovn-kubernetes\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.832818 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830874 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-multus-cni-dir\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68" Apr 21 10:03:44.832818 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830911 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bdbce72-3342-4c5d-9e2f-6757d506d268-host-run-ovn-kubernetes\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.832818 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830958 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-system-cni-dir\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68" Apr 21 10:03:44.832818 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.830993 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f531049d-f18b-4c01-9df7-a6c394430f98-multus-socket-dir-parent\") pod \"multus-bnl68\" (UID: 
\"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68" Apr 21 10:03:44.832818 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.831234 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2bdbce72-3342-4c5d-9e2f-6757d506d268-ovnkube-config\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.832818 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.831390 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2bdbce72-3342-4c5d-9e2f-6757d506d268-ovnkube-script-lib\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.833095 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.832981 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2bdbce72-3342-4c5d-9e2f-6757d506d268-ovn-node-metrics-cert\") pod \"ovnkube-node-tqctk\" (UID: \"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.837077 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.837052 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmljj\" (UniqueName: \"kubernetes.io/projected/f531049d-f18b-4c01-9df7-a6c394430f98-kube-api-access-hmljj\") pod \"multus-bnl68\" (UID: \"f531049d-f18b-4c01-9df7-a6c394430f98\") " pod="openshift-multus/multus-bnl68" Apr 21 10:03:44.837351 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.837329 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7f6x\" (UniqueName: \"kubernetes.io/projected/2bdbce72-3342-4c5d-9e2f-6757d506d268-kube-api-access-c7f6x\") pod \"ovnkube-node-tqctk\" (UID: 
\"2bdbce72-3342-4c5d-9e2f-6757d506d268\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:44.838214 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.838197 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkk8p\" (UniqueName: \"kubernetes.io/projected/dbb00fc1-1258-4254-a360-3c350554925b-kube-api-access-nkk8p\") pod \"network-metrics-daemon-nwsw4\" (UID: \"dbb00fc1-1258-4254-a360-3c350554925b\") " pod="openshift-multus/network-metrics-daemon-nwsw4" Apr 21 10:03:44.915118 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.915096 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wgkxz" Apr 21 10:03:44.922945 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.922925 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-hk9xb" Apr 21 10:03:44.932265 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.932249 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-s97qn" Apr 21 10:03:44.938766 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.938749 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rnvsp" Apr 21 10:03:44.946275 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.946259 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w672n" Apr 21 10:03:44.952798 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.952782 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jj5fc" Apr 21 10:03:44.962308 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.962290 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-jzz4r" Apr 21 10:03:44.967790 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.967772 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bnl68" Apr 21 10:03:44.973398 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:44.973382 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" Apr 21 10:03:45.176713 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:45.176502 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22eb8c40_ac2f_42ed_897c_2c7e11b8588c.slice/crio-a4c7696553c65c83bcd75765e3fe7f509bb62b19b54b5860d54f32b4b9b874ec WatchSource:0}: Error finding container a4c7696553c65c83bcd75765e3fe7f509bb62b19b54b5860d54f32b4b9b874ec: Status 404 returned error can't find the container with id a4c7696553c65c83bcd75765e3fe7f509bb62b19b54b5860d54f32b4b9b874ec Apr 21 10:03:45.177776 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:45.177743 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76f0dead_e62b_424c_a40e_6f82d52ba722.slice/crio-f63e8b0a05c5bd3ea72fad56cb5dd59147fbe0432085911d54dd5f0b6b7f4ced WatchSource:0}: Error finding container f63e8b0a05c5bd3ea72fad56cb5dd59147fbe0432085911d54dd5f0b6b7f4ced: Status 404 returned error can't find the container with id f63e8b0a05c5bd3ea72fad56cb5dd59147fbe0432085911d54dd5f0b6b7f4ced Apr 21 10:03:45.183272 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:45.183001 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod082858cf_9465_402e_85f9_8bab81da87b1.slice/crio-366258cb93fb8c19b4a793b6b6e69c7c1fb0a442b36bc87ec44c88c82a81e649 WatchSource:0}: Error finding container 
366258cb93fb8c19b4a793b6b6e69c7c1fb0a442b36bc87ec44c88c82a81e649: Status 404 returned error can't find the container with id 366258cb93fb8c19b4a793b6b6e69c7c1fb0a442b36bc87ec44c88c82a81e649 Apr 21 10:03:45.183272 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:45.183200 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf531049d_f18b_4c01_9df7_a6c394430f98.slice/crio-9c216a3b6c458c5c72e79c27849b1e108cf5f8645226ec795f6c14208f87cde5 WatchSource:0}: Error finding container 9c216a3b6c458c5c72e79c27849b1e108cf5f8645226ec795f6c14208f87cde5: Status 404 returned error can't find the container with id 9c216a3b6c458c5c72e79c27849b1e108cf5f8645226ec795f6c14208f87cde5 Apr 21 10:03:45.184438 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:45.184375 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5719a9a_0eff_48f5_b634_e4d0a7216828.slice/crio-987613c337120ff607d1aaea66dcc65d1d51904d57727e09d6ee3969327bb5c6 WatchSource:0}: Error finding container 987613c337120ff607d1aaea66dcc65d1d51904d57727e09d6ee3969327bb5c6: Status 404 returned error can't find the container with id 987613c337120ff607d1aaea66dcc65d1d51904d57727e09d6ee3969327bb5c6 Apr 21 10:03:45.185807 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:03:45.185520 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bdbce72_3342_4c5d_9e2f_6757d506d268.slice/crio-57056f4feca65eb68b2d9d18e69f1e99599d803bc897888e9ad2c561b88664d6 WatchSource:0}: Error finding container 57056f4feca65eb68b2d9d18e69f1e99599d803bc897888e9ad2c561b88664d6: Status 404 returned error can't find the container with id 57056f4feca65eb68b2d9d18e69f1e99599d803bc897888e9ad2c561b88664d6 Apr 21 10:03:45.334805 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:45.334765 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6dncx\" (UniqueName: \"kubernetes.io/projected/5f39fe1c-70f8-4445-8cfd-646cb496d498-kube-api-access-6dncx\") pod \"network-check-target-jzrcz\" (UID: \"5f39fe1c-70f8-4445-8cfd-646cb496d498\") " pod="openshift-network-diagnostics/network-check-target-jzrcz" Apr 21 10:03:45.334906 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:45.334826 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs\") pod \"network-metrics-daemon-nwsw4\" (UID: \"dbb00fc1-1258-4254-a360-3c350554925b\") " pod="openshift-multus/network-metrics-daemon-nwsw4" Apr 21 10:03:45.334972 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:45.334945 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:45.335028 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:45.334997 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs podName:dbb00fc1-1258-4254-a360-3c350554925b nodeName:}" failed. No retries permitted until 2026-04-21 10:03:46.334978375 +0000 UTC m=+4.180173880 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs") pod "network-metrics-daemon-nwsw4" (UID: "dbb00fc1-1258-4254-a360-3c350554925b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:45.335223 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:45.335202 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:03:45.335298 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:45.335228 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:03:45.335298 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:45.335241 2567 projected.go:194] Error preparing data for projected volume kube-api-access-6dncx for pod openshift-network-diagnostics/network-check-target-jzrcz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:45.335298 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:45.335293 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f39fe1c-70f8-4445-8cfd-646cb496d498-kube-api-access-6dncx podName:5f39fe1c-70f8-4445-8cfd-646cb496d498 nodeName:}" failed. No retries permitted until 2026-04-21 10:03:46.335276853 +0000 UTC m=+4.180472351 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-6dncx" (UniqueName: "kubernetes.io/projected/5f39fe1c-70f8-4445-8cfd-646cb496d498-kube-api-access-6dncx") pod "network-check-target-jzrcz" (UID: "5f39fe1c-70f8-4445-8cfd-646cb496d498") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:45.655604 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:45.655550 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 09:58:43 +0000 UTC" deadline="2027-12-19 21:19:02.428148076 +0000 UTC" Apr 21 10:03:45.655604 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:45.655585 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14579h15m16.772566773s" Apr 21 10:03:45.719508 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:45.719449 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jj5fc" event={"ID":"7db2250c-0db7-4d5b-8890-c0dcb9a1171d","Type":"ContainerStarted","Data":"3fbc38fa09a5d237fc5b1f7738b726d9995c50d5f20edfa9505a525b426d3e1c"} Apr 21 10:03:45.724412 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:45.724351 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" event={"ID":"2bdbce72-3342-4c5d-9e2f-6757d506d268","Type":"ContainerStarted","Data":"57056f4feca65eb68b2d9d18e69f1e99599d803bc897888e9ad2c561b88664d6"} Apr 21 10:03:45.727565 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:45.727516 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rnvsp" event={"ID":"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9","Type":"ContainerStarted","Data":"73a7a17b7085cc5fcc4f40a2eaaf5767d61e00386bf14dbb77b82bb2c5ad6542"} Apr 21 10:03:45.740402 ip-10-0-140-231 
kubenswrapper[2567]: I0421 10:03:45.740290 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jzz4r" event={"ID":"76f0dead-e62b-424c-a40e-6f82d52ba722","Type":"ContainerStarted","Data":"f63e8b0a05c5bd3ea72fad56cb5dd59147fbe0432085911d54dd5f0b6b7f4ced"} Apr 21 10:03:45.758307 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:45.758252 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-hk9xb" event={"ID":"a892f931-ea13-4698-b14e-4a1b739f586c","Type":"ContainerStarted","Data":"4a9975e5161fa190b956672bd234e6d995472ce263589d24fbda83dea8fcb14a"} Apr 21 10:03:45.760904 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:45.760873 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wgkxz" event={"ID":"f5719a9a-0eff-48f5-b634-e4d0a7216828","Type":"ContainerStarted","Data":"987613c337120ff607d1aaea66dcc65d1d51904d57727e09d6ee3969327bb5c6"} Apr 21 10:03:45.767319 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:45.767294 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w672n" event={"ID":"082858cf-9465-402e-85f9-8bab81da87b1","Type":"ContainerStarted","Data":"366258cb93fb8c19b4a793b6b6e69c7c1fb0a442b36bc87ec44c88c82a81e649"} Apr 21 10:03:45.769466 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:45.769440 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s97qn" event={"ID":"22eb8c40-ac2f-42ed-897c-2c7e11b8588c","Type":"ContainerStarted","Data":"a4c7696553c65c83bcd75765e3fe7f509bb62b19b54b5860d54f32b4b9b874ec"} Apr 21 10:03:45.780978 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:45.780955 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-231.ec2.internal" 
event={"ID":"3d4599cdcd53cd71a013b6e6939ddb90","Type":"ContainerStarted","Data":"0061d82c808d4c316b56f2038f5ddeb58b349d7faa7baaeb980b80424c69f3ba"} Apr 21 10:03:45.802333 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:45.802290 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bnl68" event={"ID":"f531049d-f18b-4c01-9df7-a6c394430f98","Type":"ContainerStarted","Data":"9c216a3b6c458c5c72e79c27849b1e108cf5f8645226ec795f6c14208f87cde5"} Apr 21 10:03:46.344349 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:46.344268 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dncx\" (UniqueName: \"kubernetes.io/projected/5f39fe1c-70f8-4445-8cfd-646cb496d498-kube-api-access-6dncx\") pod \"network-check-target-jzrcz\" (UID: \"5f39fe1c-70f8-4445-8cfd-646cb496d498\") " pod="openshift-network-diagnostics/network-check-target-jzrcz" Apr 21 10:03:46.344349 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:46.344341 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs\") pod \"network-metrics-daemon-nwsw4\" (UID: \"dbb00fc1-1258-4254-a360-3c350554925b\") " pod="openshift-multus/network-metrics-daemon-nwsw4" Apr 21 10:03:46.344567 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:46.344483 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:46.344567 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:46.344550 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs podName:dbb00fc1-1258-4254-a360-3c350554925b nodeName:}" failed. No retries permitted until 2026-04-21 10:03:48.34453192 +0000 UTC m=+6.189727409 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs") pod "network-metrics-daemon-nwsw4" (UID: "dbb00fc1-1258-4254-a360-3c350554925b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:46.344992 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:46.344973 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:03:46.345077 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:46.344996 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:03:46.345077 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:46.345008 2567 projected.go:194] Error preparing data for projected volume kube-api-access-6dncx for pod openshift-network-diagnostics/network-check-target-jzrcz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:46.345077 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:46.345050 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f39fe1c-70f8-4445-8cfd-646cb496d498-kube-api-access-6dncx podName:5f39fe1c-70f8-4445-8cfd-646cb496d498 nodeName:}" failed. No retries permitted until 2026-04-21 10:03:48.345035922 +0000 UTC m=+6.190231411 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-6dncx" (UniqueName: "kubernetes.io/projected/5f39fe1c-70f8-4445-8cfd-646cb496d498-kube-api-access-6dncx") pod "network-check-target-jzrcz" (UID: "5f39fe1c-70f8-4445-8cfd-646cb496d498") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:46.527126 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:46.527099 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 10:03:46.713314 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:46.712604 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwsw4" Apr 21 10:03:46.713314 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:46.712728 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nwsw4" podUID="dbb00fc1-1258-4254-a360-3c350554925b" Apr 21 10:03:46.713314 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:46.713149 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzrcz" Apr 21 10:03:46.713314 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:46.713250 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jzrcz" podUID="5f39fe1c-70f8-4445-8cfd-646cb496d498" Apr 21 10:03:46.814001 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:46.813967 2567 generic.go:358] "Generic (PLEG): container finished" podID="c0bc0d1d980f249ac792c6594624c04d" containerID="5d2e4ff7a90e596f9d77b76e9a50deb6178a8d793be7f537dd183997c88ffa37" exitCode=0 Apr 21 10:03:46.814813 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:46.814786 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-231.ec2.internal" event={"ID":"c0bc0d1d980f249ac792c6594624c04d","Type":"ContainerDied","Data":"5d2e4ff7a90e596f9d77b76e9a50deb6178a8d793be7f537dd183997c88ffa37"} Apr 21 10:03:46.829503 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:46.828643 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-231.ec2.internal" podStartSLOduration=2.828626202 podStartE2EDuration="2.828626202s" podCreationTimestamp="2026-04-21 10:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:03:45.795717456 +0000 UTC m=+3.640912965" watchObservedRunningTime="2026-04-21 10:03:46.828626202 +0000 UTC m=+4.673821708" Apr 21 10:03:47.818724 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:47.818676 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-231.ec2.internal" event={"ID":"c0bc0d1d980f249ac792c6594624c04d","Type":"ContainerStarted","Data":"c8a547dfac9e34e4cb4907aee97cf1ae2eb083df6af1a8fa8ec850b519a406e0"} Apr 21 10:03:47.846709 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:47.846659 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-231.ec2.internal" 
podStartSLOduration=3.846604243 podStartE2EDuration="3.846604243s" podCreationTimestamp="2026-04-21 10:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:03:47.844439874 +0000 UTC m=+5.689635374" watchObservedRunningTime="2026-04-21 10:03:47.846604243 +0000 UTC m=+5.691799749" Apr 21 10:03:48.359972 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:48.359932 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs\") pod \"network-metrics-daemon-nwsw4\" (UID: \"dbb00fc1-1258-4254-a360-3c350554925b\") " pod="openshift-multus/network-metrics-daemon-nwsw4" Apr 21 10:03:48.360139 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:48.360014 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dncx\" (UniqueName: \"kubernetes.io/projected/5f39fe1c-70f8-4445-8cfd-646cb496d498-kube-api-access-6dncx\") pod \"network-check-target-jzrcz\" (UID: \"5f39fe1c-70f8-4445-8cfd-646cb496d498\") " pod="openshift-network-diagnostics/network-check-target-jzrcz" Apr 21 10:03:48.360226 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:48.360152 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:03:48.360226 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:48.360184 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:03:48.360226 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:48.360197 2567 projected.go:194] Error preparing data for projected volume kube-api-access-6dncx for pod openshift-network-diagnostics/network-check-target-jzrcz: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:48.360390 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:48.360254 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f39fe1c-70f8-4445-8cfd-646cb496d498-kube-api-access-6dncx podName:5f39fe1c-70f8-4445-8cfd-646cb496d498 nodeName:}" failed. No retries permitted until 2026-04-21 10:03:52.360236849 +0000 UTC m=+10.205432339 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-6dncx" (UniqueName: "kubernetes.io/projected/5f39fe1c-70f8-4445-8cfd-646cb496d498-kube-api-access-6dncx") pod "network-check-target-jzrcz" (UID: "5f39fe1c-70f8-4445-8cfd-646cb496d498") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:48.360716 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:48.360695 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:48.360784 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:48.360764 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs podName:dbb00fc1-1258-4254-a360-3c350554925b nodeName:}" failed. No retries permitted until 2026-04-21 10:03:52.360740168 +0000 UTC m=+10.205935666 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs") pod "network-metrics-daemon-nwsw4" (UID: "dbb00fc1-1258-4254-a360-3c350554925b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:48.709969 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:48.709657 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzrcz" Apr 21 10:03:48.709969 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:48.709787 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jzrcz" podUID="5f39fe1c-70f8-4445-8cfd-646cb496d498" Apr 21 10:03:48.709969 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:48.709830 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwsw4" Apr 21 10:03:48.709969 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:48.709952 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nwsw4" podUID="dbb00fc1-1258-4254-a360-3c350554925b" Apr 21 10:03:50.710346 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:50.710312 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzrcz" Apr 21 10:03:50.710775 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:50.710436 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jzrcz" podUID="5f39fe1c-70f8-4445-8cfd-646cb496d498" Apr 21 10:03:50.710861 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:50.710840 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwsw4" Apr 21 10:03:50.710991 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:50.710960 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nwsw4" podUID="dbb00fc1-1258-4254-a360-3c350554925b" Apr 21 10:03:51.324229 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:51.324192 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-mg976"] Apr 21 10:03:51.327072 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:51.327033 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-mg976" Apr 21 10:03:51.327203 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:51.327110 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mg976" podUID="79560f0b-3eeb-4ad8-9eca-ac25ba5bf424" Apr 21 10:03:51.382154 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:51.382122 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/79560f0b-3eeb-4ad8-9eca-ac25ba5bf424-original-pull-secret\") pod \"global-pull-secret-syncer-mg976\" (UID: \"79560f0b-3eeb-4ad8-9eca-ac25ba5bf424\") " pod="kube-system/global-pull-secret-syncer-mg976" Apr 21 10:03:51.382319 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:51.382256 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/79560f0b-3eeb-4ad8-9eca-ac25ba5bf424-dbus\") pod \"global-pull-secret-syncer-mg976\" (UID: \"79560f0b-3eeb-4ad8-9eca-ac25ba5bf424\") " pod="kube-system/global-pull-secret-syncer-mg976" Apr 21 10:03:51.382383 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:51.382321 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/79560f0b-3eeb-4ad8-9eca-ac25ba5bf424-kubelet-config\") pod \"global-pull-secret-syncer-mg976\" (UID: \"79560f0b-3eeb-4ad8-9eca-ac25ba5bf424\") " pod="kube-system/global-pull-secret-syncer-mg976" Apr 21 10:03:51.483542 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:51.483511 2567 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/79560f0b-3eeb-4ad8-9eca-ac25ba5bf424-kubelet-config\") pod \"global-pull-secret-syncer-mg976\" (UID: \"79560f0b-3eeb-4ad8-9eca-ac25ba5bf424\") " pod="kube-system/global-pull-secret-syncer-mg976" Apr 21 10:03:51.483690 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:51.483583 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/79560f0b-3eeb-4ad8-9eca-ac25ba5bf424-original-pull-secret\") pod \"global-pull-secret-syncer-mg976\" (UID: \"79560f0b-3eeb-4ad8-9eca-ac25ba5bf424\") " pod="kube-system/global-pull-secret-syncer-mg976" Apr 21 10:03:51.483690 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:51.483624 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/79560f0b-3eeb-4ad8-9eca-ac25ba5bf424-dbus\") pod \"global-pull-secret-syncer-mg976\" (UID: \"79560f0b-3eeb-4ad8-9eca-ac25ba5bf424\") " pod="kube-system/global-pull-secret-syncer-mg976" Apr 21 10:03:51.483807 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:51.483791 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/79560f0b-3eeb-4ad8-9eca-ac25ba5bf424-dbus\") pod \"global-pull-secret-syncer-mg976\" (UID: \"79560f0b-3eeb-4ad8-9eca-ac25ba5bf424\") " pod="kube-system/global-pull-secret-syncer-mg976" Apr 21 10:03:51.483863 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:51.483854 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/79560f0b-3eeb-4ad8-9eca-ac25ba5bf424-kubelet-config\") pod \"global-pull-secret-syncer-mg976\" (UID: \"79560f0b-3eeb-4ad8-9eca-ac25ba5bf424\") " pod="kube-system/global-pull-secret-syncer-mg976" Apr 21 10:03:51.483962 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:51.483947 2567 secret.go:189] Couldn't get 
secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:51.484030 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:51.484007 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79560f0b-3eeb-4ad8-9eca-ac25ba5bf424-original-pull-secret podName:79560f0b-3eeb-4ad8-9eca-ac25ba5bf424 nodeName:}" failed. No retries permitted until 2026-04-21 10:03:51.983989349 +0000 UTC m=+9.829184839 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/79560f0b-3eeb-4ad8-9eca-ac25ba5bf424-original-pull-secret") pod "global-pull-secret-syncer-mg976" (UID: "79560f0b-3eeb-4ad8-9eca-ac25ba5bf424") : object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:51.986537 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:51.986477 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/79560f0b-3eeb-4ad8-9eca-ac25ba5bf424-original-pull-secret\") pod \"global-pull-secret-syncer-mg976\" (UID: \"79560f0b-3eeb-4ad8-9eca-ac25ba5bf424\") " pod="kube-system/global-pull-secret-syncer-mg976" Apr 21 10:03:51.986988 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:51.986647 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:51.986988 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:51.986733 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79560f0b-3eeb-4ad8-9eca-ac25ba5bf424-original-pull-secret podName:79560f0b-3eeb-4ad8-9eca-ac25ba5bf424 nodeName:}" failed. No retries permitted until 2026-04-21 10:03:52.986713282 +0000 UTC m=+10.831908767 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/79560f0b-3eeb-4ad8-9eca-ac25ba5bf424-original-pull-secret") pod "global-pull-secret-syncer-mg976" (UID: "79560f0b-3eeb-4ad8-9eca-ac25ba5bf424") : object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:52.390328 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:52.389895 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dncx\" (UniqueName: \"kubernetes.io/projected/5f39fe1c-70f8-4445-8cfd-646cb496d498-kube-api-access-6dncx\") pod \"network-check-target-jzrcz\" (UID: \"5f39fe1c-70f8-4445-8cfd-646cb496d498\") " pod="openshift-network-diagnostics/network-check-target-jzrcz" Apr 21 10:03:52.390328 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:52.389972 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs\") pod \"network-metrics-daemon-nwsw4\" (UID: \"dbb00fc1-1258-4254-a360-3c350554925b\") " pod="openshift-multus/network-metrics-daemon-nwsw4" Apr 21 10:03:52.390328 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:52.390079 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:03:52.390328 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:52.390104 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:03:52.390328 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:52.390113 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:52.390328 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:52.390117 2567 projected.go:194] 
Error preparing data for projected volume kube-api-access-6dncx for pod openshift-network-diagnostics/network-check-target-jzrcz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:52.390328 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:52.390197 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs podName:dbb00fc1-1258-4254-a360-3c350554925b nodeName:}" failed. No retries permitted until 2026-04-21 10:04:00.390159572 +0000 UTC m=+18.235355061 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs") pod "network-metrics-daemon-nwsw4" (UID: "dbb00fc1-1258-4254-a360-3c350554925b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:52.390328 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:52.390243 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f39fe1c-70f8-4445-8cfd-646cb496d498-kube-api-access-6dncx podName:5f39fe1c-70f8-4445-8cfd-646cb496d498 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:00.390223893 +0000 UTC m=+18.235419392 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-6dncx" (UniqueName: "kubernetes.io/projected/5f39fe1c-70f8-4445-8cfd-646cb496d498-kube-api-access-6dncx") pod "network-check-target-jzrcz" (UID: "5f39fe1c-70f8-4445-8cfd-646cb496d498") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:52.711354 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:52.710819 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzrcz" Apr 21 10:03:52.711354 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:52.710880 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwsw4" Apr 21 10:03:52.711354 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:52.710903 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mg976" Apr 21 10:03:52.711354 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:52.710985 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nwsw4" podUID="dbb00fc1-1258-4254-a360-3c350554925b" Apr 21 10:03:52.711354 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:52.711097 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mg976" podUID="79560f0b-3eeb-4ad8-9eca-ac25ba5bf424" Apr 21 10:03:52.711354 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:52.711198 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jzrcz" podUID="5f39fe1c-70f8-4445-8cfd-646cb496d498" Apr 21 10:03:52.995424 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:52.994930 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/79560f0b-3eeb-4ad8-9eca-ac25ba5bf424-original-pull-secret\") pod \"global-pull-secret-syncer-mg976\" (UID: \"79560f0b-3eeb-4ad8-9eca-ac25ba5bf424\") " pod="kube-system/global-pull-secret-syncer-mg976" Apr 21 10:03:52.995424 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:52.995104 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:52.995424 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:52.995162 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79560f0b-3eeb-4ad8-9eca-ac25ba5bf424-original-pull-secret podName:79560f0b-3eeb-4ad8-9eca-ac25ba5bf424 nodeName:}" failed. No retries permitted until 2026-04-21 10:03:54.995142828 +0000 UTC m=+12.840338316 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/79560f0b-3eeb-4ad8-9eca-ac25ba5bf424-original-pull-secret") pod "global-pull-secret-syncer-mg976" (UID: "79560f0b-3eeb-4ad8-9eca-ac25ba5bf424") : object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:54.710582 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:54.710545 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzrcz"
Apr 21 10:03:54.711021 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:54.710654 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jzrcz" podUID="5f39fe1c-70f8-4445-8cfd-646cb496d498"
Apr 21 10:03:54.711021 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:54.710668 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mg976"
Apr 21 10:03:54.711021 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:54.710548 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwsw4"
Apr 21 10:03:54.711021 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:54.710759 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mg976" podUID="79560f0b-3eeb-4ad8-9eca-ac25ba5bf424"
Apr 21 10:03:54.711021 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:54.710910 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nwsw4" podUID="dbb00fc1-1258-4254-a360-3c350554925b"
Apr 21 10:03:55.009189 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:55.009109 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/79560f0b-3eeb-4ad8-9eca-ac25ba5bf424-original-pull-secret\") pod \"global-pull-secret-syncer-mg976\" (UID: \"79560f0b-3eeb-4ad8-9eca-ac25ba5bf424\") " pod="kube-system/global-pull-secret-syncer-mg976"
Apr 21 10:03:55.009316 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:55.009247 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 10:03:55.009316 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:55.009300 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79560f0b-3eeb-4ad8-9eca-ac25ba5bf424-original-pull-secret podName:79560f0b-3eeb-4ad8-9eca-ac25ba5bf424 nodeName:}" failed. No retries permitted until 2026-04-21 10:03:59.009287165 +0000 UTC m=+16.854482649 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/79560f0b-3eeb-4ad8-9eca-ac25ba5bf424-original-pull-secret") pod "global-pull-secret-syncer-mg976" (UID: "79560f0b-3eeb-4ad8-9eca-ac25ba5bf424") : object "kube-system"/"original-pull-secret" not registered
Apr 21 10:03:56.710630 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:56.710595 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwsw4"
Apr 21 10:03:56.711046 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:56.710595 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzrcz"
Apr 21 10:03:56.711046 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:56.710720 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nwsw4" podUID="dbb00fc1-1258-4254-a360-3c350554925b"
Apr 21 10:03:56.711046 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:56.710595 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mg976"
Apr 21 10:03:56.711046 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:56.710810 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jzrcz" podUID="5f39fe1c-70f8-4445-8cfd-646cb496d498"
Apr 21 10:03:56.711046 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:56.710905 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mg976" podUID="79560f0b-3eeb-4ad8-9eca-ac25ba5bf424"
Apr 21 10:03:58.710599 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:58.710564 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mg976"
Apr 21 10:03:58.710599 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:58.710586 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzrcz"
Apr 21 10:03:58.711087 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:58.710564 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwsw4"
Apr 21 10:03:58.711087 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:58.710685 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mg976" podUID="79560f0b-3eeb-4ad8-9eca-ac25ba5bf424"
Apr 21 10:03:58.711087 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:58.710756 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jzrcz" podUID="5f39fe1c-70f8-4445-8cfd-646cb496d498"
Apr 21 10:03:58.711087 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:58.710855 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nwsw4" podUID="dbb00fc1-1258-4254-a360-3c350554925b"
Apr 21 10:03:59.044520 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:03:59.044432 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/79560f0b-3eeb-4ad8-9eca-ac25ba5bf424-original-pull-secret\") pod \"global-pull-secret-syncer-mg976\" (UID: \"79560f0b-3eeb-4ad8-9eca-ac25ba5bf424\") " pod="kube-system/global-pull-secret-syncer-mg976"
Apr 21 10:03:59.044678 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:59.044588 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 10:03:59.044678 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:03:59.044663 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79560f0b-3eeb-4ad8-9eca-ac25ba5bf424-original-pull-secret podName:79560f0b-3eeb-4ad8-9eca-ac25ba5bf424 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:07.044640032 +0000 UTC m=+24.889835531 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/79560f0b-3eeb-4ad8-9eca-ac25ba5bf424-original-pull-secret") pod "global-pull-secret-syncer-mg976" (UID: "79560f0b-3eeb-4ad8-9eca-ac25ba5bf424") : object "kube-system"/"original-pull-secret" not registered
Apr 21 10:04:00.456038 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:00.456000 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dncx\" (UniqueName: \"kubernetes.io/projected/5f39fe1c-70f8-4445-8cfd-646cb496d498-kube-api-access-6dncx\") pod \"network-check-target-jzrcz\" (UID: \"5f39fe1c-70f8-4445-8cfd-646cb496d498\") " pod="openshift-network-diagnostics/network-check-target-jzrcz"
Apr 21 10:04:00.456509 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:00.456078 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs\") pod \"network-metrics-daemon-nwsw4\" (UID: \"dbb00fc1-1258-4254-a360-3c350554925b\") " pod="openshift-multus/network-metrics-daemon-nwsw4"
Apr 21 10:04:00.456509 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:00.456202 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 10:04:00.456509 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:00.456206 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 10:04:00.456509 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:00.456232 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 10:04:00.456509 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:00.456246 2567 projected.go:194] Error preparing data for projected volume kube-api-access-6dncx for pod openshift-network-diagnostics/network-check-target-jzrcz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 10:04:00.456509 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:00.456268 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs podName:dbb00fc1-1258-4254-a360-3c350554925b nodeName:}" failed. No retries permitted until 2026-04-21 10:04:16.456248165 +0000 UTC m=+34.301443660 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs") pod "network-metrics-daemon-nwsw4" (UID: "dbb00fc1-1258-4254-a360-3c350554925b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 10:04:00.456509 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:00.456296 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f39fe1c-70f8-4445-8cfd-646cb496d498-kube-api-access-6dncx podName:5f39fe1c-70f8-4445-8cfd-646cb496d498 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:16.456279768 +0000 UTC m=+34.301475260 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-6dncx" (UniqueName: "kubernetes.io/projected/5f39fe1c-70f8-4445-8cfd-646cb496d498-kube-api-access-6dncx") pod "network-check-target-jzrcz" (UID: "5f39fe1c-70f8-4445-8cfd-646cb496d498") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 10:04:00.710723 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:00.710654 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mg976"
Apr 21 10:04:00.710863 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:00.710657 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwsw4"
Apr 21 10:04:00.710863 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:00.710792 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzrcz"
Apr 21 10:04:00.710863 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:00.710794 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mg976" podUID="79560f0b-3eeb-4ad8-9eca-ac25ba5bf424"
Apr 21 10:04:00.710983 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:00.710905 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nwsw4" podUID="dbb00fc1-1258-4254-a360-3c350554925b"
Apr 21 10:04:00.711032 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:00.710986 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jzrcz" podUID="5f39fe1c-70f8-4445-8cfd-646cb496d498"
Apr 21 10:04:02.710986 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:02.710758 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwsw4"
Apr 21 10:04:02.711666 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:02.710828 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzrcz"
Apr 21 10:04:02.711666 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:02.710850 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mg976"
Apr 21 10:04:02.711666 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:02.711145 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jzrcz" podUID="5f39fe1c-70f8-4445-8cfd-646cb496d498"
Apr 21 10:04:02.711666 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:02.711107 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nwsw4" podUID="dbb00fc1-1258-4254-a360-3c350554925b"
Apr 21 10:04:02.711666 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:02.711254 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mg976" podUID="79560f0b-3eeb-4ad8-9eca-ac25ba5bf424"
Apr 21 10:04:02.841875 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:02.841763 2567 generic.go:358] "Generic (PLEG): container finished" podID="1ba26a6f-298a-4d59-9fa5-4f65cc1729c9" containerID="e0292df8e1cf6d111d142a359d4810461de0ec62d58861379b158b6b31c05900" exitCode=0
Apr 21 10:04:02.841985 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:02.841845 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rnvsp" event={"ID":"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9","Type":"ContainerDied","Data":"e0292df8e1cf6d111d142a359d4810461de0ec62d58861379b158b6b31c05900"}
Apr 21 10:04:02.843207 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:02.843164 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-hk9xb" event={"ID":"a892f931-ea13-4698-b14e-4a1b739f586c","Type":"ContainerStarted","Data":"2454a42e490b9bd44d4c0ae99337f319ae82159c85fc8ee37fea5f6f3cf3b0e8"}
Apr 21 10:04:02.844483 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:02.844447 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wgkxz" event={"ID":"f5719a9a-0eff-48f5-b634-e4d0a7216828","Type":"ContainerStarted","Data":"543ba3c7b8549310f4bdd1f7ab0e563a82c9dd3578e9cf99015c2fbb69569410"}
Apr 21 10:04:02.845793 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:02.845771 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w672n" event={"ID":"082858cf-9465-402e-85f9-8bab81da87b1","Type":"ContainerStarted","Data":"524ff5953632099ef5cd381d90052e695efccc1667b2051a97fe5e052dbd259a"}
Apr 21 10:04:02.846905 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:02.846883 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s97qn" event={"ID":"22eb8c40-ac2f-42ed-897c-2c7e11b8588c","Type":"ContainerStarted","Data":"eb8e4d947f621a8df3ea2203b76ff73f5cd96bf66e43eeb813f04ba538eb4685"}
Apr 21 10:04:02.848116 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:02.848099 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bnl68" event={"ID":"f531049d-f18b-4c01-9df7-a6c394430f98","Type":"ContainerStarted","Data":"bcca8699c0275f4851ad4d6cf96dfcb143aee69578648a69262bc6b9b0fd1336"}
Apr 21 10:04:02.849433 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:02.849413 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jj5fc" event={"ID":"7db2250c-0db7-4d5b-8890-c0dcb9a1171d","Type":"ContainerStarted","Data":"2e4370dc14ea354f402c6ee617864cdd861ab82340016e41636f664d8e01962c"}
Apr 21 10:04:02.851073 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:02.851055 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tqctk_2bdbce72-3342-4c5d-9e2f-6757d506d268/ovn-acl-logging/0.log"
Apr 21 10:04:02.851345 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:02.851331 2567 generic.go:358] "Generic (PLEG): container finished" podID="2bdbce72-3342-4c5d-9e2f-6757d506d268" containerID="7364692ad3d3f03dd52ca5c87ba7d5d1faa1cde588c6d2c18fb8d4643b825cce" exitCode=1
Apr 21 10:04:02.851392 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:02.851354 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" event={"ID":"2bdbce72-3342-4c5d-9e2f-6757d506d268","Type":"ContainerStarted","Data":"c49480d766f3ec84d9e5f9c1acf8273f1c1c32bc7883697c3bd84f8eaa71d03c"}
Apr 21 10:04:02.851392 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:02.851368 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" event={"ID":"2bdbce72-3342-4c5d-9e2f-6757d506d268","Type":"ContainerStarted","Data":"dd358866ec3bcb78013f51ea225e6ddd7371c4a80bcd14b98f6421a3c823c9b7"}
Apr 21 10:04:02.851392 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:02.851377 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" event={"ID":"2bdbce72-3342-4c5d-9e2f-6757d506d268","Type":"ContainerDied","Data":"7364692ad3d3f03dd52ca5c87ba7d5d1faa1cde588c6d2c18fb8d4643b825cce"}
Apr 21 10:04:02.851392 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:02.851385 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" event={"ID":"2bdbce72-3342-4c5d-9e2f-6757d506d268","Type":"ContainerStarted","Data":"bba185d347d97eb01516a2b8acec47a014db1ce9bef59d9d28b6348f0e5136d6"}
Apr 21 10:04:02.871566 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:02.871523 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-wgkxz" podStartSLOduration=3.9618331060000003 podStartE2EDuration="20.871507985s" podCreationTimestamp="2026-04-21 10:03:42 +0000 UTC" firstStartedPulling="2026-04-21 10:03:45.191086529 +0000 UTC m=+3.036282028" lastFinishedPulling="2026-04-21 10:04:02.100761406 +0000 UTC m=+19.945956907" observedRunningTime="2026-04-21 10:04:02.870564964 +0000 UTC m=+20.715760471" watchObservedRunningTime="2026-04-21 10:04:02.871507985 +0000 UTC m=+20.716703495"
Apr 21 10:04:02.885198 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:02.885140 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-s97qn" podStartSLOduration=3.996428969 podStartE2EDuration="20.885126273s" podCreationTimestamp="2026-04-21 10:03:42 +0000 UTC" firstStartedPulling="2026-04-21 10:03:45.178693884 +0000 UTC m=+3.023889378" lastFinishedPulling="2026-04-21 10:04:02.067391189 +0000 UTC m=+19.912586682" observedRunningTime="2026-04-21 10:04:02.884787443 +0000 UTC m=+20.729982950" watchObservedRunningTime="2026-04-21 10:04:02.885126273 +0000 UTC m=+20.730321781"
Apr 21 10:04:02.900704 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:02.900673 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-jj5fc" podStartSLOduration=3.987972978 podStartE2EDuration="20.90064991s" podCreationTimestamp="2026-04-21 10:03:42 +0000 UTC" firstStartedPulling="2026-04-21 10:03:45.190185123 +0000 UTC m=+3.035380618" lastFinishedPulling="2026-04-21 10:04:02.102862054 +0000 UTC m=+19.948057550" observedRunningTime="2026-04-21 10:04:02.900359665 +0000 UTC m=+20.745555179" watchObservedRunningTime="2026-04-21 10:04:02.90064991 +0000 UTC m=+20.745845416"
Apr 21 10:04:02.915944 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:02.915906 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-hk9xb" podStartSLOduration=12.115562383 podStartE2EDuration="20.915893063s" podCreationTimestamp="2026-04-21 10:03:42 +0000 UTC" firstStartedPulling="2026-04-21 10:03:45.191521266 +0000 UTC m=+3.036716757" lastFinishedPulling="2026-04-21 10:03:53.991851937 +0000 UTC m=+11.837047437" observedRunningTime="2026-04-21 10:04:02.915692397 +0000 UTC m=+20.760887905" watchObservedRunningTime="2026-04-21 10:04:02.915893063 +0000 UTC m=+20.761088570"
Apr 21 10:04:02.937770 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:02.937732 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bnl68" podStartSLOduration=3.017747492 podStartE2EDuration="19.937719089s" podCreationTimestamp="2026-04-21 10:03:43 +0000 UTC" firstStartedPulling="2026-04-21 10:03:45.184973202 +0000 UTC m=+3.030168698" lastFinishedPulling="2026-04-21 10:04:02.104944802 +0000 UTC m=+19.950140295" observedRunningTime="2026-04-21 10:04:02.936660864 +0000 UTC m=+20.781856361" watchObservedRunningTime="2026-04-21 10:04:02.937719089 +0000 UTC m=+20.782914613"
Apr 21 10:04:03.815675 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:03.815646 2567 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 21 10:04:03.854302 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:03.854272 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jzz4r" event={"ID":"76f0dead-e62b-424c-a40e-6f82d52ba722","Type":"ContainerStarted","Data":"468ca6aeef848bc76d3498d160e7fd45d3b185f9b48e59f26cf038c842c3ef2c"}
Apr 21 10:04:03.856218 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:03.856191 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w672n" event={"ID":"082858cf-9465-402e-85f9-8bab81da87b1","Type":"ContainerStarted","Data":"b764050ea9f02c54d03b4fea113479b433558e18446227ae7927f3fadad55300"}
Apr 21 10:04:03.858781 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:03.858764 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tqctk_2bdbce72-3342-4c5d-9e2f-6757d506d268/ovn-acl-logging/0.log"
Apr 21 10:04:03.859140 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:03.859116 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" event={"ID":"2bdbce72-3342-4c5d-9e2f-6757d506d268","Type":"ContainerStarted","Data":"4c2dd8856c4bfee9c4097f8865484e10e6b6d03d438c35515fb8a9564b207f3a"}
Apr 21 10:04:03.859229 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:03.859144 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" event={"ID":"2bdbce72-3342-4c5d-9e2f-6757d506d268","Type":"ContainerStarted","Data":"c6116f8096e231f106e4dcf18e1a6b282b9149ba9e526fb6633fe4dbe9f09bcd"}
Apr 21 10:04:03.868113 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:03.868037 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-jzz4r" podStartSLOduration=3.946746323 podStartE2EDuration="20.868022558s" podCreationTimestamp="2026-04-21 10:03:43 +0000 UTC" firstStartedPulling="2026-04-21 10:03:45.179571633 +0000 UTC m=+3.024767132" lastFinishedPulling="2026-04-21 10:04:02.100847872 +0000 UTC m=+19.946043367" observedRunningTime="2026-04-21 10:04:03.867780593 +0000 UTC m=+21.712976103" watchObservedRunningTime="2026-04-21 10:04:03.868022558 +0000 UTC m=+21.713218065"
Apr 21 10:04:04.686736 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:04.686590 2567 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T10:04:03.815671274Z","UUID":"cb7ed229-60ed-47cb-8333-61cb66038890","Handler":null,"Name":"","Endpoint":""}
Apr 21 10:04:04.689073 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:04.688870 2567 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 21 10:04:04.689208 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:04.689082 2567 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 21 10:04:04.710371 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:04.710339 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mg976"
Apr 21 10:04:04.710480 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:04.710461 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwsw4"
Apr 21 10:04:04.710480 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:04.710478 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzrcz"
Apr 21 10:04:04.710573 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:04.710455 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mg976" podUID="79560f0b-3eeb-4ad8-9eca-ac25ba5bf424"
Apr 21 10:04:04.710605 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:04.710564 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nwsw4" podUID="dbb00fc1-1258-4254-a360-3c350554925b"
Apr 21 10:04:04.710648 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:04.710630 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jzrcz" podUID="5f39fe1c-70f8-4445-8cfd-646cb496d498"
Apr 21 10:04:05.036156 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:05.036092 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-hk9xb"
Apr 21 10:04:05.036729 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:05.036708 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-hk9xb"
Apr 21 10:04:05.867440 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:05.867403 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w672n" event={"ID":"082858cf-9465-402e-85f9-8bab81da87b1","Type":"ContainerStarted","Data":"e4c49655bb6388e14c67c104fcaf7dfbfd5d492c757bd0d3f1a7871eed634b74"}
Apr 21 10:04:05.870523 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:05.870498 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tqctk_2bdbce72-3342-4c5d-9e2f-6757d506d268/ovn-acl-logging/0.log"
Apr 21 10:04:05.870950 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:05.870928 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" event={"ID":"2bdbce72-3342-4c5d-9e2f-6757d506d268","Type":"ContainerStarted","Data":"30fa943905be3643be68dafbdabb61d19a48a15649b3b53782728465bb525b9d"}
Apr 21 10:04:05.871362 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:05.871249 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-hk9xb"
Apr 21 10:04:05.871720 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:05.871700 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-hk9xb"
Apr 21 10:04:05.885500 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:05.885454 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w672n" podStartSLOduration=4.308685996 podStartE2EDuration="23.885439535s" podCreationTimestamp="2026-04-21 10:03:42 +0000 UTC" firstStartedPulling="2026-04-21 10:03:45.184644875 +0000 UTC m=+3.029840364" lastFinishedPulling="2026-04-21 10:04:04.761398405 +0000 UTC m=+22.606593903" observedRunningTime="2026-04-21 10:04:05.885377085 +0000 UTC m=+23.730572592" watchObservedRunningTime="2026-04-21 10:04:05.885439535 +0000 UTC m=+23.730635044"
Apr 21 10:04:06.709682 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:06.709653 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwsw4"
Apr 21 10:04:06.710456 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:06.709653 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mg976"
Apr 21 10:04:06.710456 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:06.709768 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nwsw4" podUID="dbb00fc1-1258-4254-a360-3c350554925b"
Apr 21 10:04:06.710456 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:06.709852 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mg976" podUID="79560f0b-3eeb-4ad8-9eca-ac25ba5bf424"
Apr 21 10:04:06.710456 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:06.709887 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzrcz"
Apr 21 10:04:06.710456 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:06.709982 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jzrcz" podUID="5f39fe1c-70f8-4445-8cfd-646cb496d498"
Apr 21 10:04:07.111268 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:07.111203 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/79560f0b-3eeb-4ad8-9eca-ac25ba5bf424-original-pull-secret\") pod \"global-pull-secret-syncer-mg976\" (UID: \"79560f0b-3eeb-4ad8-9eca-ac25ba5bf424\") " pod="kube-system/global-pull-secret-syncer-mg976"
Apr 21 10:04:07.111395 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:07.111312 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 10:04:07.111395 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:07.111357 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79560f0b-3eeb-4ad8-9eca-ac25ba5bf424-original-pull-secret podName:79560f0b-3eeb-4ad8-9eca-ac25ba5bf424 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:23.11134383 +0000 UTC m=+40.956539314 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/79560f0b-3eeb-4ad8-9eca-ac25ba5bf424-original-pull-secret") pod "global-pull-secret-syncer-mg976" (UID: "79560f0b-3eeb-4ad8-9eca-ac25ba5bf424") : object "kube-system"/"original-pull-secret" not registered
Apr 21 10:04:07.876465 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:07.876291 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tqctk_2bdbce72-3342-4c5d-9e2f-6757d506d268/ovn-acl-logging/0.log"
Apr 21 10:04:07.876974 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:07.876778 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" event={"ID":"2bdbce72-3342-4c5d-9e2f-6757d506d268","Type":"ContainerStarted","Data":"9a6446a84b5c2e2139de6ed105c8b49ba1becfdcb44d1ff3f51fa01c0107d5a2"}
Apr 21 10:04:07.877105 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:07.877085 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:04:07.877352 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:07.877332 2567 scope.go:117] "RemoveContainer" containerID="7364692ad3d3f03dd52ca5c87ba7d5d1faa1cde588c6d2c18fb8d4643b825cce"
Apr 21 10:04:07.878835 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:07.878812 2567 generic.go:358] "Generic (PLEG): container finished" podID="1ba26a6f-298a-4d59-9fa5-4f65cc1729c9" containerID="ffb66658dccfe8bc719e12fb3a4506569e390ae73215aa9db2fe9c82378c0557" exitCode=0
Apr 21 10:04:07.878938 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:07.878899 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rnvsp" event={"ID":"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9","Type":"ContainerDied","Data":"ffb66658dccfe8bc719e12fb3a4506569e390ae73215aa9db2fe9c82378c0557"}
Apr 21 10:04:07.892025 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:07.892012 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:04:08.709964 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:08.709928 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzrcz"
Apr 21 10:04:08.710156 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:08.710046 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jzrcz" podUID="5f39fe1c-70f8-4445-8cfd-646cb496d498"
Apr 21 10:04:08.710156 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:08.709928 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwsw4"
Apr 21 10:04:08.710156 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:08.709928 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mg976"
Apr 21 10:04:08.710336 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:08.710218 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nwsw4" podUID="dbb00fc1-1258-4254-a360-3c350554925b"
Apr 21 10:04:08.710336 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:08.710239 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mg976" podUID="79560f0b-3eeb-4ad8-9eca-ac25ba5bf424"
Apr 21 10:04:08.884251 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:08.884225 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tqctk_2bdbce72-3342-4c5d-9e2f-6757d506d268/ovn-acl-logging/0.log"
Apr 21 10:04:08.884598 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:08.884576 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" event={"ID":"2bdbce72-3342-4c5d-9e2f-6757d506d268","Type":"ContainerStarted","Data":"5287dc9782c22573c630590b4f51d17c465aa63718956f6e86a3805a832f92fb"}
Apr 21 10:04:08.884976 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:08.884964 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:04:08.885055 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:08.885045 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:04:08.897989 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:08.897967 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
Apr 21 10:04:08.912707 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:08.912668 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tqctk"
podStartSLOduration=8.944536185 podStartE2EDuration="25.912656688s" podCreationTimestamp="2026-04-21 10:03:43 +0000 UTC" firstStartedPulling="2026-04-21 10:03:45.190967855 +0000 UTC m=+3.036163345" lastFinishedPulling="2026-04-21 10:04:02.159088362 +0000 UTC m=+20.004283848" observedRunningTime="2026-04-21 10:04:08.911114822 +0000 UTC m=+26.756310341" watchObservedRunningTime="2026-04-21 10:04:08.912656688 +0000 UTC m=+26.757852194" Apr 21 10:04:09.304513 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:09.304447 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jzrcz"] Apr 21 10:04:09.304615 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:09.304587 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzrcz" Apr 21 10:04:09.304732 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:09.304709 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jzrcz" podUID="5f39fe1c-70f8-4445-8cfd-646cb496d498" Apr 21 10:04:09.307255 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:09.307231 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-mg976"] Apr 21 10:04:09.307344 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:09.307332 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-mg976" Apr 21 10:04:09.307432 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:09.307412 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mg976" podUID="79560f0b-3eeb-4ad8-9eca-ac25ba5bf424" Apr 21 10:04:09.307983 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:09.307964 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nwsw4"] Apr 21 10:04:09.308067 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:09.308055 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwsw4" Apr 21 10:04:09.308180 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:09.308143 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nwsw4" podUID="dbb00fc1-1258-4254-a360-3c350554925b" Apr 21 10:04:09.888226 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:09.888191 2567 generic.go:358] "Generic (PLEG): container finished" podID="1ba26a6f-298a-4d59-9fa5-4f65cc1729c9" containerID="a222a72cc7d4686572b9f0d443e65c3ac1ca6de2a4ec2c213232ac139ee67618" exitCode=0 Apr 21 10:04:09.888669 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:09.888277 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rnvsp" event={"ID":"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9","Type":"ContainerDied","Data":"a222a72cc7d4686572b9f0d443e65c3ac1ca6de2a4ec2c213232ac139ee67618"} Apr 21 10:04:10.709705 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:10.709674 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mg976" Apr 21 10:04:10.709854 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:10.709674 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwsw4" Apr 21 10:04:10.709854 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:10.709779 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mg976" podUID="79560f0b-3eeb-4ad8-9eca-ac25ba5bf424" Apr 21 10:04:10.709922 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:10.709846 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nwsw4" podUID="dbb00fc1-1258-4254-a360-3c350554925b" Apr 21 10:04:11.710520 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:11.710498 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzrcz" Apr 21 10:04:11.710833 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:11.710585 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jzrcz" podUID="5f39fe1c-70f8-4445-8cfd-646cb496d498" Apr 21 10:04:11.893586 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:11.893545 2567 generic.go:358] "Generic (PLEG): container finished" podID="1ba26a6f-298a-4d59-9fa5-4f65cc1729c9" containerID="f583c6afe7ef91e89a33248e20b8f3d0991b85ca69013c7b88f55c0a6696c428" exitCode=0 Apr 21 10:04:11.893739 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:11.893625 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rnvsp" event={"ID":"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9","Type":"ContainerDied","Data":"f583c6afe7ef91e89a33248e20b8f3d0991b85ca69013c7b88f55c0a6696c428"} Apr 21 10:04:12.710710 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:12.710541 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mg976" Apr 21 10:04:12.711249 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:12.710595 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwsw4" Apr 21 10:04:12.711249 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:12.710793 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mg976" podUID="79560f0b-3eeb-4ad8-9eca-ac25ba5bf424" Apr 21 10:04:12.711249 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:12.710885 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nwsw4" podUID="dbb00fc1-1258-4254-a360-3c350554925b" Apr 21 10:04:13.710714 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:13.710689 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzrcz" Apr 21 10:04:13.711132 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:13.710786 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jzrcz" podUID="5f39fe1c-70f8-4445-8cfd-646cb496d498" Apr 21 10:04:14.710450 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:14.710418 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwsw4" Apr 21 10:04:14.710604 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:14.710418 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mg976" Apr 21 10:04:14.710604 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:14.710537 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nwsw4" podUID="dbb00fc1-1258-4254-a360-3c350554925b" Apr 21 10:04:14.710715 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:14.710630 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mg976" podUID="79560f0b-3eeb-4ad8-9eca-ac25ba5bf424" Apr 21 10:04:15.710209 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:15.710177 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzrcz" Apr 21 10:04:15.710808 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:15.710292 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jzrcz" podUID="5f39fe1c-70f8-4445-8cfd-646cb496d498" Apr 21 10:04:15.950110 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:15.950085 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-231.ec2.internal" event="NodeReady" Apr 21 10:04:15.950267 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:15.950223 2567 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 10:04:15.997939 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:15.997865 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-8477bf8587-zfnhn"] Apr 21 10:04:16.028553 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.027407 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-sznwb"] Apr 21 10:04:16.028553 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.027822 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8477bf8587-zfnhn" Apr 21 10:04:16.032423 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.032400 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 10:04:16.032423 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.032419 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 10:04:16.032944 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.032927 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nljzt\"" Apr 21 10:04:16.033324 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.033305 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 10:04:16.041681 ip-10-0-140-231 
kubenswrapper[2567]: I0421 10:04:16.041660 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 10:04:16.044013 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.043990 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8477bf8587-zfnhn"] Apr 21 10:04:16.044117 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.044022 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-v9jx5"] Apr 21 10:04:16.044204 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.044157 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sznwb" Apr 21 10:04:16.046657 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.046638 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 10:04:16.046792 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.046759 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 10:04:16.046792 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.046778 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-g7j6z\"" Apr 21 10:04:16.064482 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.064450 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sznwb"] Apr 21 10:04:16.064584 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.064488 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-v9jx5"] Apr 21 10:04:16.064672 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.064593 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-v9jx5" Apr 21 10:04:16.066777 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.066741 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 10:04:16.066949 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.066933 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 10:04:16.067023 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.066966 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 10:04:16.067077 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.067033 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9st8k\"" Apr 21 10:04:16.181936 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.181897 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a413cb28-d70b-44b6-a527-03a5247fa66a-config-volume\") pod \"dns-default-sznwb\" (UID: \"a413cb28-d70b-44b6-a527-03a5247fa66a\") " pod="openshift-dns/dns-default-sznwb" Apr 21 10:04:16.182095 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.181939 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a413cb28-d70b-44b6-a527-03a5247fa66a-tmp-dir\") pod \"dns-default-sznwb\" (UID: \"a413cb28-d70b-44b6-a527-03a5247fa66a\") " pod="openshift-dns/dns-default-sznwb" Apr 21 10:04:16.182095 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.181973 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c2s4\" (UniqueName: 
\"kubernetes.io/projected/a413cb28-d70b-44b6-a527-03a5247fa66a-kube-api-access-9c2s4\") pod \"dns-default-sznwb\" (UID: \"a413cb28-d70b-44b6-a527-03a5247fa66a\") " pod="openshift-dns/dns-default-sznwb" Apr 21 10:04:16.182095 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.182001 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-trusted-ca\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn" Apr 21 10:04:16.182095 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.182030 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trw66\" (UniqueName: \"kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-kube-api-access-trw66\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn" Apr 21 10:04:16.182095 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.182054 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert\") pod \"ingress-canary-v9jx5\" (UID: \"2857b675-4470-427f-a3d7-94390418dee9\") " pod="openshift-ingress-canary/ingress-canary-v9jx5" Apr 21 10:04:16.182095 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.182082 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls\") pod \"dns-default-sznwb\" (UID: \"a413cb28-d70b-44b6-a527-03a5247fa66a\") " pod="openshift-dns/dns-default-sznwb" Apr 21 10:04:16.182095 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.182097 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-certificates\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn" Apr 21 10:04:16.182446 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.182113 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-bound-sa-token\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn" Apr 21 10:04:16.182446 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.182137 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-image-registry-private-configuration\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn" Apr 21 10:04:16.182446 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.182164 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-ca-trust-extracted\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn" Apr 21 10:04:16.182446 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.182245 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6hvq\" (UniqueName: 
\"kubernetes.io/projected/2857b675-4470-427f-a3d7-94390418dee9-kube-api-access-n6hvq\") pod \"ingress-canary-v9jx5\" (UID: \"2857b675-4470-427f-a3d7-94390418dee9\") " pod="openshift-ingress-canary/ingress-canary-v9jx5" Apr 21 10:04:16.182446 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.182312 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-tls\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn" Apr 21 10:04:16.182446 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.182349 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-installation-pull-secrets\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn" Apr 21 10:04:16.283244 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.283162 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a413cb28-d70b-44b6-a527-03a5247fa66a-config-volume\") pod \"dns-default-sznwb\" (UID: \"a413cb28-d70b-44b6-a527-03a5247fa66a\") " pod="openshift-dns/dns-default-sznwb" Apr 21 10:04:16.283244 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.283223 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a413cb28-d70b-44b6-a527-03a5247fa66a-tmp-dir\") pod \"dns-default-sznwb\" (UID: \"a413cb28-d70b-44b6-a527-03a5247fa66a\") " pod="openshift-dns/dns-default-sznwb" Apr 21 10:04:16.283481 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.283252 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9c2s4\" (UniqueName: \"kubernetes.io/projected/a413cb28-d70b-44b6-a527-03a5247fa66a-kube-api-access-9c2s4\") pod \"dns-default-sznwb\" (UID: \"a413cb28-d70b-44b6-a527-03a5247fa66a\") " pod="openshift-dns/dns-default-sznwb" Apr 21 10:04:16.283481 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.283278 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-trusted-ca\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn" Apr 21 10:04:16.283481 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.283304 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trw66\" (UniqueName: \"kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-kube-api-access-trw66\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn" Apr 21 10:04:16.283481 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.283330 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert\") pod \"ingress-canary-v9jx5\" (UID: \"2857b675-4470-427f-a3d7-94390418dee9\") " pod="openshift-ingress-canary/ingress-canary-v9jx5" Apr 21 10:04:16.283481 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.283368 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls\") pod \"dns-default-sznwb\" (UID: \"a413cb28-d70b-44b6-a527-03a5247fa66a\") " pod="openshift-dns/dns-default-sznwb" Apr 21 10:04:16.283481 ip-10-0-140-231 
kubenswrapper[2567]: I0421 10:04:16.283447 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-certificates\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn" Apr 21 10:04:16.283773 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.283500 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-bound-sa-token\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn" Apr 21 10:04:16.283773 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.283533 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-image-registry-private-configuration\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn" Apr 21 10:04:16.283773 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.283560 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-ca-trust-extracted\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn" Apr 21 10:04:16.283773 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.283591 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n6hvq\" (UniqueName: 
\"kubernetes.io/projected/2857b675-4470-427f-a3d7-94390418dee9-kube-api-access-n6hvq\") pod \"ingress-canary-v9jx5\" (UID: \"2857b675-4470-427f-a3d7-94390418dee9\") " pod="openshift-ingress-canary/ingress-canary-v9jx5"
Apr 21 10:04:16.283773 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.283619 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-tls\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn"
Apr 21 10:04:16.283773 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.283639 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-installation-pull-secrets\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn"
Apr 21 10:04:16.283773 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.283756 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a413cb28-d70b-44b6-a527-03a5247fa66a-config-volume\") pod \"dns-default-sznwb\" (UID: \"a413cb28-d70b-44b6-a527-03a5247fa66a\") " pod="openshift-dns/dns-default-sznwb"
Apr 21 10:04:16.284100 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.283986 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a413cb28-d70b-44b6-a527-03a5247fa66a-tmp-dir\") pod \"dns-default-sznwb\" (UID: \"a413cb28-d70b-44b6-a527-03a5247fa66a\") " pod="openshift-dns/dns-default-sznwb"
Apr 21 10:04:16.284100 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:16.283458 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 10:04:16.284227 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:16.284109 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 10:04:16.284227 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:16.284123 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8477bf8587-zfnhn: secret "image-registry-tls" not found
Apr 21 10:04:16.284227 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:16.283613 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 10:04:16.284227 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:16.284184 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls podName:a413cb28-d70b-44b6-a527-03a5247fa66a nodeName:}" failed. No retries permitted until 2026-04-21 10:04:16.784148696 +0000 UTC m=+34.629344198 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls") pod "dns-default-sznwb" (UID: "a413cb28-d70b-44b6-a527-03a5247fa66a") : secret "dns-default-metrics-tls" not found
Apr 21 10:04:16.284227 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:16.284205 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-tls podName:dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:16.784194338 +0000 UTC m=+34.629389824 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-tls") pod "image-registry-8477bf8587-zfnhn" (UID: "dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55") : secret "image-registry-tls" not found
Apr 21 10:04:16.284227 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:16.284221 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert podName:2857b675-4470-427f-a3d7-94390418dee9 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:16.784211738 +0000 UTC m=+34.629407238 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert") pod "ingress-canary-v9jx5" (UID: "2857b675-4470-427f-a3d7-94390418dee9") : secret "canary-serving-cert" not found
Apr 21 10:04:16.284543 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.284450 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-ca-trust-extracted\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn"
Apr 21 10:04:16.284937 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.284893 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-certificates\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn"
Apr 21 10:04:16.285227 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.285205 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-trusted-ca\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn"
Apr 21 10:04:16.288364 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.288343 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-installation-pull-secrets\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn"
Apr 21 10:04:16.288453 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.288380 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-image-registry-private-configuration\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn"
Apr 21 10:04:16.293100 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.292980 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trw66\" (UniqueName: \"kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-kube-api-access-trw66\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn"
Apr 21 10:04:16.295479 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.295454 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-bound-sa-token\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn"
Apr 21 10:04:16.295616 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.295598 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c2s4\" (UniqueName: \"kubernetes.io/projected/a413cb28-d70b-44b6-a527-03a5247fa66a-kube-api-access-9c2s4\") pod \"dns-default-sznwb\" (UID: \"a413cb28-d70b-44b6-a527-03a5247fa66a\") " pod="openshift-dns/dns-default-sznwb"
Apr 21 10:04:16.296203 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.296156 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6hvq\" (UniqueName: \"kubernetes.io/projected/2857b675-4470-427f-a3d7-94390418dee9-kube-api-access-n6hvq\") pod \"ingress-canary-v9jx5\" (UID: \"2857b675-4470-427f-a3d7-94390418dee9\") " pod="openshift-ingress-canary/ingress-canary-v9jx5"
Apr 21 10:04:16.486234 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.486201 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs\") pod \"network-metrics-daemon-nwsw4\" (UID: \"dbb00fc1-1258-4254-a360-3c350554925b\") " pod="openshift-multus/network-metrics-daemon-nwsw4"
Apr 21 10:04:16.486398 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.486277 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dncx\" (UniqueName: \"kubernetes.io/projected/5f39fe1c-70f8-4445-8cfd-646cb496d498-kube-api-access-6dncx\") pod \"network-check-target-jzrcz\" (UID: \"5f39fe1c-70f8-4445-8cfd-646cb496d498\") " pod="openshift-network-diagnostics/network-check-target-jzrcz"
Apr 21 10:04:16.486398 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:16.486356 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 10:04:16.486398 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:16.486381 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 10:04:16.486398 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:16.486395 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 10:04:16.486605 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:16.486404 2567 projected.go:194] Error preparing data for projected volume kube-api-access-6dncx for pod openshift-network-diagnostics/network-check-target-jzrcz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 10:04:16.486605 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:16.486433 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs podName:dbb00fc1-1258-4254-a360-3c350554925b nodeName:}" failed. No retries permitted until 2026-04-21 10:04:48.486417487 +0000 UTC m=+66.331612973 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs") pod "network-metrics-daemon-nwsw4" (UID: "dbb00fc1-1258-4254-a360-3c350554925b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 10:04:16.486605 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:16.486449 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f39fe1c-70f8-4445-8cfd-646cb496d498-kube-api-access-6dncx podName:5f39fe1c-70f8-4445-8cfd-646cb496d498 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:48.4864413 +0000 UTC m=+66.331636783 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-6dncx" (UniqueName: "kubernetes.io/projected/5f39fe1c-70f8-4445-8cfd-646cb496d498-kube-api-access-6dncx") pod "network-check-target-jzrcz" (UID: "5f39fe1c-70f8-4445-8cfd-646cb496d498") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 10:04:16.710443 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.710413 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwsw4"
Apr 21 10:04:16.711162 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.710591 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mg976"
Apr 21 10:04:16.712991 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.712970 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 10:04:16.713126 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.712976 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 21 10:04:16.713126 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.712978 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7zbjr\""
Apr 21 10:04:16.788549 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.788522 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert\") pod \"ingress-canary-v9jx5\" (UID: \"2857b675-4470-427f-a3d7-94390418dee9\") " pod="openshift-ingress-canary/ingress-canary-v9jx5"
Apr 21 10:04:16.788684 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.788563 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls\") pod \"dns-default-sznwb\" (UID: \"a413cb28-d70b-44b6-a527-03a5247fa66a\") " pod="openshift-dns/dns-default-sznwb"
Apr 21 10:04:16.788684 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:16.788596 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-tls\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn"
Apr 21 10:04:16.788803 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:16.788687 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 10:04:16.788803 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:16.788748 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls podName:a413cb28-d70b-44b6-a527-03a5247fa66a nodeName:}" failed. No retries permitted until 2026-04-21 10:04:17.788729247 +0000 UTC m=+35.633924734 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls") pod "dns-default-sznwb" (UID: "a413cb28-d70b-44b6-a527-03a5247fa66a") : secret "dns-default-metrics-tls" not found
Apr 21 10:04:16.788803 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:16.788761 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 10:04:16.788803 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:16.788782 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8477bf8587-zfnhn: secret "image-registry-tls" not found
Apr 21 10:04:16.789005 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:16.788832 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-tls podName:dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:17.78881774 +0000 UTC m=+35.634013225 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-tls") pod "image-registry-8477bf8587-zfnhn" (UID: "dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55") : secret "image-registry-tls" not found
Apr 21 10:04:16.789005 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:16.788690 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 10:04:16.789005 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:16.788899 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert podName:2857b675-4470-427f-a3d7-94390418dee9 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:17.788878795 +0000 UTC m=+35.634074281 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert") pod "ingress-canary-v9jx5" (UID: "2857b675-4470-427f-a3d7-94390418dee9") : secret "canary-serving-cert" not found
Apr 21 10:04:17.710475 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:17.710440 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzrcz"
Apr 21 10:04:17.713122 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:17.713099 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 10:04:17.714049 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:17.714033 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-bjzzq\""
Apr 21 10:04:17.714136 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:17.714066 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 10:04:17.794888 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:17.794859 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert\") pod \"ingress-canary-v9jx5\" (UID: \"2857b675-4470-427f-a3d7-94390418dee9\") " pod="openshift-ingress-canary/ingress-canary-v9jx5"
Apr 21 10:04:17.794999 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:17.794911 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls\") pod \"dns-default-sznwb\" (UID: \"a413cb28-d70b-44b6-a527-03a5247fa66a\") " pod="openshift-dns/dns-default-sznwb"
Apr 21 10:04:17.794999 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:17.794955 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-tls\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn"
Apr 21 10:04:17.795110 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:17.795009 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 10:04:17.795110 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:17.795072 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 10:04:17.795110 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:17.795088 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8477bf8587-zfnhn: secret "image-registry-tls" not found
Apr 21 10:04:17.795280 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:17.795076 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert podName:2857b675-4470-427f-a3d7-94390418dee9 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:19.795056191 +0000 UTC m=+37.640251696 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert") pod "ingress-canary-v9jx5" (UID: "2857b675-4470-427f-a3d7-94390418dee9") : secret "canary-serving-cert" not found
Apr 21 10:04:17.795280 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:17.795134 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-tls podName:dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:19.79512317 +0000 UTC m=+37.640318653 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-tls") pod "image-registry-8477bf8587-zfnhn" (UID: "dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55") : secret "image-registry-tls" not found
Apr 21 10:04:17.795280 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:17.795138 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 10:04:17.795280 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:17.795219 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls podName:a413cb28-d70b-44b6-a527-03a5247fa66a nodeName:}" failed. No retries permitted until 2026-04-21 10:04:19.795202036 +0000 UTC m=+37.640397523 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls") pod "dns-default-sznwb" (UID: "a413cb28-d70b-44b6-a527-03a5247fa66a") : secret "dns-default-metrics-tls" not found
Apr 21 10:04:17.907491 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:17.907460 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rnvsp" event={"ID":"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9","Type":"ContainerStarted","Data":"9a5638605d4dd9af9df3c095f83ef93feea35f4c59f69a86638ba4217c4c195f"}
Apr 21 10:04:18.911795 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:18.911596 2567 generic.go:358] "Generic (PLEG): container finished" podID="1ba26a6f-298a-4d59-9fa5-4f65cc1729c9" containerID="9a5638605d4dd9af9df3c095f83ef93feea35f4c59f69a86638ba4217c4c195f" exitCode=0
Apr 21 10:04:18.912113 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:18.911672 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rnvsp" event={"ID":"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9","Type":"ContainerDied","Data":"9a5638605d4dd9af9df3c095f83ef93feea35f4c59f69a86638ba4217c4c195f"}
Apr 21 10:04:19.808883 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:19.808845 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-tls\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn"
Apr 21 10:04:19.809038 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:19.808919 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert\") pod \"ingress-canary-v9jx5\" (UID: \"2857b675-4470-427f-a3d7-94390418dee9\") " pod="openshift-ingress-canary/ingress-canary-v9jx5"
Apr 21 10:04:19.809038 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:19.808944 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls\") pod \"dns-default-sznwb\" (UID: \"a413cb28-d70b-44b6-a527-03a5247fa66a\") " pod="openshift-dns/dns-default-sznwb"
Apr 21 10:04:19.809038 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:19.808995 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 10:04:19.809038 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:19.809016 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8477bf8587-zfnhn: secret "image-registry-tls" not found
Apr 21 10:04:19.809187 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:19.809044 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 10:04:19.809187 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:19.809052 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 10:04:19.809187 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:19.809068 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-tls podName:dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:23.809053115 +0000 UTC m=+41.654248599 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-tls") pod "image-registry-8477bf8587-zfnhn" (UID: "dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55") : secret "image-registry-tls" not found
Apr 21 10:04:19.809187 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:19.809084 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls podName:a413cb28-d70b-44b6-a527-03a5247fa66a nodeName:}" failed. No retries permitted until 2026-04-21 10:04:23.80907332 +0000 UTC m=+41.654268803 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls") pod "dns-default-sznwb" (UID: "a413cb28-d70b-44b6-a527-03a5247fa66a") : secret "dns-default-metrics-tls" not found
Apr 21 10:04:19.809187 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:19.809097 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert podName:2857b675-4470-427f-a3d7-94390418dee9 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:23.809091303 +0000 UTC m=+41.654286787 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert") pod "ingress-canary-v9jx5" (UID: "2857b675-4470-427f-a3d7-94390418dee9") : secret "canary-serving-cert" not found
Apr 21 10:04:19.917216 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:19.917186 2567 generic.go:358] "Generic (PLEG): container finished" podID="1ba26a6f-298a-4d59-9fa5-4f65cc1729c9" containerID="47822e74c7bcac6f88368a0c68124468ae823c39112778c0495e48d3f89501be" exitCode=0
Apr 21 10:04:19.917564 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:19.917226 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rnvsp" event={"ID":"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9","Type":"ContainerDied","Data":"47822e74c7bcac6f88368a0c68124468ae823c39112778c0495e48d3f89501be"}
Apr 21 10:04:20.922277 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:20.922243 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rnvsp" event={"ID":"1ba26a6f-298a-4d59-9fa5-4f65cc1729c9","Type":"ContainerStarted","Data":"6340bcc6f287f0e1d4f8e3375759fe11e4200d61fb95628124a68ca1f5447f19"}
Apr 21 10:04:20.951692 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:20.951644 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-rnvsp" podStartSLOduration=6.408702936 podStartE2EDuration="38.951630385s" podCreationTimestamp="2026-04-21 10:03:42 +0000 UTC" firstStartedPulling="2026-04-21 10:03:45.191011323 +0000 UTC m=+3.036206810" lastFinishedPulling="2026-04-21 10:04:17.733938776 +0000 UTC m=+35.579134259" observedRunningTime="2026-04-21 10:04:20.950584925 +0000 UTC m=+38.795780454" watchObservedRunningTime="2026-04-21 10:04:20.951630385 +0000 UTC m=+38.796825874"
Apr 21 10:04:23.131953 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:23.131922 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/79560f0b-3eeb-4ad8-9eca-ac25ba5bf424-original-pull-secret\") pod \"global-pull-secret-syncer-mg976\" (UID: \"79560f0b-3eeb-4ad8-9eca-ac25ba5bf424\") " pod="kube-system/global-pull-secret-syncer-mg976"
Apr 21 10:04:23.135508 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:23.135481 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/79560f0b-3eeb-4ad8-9eca-ac25ba5bf424-original-pull-secret\") pod \"global-pull-secret-syncer-mg976\" (UID: \"79560f0b-3eeb-4ad8-9eca-ac25ba5bf424\") " pod="kube-system/global-pull-secret-syncer-mg976"
Apr 21 10:04:23.328370 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:23.328302 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mg976"
Apr 21 10:04:23.448975 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:23.448946 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-mg976"]
Apr 21 10:04:23.452091 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:04:23.452054 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79560f0b_3eeb_4ad8_9eca_ac25ba5bf424.slice/crio-ea14f1292fd1238839eeab1d11f6485aed97a7a5cfb32237cf7643c391ac14cf WatchSource:0}: Error finding container ea14f1292fd1238839eeab1d11f6485aed97a7a5cfb32237cf7643c391ac14cf: Status 404 returned error can't find the container with id ea14f1292fd1238839eeab1d11f6485aed97a7a5cfb32237cf7643c391ac14cf
Apr 21 10:04:23.837107 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:23.837075 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls\") pod \"dns-default-sznwb\" (UID: \"a413cb28-d70b-44b6-a527-03a5247fa66a\") " pod="openshift-dns/dns-default-sznwb"
Apr 21 10:04:23.837323 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:23.837127 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-tls\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn"
Apr 21 10:04:23.837323 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:23.837211 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert\") pod \"ingress-canary-v9jx5\" (UID: \"2857b675-4470-427f-a3d7-94390418dee9\") " pod="openshift-ingress-canary/ingress-canary-v9jx5"
Apr 21 10:04:23.837323 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:23.837228 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 10:04:23.837323 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:23.837250 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 10:04:23.837323 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:23.837263 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8477bf8587-zfnhn: secret "image-registry-tls" not found
Apr 21 10:04:23.837323 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:23.837300 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls podName:a413cb28-d70b-44b6-a527-03a5247fa66a nodeName:}" failed. No retries permitted until 2026-04-21 10:04:31.837278296 +0000 UTC m=+49.682473796 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls") pod "dns-default-sznwb" (UID: "a413cb28-d70b-44b6-a527-03a5247fa66a") : secret "dns-default-metrics-tls" not found
Apr 21 10:04:23.837323 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:23.837319 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-tls podName:dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:31.837310372 +0000 UTC m=+49.682505856 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-tls") pod "image-registry-8477bf8587-zfnhn" (UID: "dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55") : secret "image-registry-tls" not found
Apr 21 10:04:23.837731 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:23.837301 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 10:04:23.837731 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:23.837418 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert podName:2857b675-4470-427f-a3d7-94390418dee9 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:31.837396918 +0000 UTC m=+49.682592407 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert") pod "ingress-canary-v9jx5" (UID: "2857b675-4470-427f-a3d7-94390418dee9") : secret "canary-serving-cert" not found
Apr 21 10:04:23.928411 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:23.928378 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-mg976" event={"ID":"79560f0b-3eeb-4ad8-9eca-ac25ba5bf424","Type":"ContainerStarted","Data":"ea14f1292fd1238839eeab1d11f6485aed97a7a5cfb32237cf7643c391ac14cf"}
Apr 21 10:04:25.436623 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:25.436592 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-575d65cc55-wpzdx"]
Apr 21 10:04:25.440074 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:25.440053 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-575d65cc55-wpzdx"
Apr 21 10:04:25.443765 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:25.443741 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 21 10:04:25.443890 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:25.443862 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-pb52s\""
Apr 21 10:04:25.443890 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:25.443873 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 21 10:04:25.444003 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:25.443975 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 21 10:04:25.444111 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:25.444091 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 21 10:04:25.448614 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:25.448079 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f8b496f-vw92k"]
Apr 21 10:04:25.451763 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:25.451720 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-575d65cc55-wpzdx"]
Apr 21 10:04:25.451863 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:25.451830 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f8b496f-vw92k"
Apr 21 10:04:25.454090 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:25.454074 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 21 10:04:25.461208 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:25.461186 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f8b496f-vw92k"]
Apr 21 10:04:25.550823 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:25.550797 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/41368385-0ad1-4320-855c-4961ee5a4480-tmp\") pod \"klusterlet-addon-workmgr-f8b496f-vw92k\" (UID: \"41368385-0ad1-4320-855c-4961ee5a4480\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f8b496f-vw92k"
Apr 21 10:04:25.550950 ip-10-0-140-231 kubenswrapper[2567]: I0421
10:04:25.550830 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/41368385-0ad1-4320-855c-4961ee5a4480-klusterlet-config\") pod \"klusterlet-addon-workmgr-f8b496f-vw92k\" (UID: \"41368385-0ad1-4320-855c-4961ee5a4480\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f8b496f-vw92k" Apr 21 10:04:25.550950 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:25.550872 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skx8q\" (UniqueName: \"kubernetes.io/projected/2914aaf1-22bc-4941-9622-0eca47323b36-kube-api-access-skx8q\") pod \"managed-serviceaccount-addon-agent-575d65cc55-wpzdx\" (UID: \"2914aaf1-22bc-4941-9622-0eca47323b36\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-575d65cc55-wpzdx" Apr 21 10:04:25.550950 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:25.550935 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2914aaf1-22bc-4941-9622-0eca47323b36-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-575d65cc55-wpzdx\" (UID: \"2914aaf1-22bc-4941-9622-0eca47323b36\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-575d65cc55-wpzdx" Apr 21 10:04:25.551115 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:25.551052 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbf8w\" (UniqueName: \"kubernetes.io/projected/41368385-0ad1-4320-855c-4961ee5a4480-kube-api-access-nbf8w\") pod \"klusterlet-addon-workmgr-f8b496f-vw92k\" (UID: \"41368385-0ad1-4320-855c-4961ee5a4480\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f8b496f-vw92k" Apr 21 10:04:25.651490 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:25.651455 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nbf8w\" (UniqueName: \"kubernetes.io/projected/41368385-0ad1-4320-855c-4961ee5a4480-kube-api-access-nbf8w\") pod \"klusterlet-addon-workmgr-f8b496f-vw92k\" (UID: \"41368385-0ad1-4320-855c-4961ee5a4480\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f8b496f-vw92k" Apr 21 10:04:25.651621 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:25.651513 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/41368385-0ad1-4320-855c-4961ee5a4480-tmp\") pod \"klusterlet-addon-workmgr-f8b496f-vw92k\" (UID: \"41368385-0ad1-4320-855c-4961ee5a4480\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f8b496f-vw92k" Apr 21 10:04:25.651621 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:25.651544 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/41368385-0ad1-4320-855c-4961ee5a4480-klusterlet-config\") pod \"klusterlet-addon-workmgr-f8b496f-vw92k\" (UID: \"41368385-0ad1-4320-855c-4961ee5a4480\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f8b496f-vw92k" Apr 21 10:04:25.651621 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:25.651597 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skx8q\" (UniqueName: \"kubernetes.io/projected/2914aaf1-22bc-4941-9622-0eca47323b36-kube-api-access-skx8q\") pod \"managed-serviceaccount-addon-agent-575d65cc55-wpzdx\" (UID: \"2914aaf1-22bc-4941-9622-0eca47323b36\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-575d65cc55-wpzdx" Apr 21 10:04:25.651840 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:25.651629 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/2914aaf1-22bc-4941-9622-0eca47323b36-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-575d65cc55-wpzdx\" (UID: \"2914aaf1-22bc-4941-9622-0eca47323b36\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-575d65cc55-wpzdx" Apr 21 10:04:25.651986 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:25.651959 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/41368385-0ad1-4320-855c-4961ee5a4480-tmp\") pod \"klusterlet-addon-workmgr-f8b496f-vw92k\" (UID: \"41368385-0ad1-4320-855c-4961ee5a4480\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f8b496f-vw92k" Apr 21 10:04:25.654319 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:25.654298 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/41368385-0ad1-4320-855c-4961ee5a4480-klusterlet-config\") pod \"klusterlet-addon-workmgr-f8b496f-vw92k\" (UID: \"41368385-0ad1-4320-855c-4961ee5a4480\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f8b496f-vw92k" Apr 21 10:04:25.654422 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:25.654304 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2914aaf1-22bc-4941-9622-0eca47323b36-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-575d65cc55-wpzdx\" (UID: \"2914aaf1-22bc-4941-9622-0eca47323b36\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-575d65cc55-wpzdx" Apr 21 10:04:25.659976 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:25.659931 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-skx8q\" (UniqueName: \"kubernetes.io/projected/2914aaf1-22bc-4941-9622-0eca47323b36-kube-api-access-skx8q\") pod \"managed-serviceaccount-addon-agent-575d65cc55-wpzdx\" (UID: 
\"2914aaf1-22bc-4941-9622-0eca47323b36\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-575d65cc55-wpzdx" Apr 21 10:04:25.660093 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:25.660056 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbf8w\" (UniqueName: \"kubernetes.io/projected/41368385-0ad1-4320-855c-4961ee5a4480-kube-api-access-nbf8w\") pod \"klusterlet-addon-workmgr-f8b496f-vw92k\" (UID: \"41368385-0ad1-4320-855c-4961ee5a4480\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f8b496f-vw92k" Apr 21 10:04:25.760603 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:25.760546 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-575d65cc55-wpzdx" Apr 21 10:04:25.770213 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:25.770186 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f8b496f-vw92k" Apr 21 10:04:26.987948 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:26.987919 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f8b496f-vw92k"] Apr 21 10:04:26.991581 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:04:26.991537 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41368385_0ad1_4320_855c_4961ee5a4480.slice/crio-af20f60dae5edee153b42773652026da9d71f9f6938ee3f685be3c1c9719f700 WatchSource:0}: Error finding container af20f60dae5edee153b42773652026da9d71f9f6938ee3f685be3c1c9719f700: Status 404 returned error can't find the container with id af20f60dae5edee153b42773652026da9d71f9f6938ee3f685be3c1c9719f700 Apr 21 10:04:27.002668 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:27.002625 2567 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-575d65cc55-wpzdx"] Apr 21 10:04:27.006581 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:04:27.006550 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2914aaf1_22bc_4941_9622_0eca47323b36.slice/crio-5972b6efab2f0dbeb8f1473ad2152ee41b95bd2865b75c1ea39bda653869717e WatchSource:0}: Error finding container 5972b6efab2f0dbeb8f1473ad2152ee41b95bd2865b75c1ea39bda653869717e: Status 404 returned error can't find the container with id 5972b6efab2f0dbeb8f1473ad2152ee41b95bd2865b75c1ea39bda653869717e Apr 21 10:04:27.938052 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:27.938011 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f8b496f-vw92k" event={"ID":"41368385-0ad1-4320-855c-4961ee5a4480","Type":"ContainerStarted","Data":"af20f60dae5edee153b42773652026da9d71f9f6938ee3f685be3c1c9719f700"} Apr 21 10:04:27.939564 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:27.939495 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-mg976" event={"ID":"79560f0b-3eeb-4ad8-9eca-ac25ba5bf424","Type":"ContainerStarted","Data":"190a046e39939c1a4e6b2061afe88759085d8f7ddec3aeae0e888252023e0480"} Apr 21 10:04:27.940826 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:27.940787 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-575d65cc55-wpzdx" event={"ID":"2914aaf1-22bc-4941-9622-0eca47323b36","Type":"ContainerStarted","Data":"5972b6efab2f0dbeb8f1473ad2152ee41b95bd2865b75c1ea39bda653869717e"} Apr 21 10:04:27.955656 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:27.955381 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-mg976" podStartSLOduration=33.479155481 
podStartE2EDuration="36.955363839s" podCreationTimestamp="2026-04-21 10:03:51 +0000 UTC" firstStartedPulling="2026-04-21 10:04:23.453763533 +0000 UTC m=+41.298959017" lastFinishedPulling="2026-04-21 10:04:26.929971877 +0000 UTC m=+44.775167375" observedRunningTime="2026-04-21 10:04:27.954792471 +0000 UTC m=+45.799987978" watchObservedRunningTime="2026-04-21 10:04:27.955363839 +0000 UTC m=+45.800559348" Apr 21 10:04:29.946491 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:29.946449 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-575d65cc55-wpzdx" event={"ID":"2914aaf1-22bc-4941-9622-0eca47323b36","Type":"ContainerStarted","Data":"efc5047acf63cae3123311f6b7e9b9ec5173ed8eb3d04bf3356233a07c3ba88b"} Apr 21 10:04:29.974152 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:29.974104 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-575d65cc55-wpzdx" podStartSLOduration=2.223954323 podStartE2EDuration="4.974090053s" podCreationTimestamp="2026-04-21 10:04:25 +0000 UTC" firstStartedPulling="2026-04-21 10:04:27.00990937 +0000 UTC m=+44.855104860" lastFinishedPulling="2026-04-21 10:04:29.760045092 +0000 UTC m=+47.605240590" observedRunningTime="2026-04-21 10:04:29.973688155 +0000 UTC m=+47.818883660" watchObservedRunningTime="2026-04-21 10:04:29.974090053 +0000 UTC m=+47.819285832" Apr 21 10:04:31.901642 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:31.901567 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-tls\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn" Apr 21 10:04:31.901642 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:31.901633 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert\") pod \"ingress-canary-v9jx5\" (UID: \"2857b675-4470-427f-a3d7-94390418dee9\") " pod="openshift-ingress-canary/ingress-canary-v9jx5" Apr 21 10:04:31.902154 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:31.901659 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls\") pod \"dns-default-sznwb\" (UID: \"a413cb28-d70b-44b6-a527-03a5247fa66a\") " pod="openshift-dns/dns-default-sznwb" Apr 21 10:04:31.902154 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:31.901736 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:04:31.902154 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:31.901747 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:04:31.902154 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:31.901841 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 10:04:31.902154 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:31.901859 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8477bf8587-zfnhn: secret "image-registry-tls" not found Apr 21 10:04:31.902154 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:31.901795 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert podName:2857b675-4470-427f-a3d7-94390418dee9 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:47.901779625 +0000 UTC m=+65.746975108 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert") pod "ingress-canary-v9jx5" (UID: "2857b675-4470-427f-a3d7-94390418dee9") : secret "canary-serving-cert" not found Apr 21 10:04:31.902154 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:31.901913 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls podName:a413cb28-d70b-44b6-a527-03a5247fa66a nodeName:}" failed. No retries permitted until 2026-04-21 10:04:47.90189693 +0000 UTC m=+65.747092415 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls") pod "dns-default-sznwb" (UID: "a413cb28-d70b-44b6-a527-03a5247fa66a") : secret "dns-default-metrics-tls" not found Apr 21 10:04:31.902154 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:31.901923 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-tls podName:dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:47.90191781 +0000 UTC m=+65.747113294 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-tls") pod "image-registry-8477bf8587-zfnhn" (UID: "dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55") : secret "image-registry-tls" not found Apr 21 10:04:31.951281 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:31.951230 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f8b496f-vw92k" event={"ID":"41368385-0ad1-4320-855c-4961ee5a4480","Type":"ContainerStarted","Data":"a9c5c0b2f3fb2cbe21ee067093eb485d948c6c2107db2f0d7479228dab8755c3"} Apr 21 10:04:31.951469 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:31.951452 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f8b496f-vw92k" Apr 21 10:04:31.952846 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:31.952823 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f8b496f-vw92k" Apr 21 10:04:31.981075 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:31.981036 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f8b496f-vw92k" podStartSLOduration=2.479397551 podStartE2EDuration="6.981020478s" podCreationTimestamp="2026-04-21 10:04:25 +0000 UTC" firstStartedPulling="2026-04-21 10:04:26.993406818 +0000 UTC m=+44.838602303" lastFinishedPulling="2026-04-21 10:04:31.495029733 +0000 UTC m=+49.340225230" observedRunningTime="2026-04-21 10:04:31.966315051 +0000 UTC m=+49.811510557" watchObservedRunningTime="2026-04-21 10:04:31.981020478 +0000 UTC m=+49.826216018" Apr 21 10:04:40.905186 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:40.905147 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tqctk" 
Apr 21 10:04:47.917993 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:47.917955 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert\") pod \"ingress-canary-v9jx5\" (UID: \"2857b675-4470-427f-a3d7-94390418dee9\") " pod="openshift-ingress-canary/ingress-canary-v9jx5" Apr 21 10:04:47.918367 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:47.917998 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls\") pod \"dns-default-sznwb\" (UID: \"a413cb28-d70b-44b6-a527-03a5247fa66a\") " pod="openshift-dns/dns-default-sznwb" Apr 21 10:04:47.918367 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:47.918040 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-tls\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn" Apr 21 10:04:47.918367 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:47.918129 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:04:47.918367 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:47.918196 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:04:47.918367 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:47.918210 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert podName:2857b675-4470-427f-a3d7-94390418dee9 nodeName:}" failed. No retries permitted until 2026-04-21 10:05:19.91819198 +0000 UTC m=+97.763387463 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert") pod "ingress-canary-v9jx5" (UID: "2857b675-4470-427f-a3d7-94390418dee9") : secret "canary-serving-cert" not found Apr 21 10:04:47.918367 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:47.918133 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 10:04:47.918367 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:47.918231 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8477bf8587-zfnhn: secret "image-registry-tls" not found Apr 21 10:04:47.918367 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:47.918258 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls podName:a413cb28-d70b-44b6-a527-03a5247fa66a nodeName:}" failed. No retries permitted until 2026-04-21 10:05:19.918237362 +0000 UTC m=+97.763432863 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls") pod "dns-default-sznwb" (UID: "a413cb28-d70b-44b6-a527-03a5247fa66a") : secret "dns-default-metrics-tls" not found Apr 21 10:04:47.918367 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:47.918276 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-tls podName:dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55 nodeName:}" failed. No retries permitted until 2026-04-21 10:05:19.918265806 +0000 UTC m=+97.763461304 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-tls") pod "image-registry-8477bf8587-zfnhn" (UID: "dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55") : secret "image-registry-tls" not found Apr 21 10:04:48.522251 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:48.522219 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs\") pod \"network-metrics-daemon-nwsw4\" (UID: \"dbb00fc1-1258-4254-a360-3c350554925b\") " pod="openshift-multus/network-metrics-daemon-nwsw4" Apr 21 10:04:48.522400 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:48.522308 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dncx\" (UniqueName: \"kubernetes.io/projected/5f39fe1c-70f8-4445-8cfd-646cb496d498-kube-api-access-6dncx\") pod \"network-check-target-jzrcz\" (UID: \"5f39fe1c-70f8-4445-8cfd-646cb496d498\") " pod="openshift-network-diagnostics/network-check-target-jzrcz" Apr 21 10:04:48.524916 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:48.524897 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 10:04:48.524984 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:48.524931 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 10:04:48.532993 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:48.532973 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 10:04:48.533093 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:04:48.533050 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs 
podName:dbb00fc1-1258-4254-a360-3c350554925b nodeName:}" failed. No retries permitted until 2026-04-21 10:05:52.533030207 +0000 UTC m=+130.378225696 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs") pod "network-metrics-daemon-nwsw4" (UID: "dbb00fc1-1258-4254-a360-3c350554925b") : secret "metrics-daemon-secret" not found Apr 21 10:04:48.535401 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:48.535386 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 10:04:48.545254 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:48.545233 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dncx\" (UniqueName: \"kubernetes.io/projected/5f39fe1c-70f8-4445-8cfd-646cb496d498-kube-api-access-6dncx\") pod \"network-check-target-jzrcz\" (UID: \"5f39fe1c-70f8-4445-8cfd-646cb496d498\") " pod="openshift-network-diagnostics/network-check-target-jzrcz" Apr 21 10:04:48.623267 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:48.623245 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-bjzzq\"" Apr 21 10:04:48.631220 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:48.631206 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jzrcz" Apr 21 10:04:48.741655 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:48.741621 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jzrcz"] Apr 21 10:04:48.744319 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:04:48.744297 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f39fe1c_70f8_4445_8cfd_646cb496d498.slice/crio-8b223ac2e425ba3f8ef0b32295dcb803efbbda8047382c8c4f47b8b19f5a2d90 WatchSource:0}: Error finding container 8b223ac2e425ba3f8ef0b32295dcb803efbbda8047382c8c4f47b8b19f5a2d90: Status 404 returned error can't find the container with id 8b223ac2e425ba3f8ef0b32295dcb803efbbda8047382c8c4f47b8b19f5a2d90 Apr 21 10:04:48.981156 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:48.981122 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jzrcz" event={"ID":"5f39fe1c-70f8-4445-8cfd-646cb496d498","Type":"ContainerStarted","Data":"8b223ac2e425ba3f8ef0b32295dcb803efbbda8047382c8c4f47b8b19f5a2d90"} Apr 21 10:04:51.988608 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:51.988569 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jzrcz" event={"ID":"5f39fe1c-70f8-4445-8cfd-646cb496d498","Type":"ContainerStarted","Data":"ee1759ac38af567af4ded11fb736675ab8a011ffaab8796a63a55ed821bad235"} Apr 21 10:04:51.989019 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:51.988698 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-jzrcz" Apr 21 10:04:52.006645 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:04:52.006596 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-jzrcz" 
podStartSLOduration=67.503398278 podStartE2EDuration="1m10.006580961s" podCreationTimestamp="2026-04-21 10:03:42 +0000 UTC" firstStartedPulling="2026-04-21 10:04:48.746805659 +0000 UTC m=+66.592001142" lastFinishedPulling="2026-04-21 10:04:51.249988328 +0000 UTC m=+69.095183825" observedRunningTime="2026-04-21 10:04:52.005606363 +0000 UTC m=+69.850801869" watchObservedRunningTime="2026-04-21 10:04:52.006580961 +0000 UTC m=+69.851776458" Apr 21 10:05:19.929689 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:05:19.929651 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert\") pod \"ingress-canary-v9jx5\" (UID: \"2857b675-4470-427f-a3d7-94390418dee9\") " pod="openshift-ingress-canary/ingress-canary-v9jx5" Apr 21 10:05:19.929689 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:05:19.929693 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls\") pod \"dns-default-sznwb\" (UID: \"a413cb28-d70b-44b6-a527-03a5247fa66a\") " pod="openshift-dns/dns-default-sznwb" Apr 21 10:05:19.930097 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:05:19.929731 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-tls\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn" Apr 21 10:05:19.930097 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:05:19.929805 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:05:19.930097 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:05:19.929864 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: 
secret "dns-default-metrics-tls" not found Apr 21 10:05:19.930097 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:05:19.929810 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 10:05:19.930097 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:05:19.929898 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8477bf8587-zfnhn: secret "image-registry-tls" not found Apr 21 10:05:19.930097 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:05:19.929874 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert podName:2857b675-4470-427f-a3d7-94390418dee9 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:23.929859179 +0000 UTC m=+161.775054669 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert") pod "ingress-canary-v9jx5" (UID: "2857b675-4470-427f-a3d7-94390418dee9") : secret "canary-serving-cert" not found Apr 21 10:05:19.930097 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:05:19.929951 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls podName:a413cb28-d70b-44b6-a527-03a5247fa66a nodeName:}" failed. No retries permitted until 2026-04-21 10:06:23.929928911 +0000 UTC m=+161.775124403 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls") pod "dns-default-sznwb" (UID: "a413cb28-d70b-44b6-a527-03a5247fa66a") : secret "dns-default-metrics-tls" not found Apr 21 10:05:19.930097 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:05:19.929965 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-tls podName:dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:23.929958873 +0000 UTC m=+161.775154356 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-tls") pod "image-registry-8477bf8587-zfnhn" (UID: "dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55") : secret "image-registry-tls" not found Apr 21 10:05:22.992580 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:05:22.992549 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-jzrcz" Apr 21 10:05:52.550421 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:05:52.550381 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs\") pod \"network-metrics-daemon-nwsw4\" (UID: \"dbb00fc1-1258-4254-a360-3c350554925b\") " pod="openshift-multus/network-metrics-daemon-nwsw4" Apr 21 10:05:52.550882 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:05:52.550518 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 10:05:52.550882 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:05:52.550585 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs 
podName:dbb00fc1-1258-4254-a360-3c350554925b nodeName:}" failed. No retries permitted until 2026-04-21 10:07:54.550570392 +0000 UTC m=+252.395765876 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs") pod "network-metrics-daemon-nwsw4" (UID: "dbb00fc1-1258-4254-a360-3c350554925b") : secret "metrics-daemon-secret" not found Apr 21 10:06:13.166455 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:13.166421 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-svqbc"] Apr 21 10:06:13.169103 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:13.169084 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-svqbc" Apr 21 10:06:13.171760 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:13.171739 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-p54hj\"" Apr 21 10:06:13.182772 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:13.182752 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-svqbc"] Apr 21 10:06:13.293812 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:13.293792 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j2mz\" (UniqueName: \"kubernetes.io/projected/592d16c7-dbbe-4301-a523-7a9d396a1b51-kube-api-access-9j2mz\") pod \"network-check-source-8894fc9bd-svqbc\" (UID: \"592d16c7-dbbe-4301-a523-7a9d396a1b51\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-svqbc" Apr 21 10:06:13.394298 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:13.394274 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9j2mz\" 
(UniqueName: \"kubernetes.io/projected/592d16c7-dbbe-4301-a523-7a9d396a1b51-kube-api-access-9j2mz\") pod \"network-check-source-8894fc9bd-svqbc\" (UID: \"592d16c7-dbbe-4301-a523-7a9d396a1b51\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-svqbc" Apr 21 10:06:13.403200 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:13.403179 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j2mz\" (UniqueName: \"kubernetes.io/projected/592d16c7-dbbe-4301-a523-7a9d396a1b51-kube-api-access-9j2mz\") pod \"network-check-source-8894fc9bd-svqbc\" (UID: \"592d16c7-dbbe-4301-a523-7a9d396a1b51\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-svqbc" Apr 21 10:06:13.478244 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:13.478184 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-svqbc" Apr 21 10:06:13.591357 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:13.591328 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-svqbc"] Apr 21 10:06:13.594729 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:06:13.594697 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod592d16c7_dbbe_4301_a523_7a9d396a1b51.slice/crio-d74a2d50ef61679f66d5600ba8c50d14759e7d3d021f2bc3d68c60ee262ecdac WatchSource:0}: Error finding container d74a2d50ef61679f66d5600ba8c50d14759e7d3d021f2bc3d68c60ee262ecdac: Status 404 returned error can't find the container with id d74a2d50ef61679f66d5600ba8c50d14759e7d3d021f2bc3d68c60ee262ecdac Apr 21 10:06:14.144213 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:14.144163 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-svqbc" 
event={"ID":"592d16c7-dbbe-4301-a523-7a9d396a1b51","Type":"ContainerStarted","Data":"1a0c0b0dbb7c445424ea3f449127754da473d75548aa1d69effc5de55fe711c9"} Apr 21 10:06:14.144213 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:14.144209 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-svqbc" event={"ID":"592d16c7-dbbe-4301-a523-7a9d396a1b51","Type":"ContainerStarted","Data":"d74a2d50ef61679f66d5600ba8c50d14759e7d3d021f2bc3d68c60ee262ecdac"} Apr 21 10:06:14.159476 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:14.159435 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-svqbc" podStartSLOduration=1.159421144 podStartE2EDuration="1.159421144s" podCreationTimestamp="2026-04-21 10:06:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:06:14.158676849 +0000 UTC m=+152.003872367" watchObservedRunningTime="2026-04-21 10:06:14.159421144 +0000 UTC m=+152.004616672" Apr 21 10:06:17.564930 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:17.564894 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-nxvlf"] Apr 21 10:06:17.568065 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:17.568047 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-nxvlf" Apr 21 10:06:17.570759 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:17.570728 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-lxnz2\"" Apr 21 10:06:17.570759 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:17.570728 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 21 10:06:17.570955 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:17.570738 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 21 10:06:17.577271 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:17.577243 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-nxvlf"] Apr 21 10:06:17.722621 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:17.722594 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsndm\" (UniqueName: \"kubernetes.io/projected/8a442b1e-fc34-4b46-94e0-354888afd597-kube-api-access-hsndm\") pod \"migrator-74bb7799d9-nxvlf\" (UID: \"8a442b1e-fc34-4b46-94e0-354888afd597\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-nxvlf" Apr 21 10:06:17.823215 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:17.823147 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hsndm\" (UniqueName: \"kubernetes.io/projected/8a442b1e-fc34-4b46-94e0-354888afd597-kube-api-access-hsndm\") pod \"migrator-74bb7799d9-nxvlf\" (UID: \"8a442b1e-fc34-4b46-94e0-354888afd597\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-nxvlf" Apr 21 10:06:17.831302 ip-10-0-140-231 
kubenswrapper[2567]: I0421 10:06:17.831271 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsndm\" (UniqueName: \"kubernetes.io/projected/8a442b1e-fc34-4b46-94e0-354888afd597-kube-api-access-hsndm\") pod \"migrator-74bb7799d9-nxvlf\" (UID: \"8a442b1e-fc34-4b46-94e0-354888afd597\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-nxvlf" Apr 21 10:06:17.877557 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:17.877527 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-nxvlf" Apr 21 10:06:18.004199 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:18.004157 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-nxvlf"] Apr 21 10:06:18.006146 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:06:18.006111 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a442b1e_fc34_4b46_94e0_354888afd597.slice/crio-268197b65723a70d8c5a73b09d5859c2a30e69406250131f10d2ed8218d12a1b WatchSource:0}: Error finding container 268197b65723a70d8c5a73b09d5859c2a30e69406250131f10d2ed8218d12a1b: Status 404 returned error can't find the container with id 268197b65723a70d8c5a73b09d5859c2a30e69406250131f10d2ed8218d12a1b Apr 21 10:06:18.154039 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:18.154003 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-nxvlf" event={"ID":"8a442b1e-fc34-4b46-94e0-354888afd597","Type":"ContainerStarted","Data":"268197b65723a70d8c5a73b09d5859c2a30e69406250131f10d2ed8218d12a1b"} Apr 21 10:06:19.041493 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:06:19.041448 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context 
deadline exceeded" pod="openshift-image-registry/image-registry-8477bf8587-zfnhn" podUID="dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55" Apr 21 10:06:19.054327 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:06:19.054287 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-sznwb" podUID="a413cb28-d70b-44b6-a527-03a5247fa66a" Apr 21 10:06:19.074400 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:06:19.074359 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-v9jx5" podUID="2857b675-4470-427f-a3d7-94390418dee9" Apr 21 10:06:19.156582 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:19.156554 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8477bf8587-zfnhn" Apr 21 10:06:19.156790 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:19.156768 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-sznwb" Apr 21 10:06:19.721609 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:06:19.721563 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-nwsw4" podUID="dbb00fc1-1258-4254-a360-3c350554925b" Apr 21 10:06:20.160102 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:20.160060 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-nxvlf" event={"ID":"8a442b1e-fc34-4b46-94e0-354888afd597","Type":"ContainerStarted","Data":"f5c5ab59e07b96ea9190d30d1dde5a10b1d36999fd351709f3c9c8111889024f"} Apr 21 10:06:20.160462 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:20.160106 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-nxvlf" event={"ID":"8a442b1e-fc34-4b46-94e0-354888afd597","Type":"ContainerStarted","Data":"8e1eee6f2ed94945d0726d93df8717b24a0585fdcaff11923dfc3cc5995256ca"} Apr 21 10:06:20.176807 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:20.176735 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-nxvlf" podStartSLOduration=1.9675795470000002 podStartE2EDuration="3.176720969s" podCreationTimestamp="2026-04-21 10:06:17 +0000 UTC" firstStartedPulling="2026-04-21 10:06:18.007882586 +0000 UTC m=+155.853078084" lastFinishedPulling="2026-04-21 10:06:19.217024022 +0000 UTC m=+157.062219506" observedRunningTime="2026-04-21 10:06:20.176094779 +0000 UTC m=+158.021290290" watchObservedRunningTime="2026-04-21 10:06:20.176720969 +0000 UTC m=+158.021916474" Apr 21 10:06:21.428533 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:21.428505 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_node-resolver-s97qn_22eb8c40-ac2f-42ed-897c-2c7e11b8588c/dns-node-resolver/0.log" Apr 21 10:06:22.428852 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:22.428823 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wgkxz_f5719a9a-0eff-48f5-b634-e4d0a7216828/node-ca/0.log" Apr 21 10:06:23.428601 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:23.428568 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-nxvlf_8a442b1e-fc34-4b46-94e0-354888afd597/migrator/0.log" Apr 21 10:06:23.627296 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:23.627273 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-nxvlf_8a442b1e-fc34-4b46-94e0-354888afd597/graceful-termination/0.log" Apr 21 10:06:23.963299 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:23.963272 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-tls\") pod \"image-registry-8477bf8587-zfnhn\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") " pod="openshift-image-registry/image-registry-8477bf8587-zfnhn" Apr 21 10:06:23.963460 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:23.963322 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert\") pod \"ingress-canary-v9jx5\" (UID: \"2857b675-4470-427f-a3d7-94390418dee9\") " pod="openshift-ingress-canary/ingress-canary-v9jx5" Apr 21 10:06:23.963460 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:23.963346 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls\") pod \"dns-default-sznwb\" 
(UID: \"a413cb28-d70b-44b6-a527-03a5247fa66a\") " pod="openshift-dns/dns-default-sznwb" Apr 21 10:06:23.963460 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:06:23.963411 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 10:06:23.963460 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:06:23.963421 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:06:23.963460 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:06:23.963428 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8477bf8587-zfnhn: secret "image-registry-tls" not found Apr 21 10:06:23.963669 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:06:23.963470 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls podName:a413cb28-d70b-44b6-a527-03a5247fa66a nodeName:}" failed. No retries permitted until 2026-04-21 10:08:25.963457616 +0000 UTC m=+283.808653105 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls") pod "dns-default-sznwb" (UID: "a413cb28-d70b-44b6-a527-03a5247fa66a") : secret "dns-default-metrics-tls" not found Apr 21 10:06:23.963669 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:06:23.963483 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-tls podName:dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55 nodeName:}" failed. No retries permitted until 2026-04-21 10:08:25.963477087 +0000 UTC m=+283.808672571 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-tls") pod "image-registry-8477bf8587-zfnhn" (UID: "dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55") : secret "image-registry-tls" not found Apr 21 10:06:23.963669 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:06:23.963498 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:06:23.963669 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:06:23.963559 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert podName:2857b675-4470-427f-a3d7-94390418dee9 nodeName:}" failed. No retries permitted until 2026-04-21 10:08:25.963541461 +0000 UTC m=+283.808736948 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert") pod "ingress-canary-v9jx5" (UID: "2857b675-4470-427f-a3d7-94390418dee9") : secret "canary-serving-cert" not found Apr 21 10:06:29.709988 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:29.709958 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-v9jx5" Apr 21 10:06:30.187880 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:30.187851 2567 generic.go:358] "Generic (PLEG): container finished" podID="2914aaf1-22bc-4941-9622-0eca47323b36" containerID="efc5047acf63cae3123311f6b7e9b9ec5173ed8eb3d04bf3356233a07c3ba88b" exitCode=255 Apr 21 10:06:30.188006 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:30.187911 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-575d65cc55-wpzdx" event={"ID":"2914aaf1-22bc-4941-9622-0eca47323b36","Type":"ContainerDied","Data":"efc5047acf63cae3123311f6b7e9b9ec5173ed8eb3d04bf3356233a07c3ba88b"} Apr 21 10:06:30.193527 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:30.193510 2567 scope.go:117] "RemoveContainer" containerID="efc5047acf63cae3123311f6b7e9b9ec5173ed8eb3d04bf3356233a07c3ba88b" Apr 21 10:06:31.191720 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:31.191691 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-575d65cc55-wpzdx" event={"ID":"2914aaf1-22bc-4941-9622-0eca47323b36","Type":"ContainerStarted","Data":"d2161682dce55c9b267227437ca8cb8591dca42f7b94d2a45b9d42e9726e0842"} Apr 21 10:06:31.952008 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:31.951949 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f8b496f-vw92k" podUID="41368385-0ad1-4320-855c-4961ee5a4480" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.10:8000/readyz\": dial tcp 10.132.0.10:8000: connect: connection refused" Apr 21 10:06:32.195474 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:32.195443 2567 generic.go:358] "Generic (PLEG): container finished" podID="41368385-0ad1-4320-855c-4961ee5a4480" containerID="a9c5c0b2f3fb2cbe21ee067093eb485d948c6c2107db2f0d7479228dab8755c3" 
exitCode=1 Apr 21 10:06:32.195795 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:32.195504 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f8b496f-vw92k" event={"ID":"41368385-0ad1-4320-855c-4961ee5a4480","Type":"ContainerDied","Data":"a9c5c0b2f3fb2cbe21ee067093eb485d948c6c2107db2f0d7479228dab8755c3"} Apr 21 10:06:32.195795 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:32.195764 2567 scope.go:117] "RemoveContainer" containerID="a9c5c0b2f3fb2cbe21ee067093eb485d948c6c2107db2f0d7479228dab8755c3" Apr 21 10:06:32.711507 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:32.711478 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwsw4" Apr 21 10:06:33.199679 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:33.199649 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f8b496f-vw92k" event={"ID":"41368385-0ad1-4320-855c-4961ee5a4480","Type":"ContainerStarted","Data":"9849afd79d61d6501e988c8ba90081a4a5eda8179318ba997d312039577e5195"} Apr 21 10:06:33.200064 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:33.199930 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f8b496f-vw92k" Apr 21 10:06:33.200477 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:33.200460 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f8b496f-vw92k" Apr 21 10:06:38.295735 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:38.295700 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-wp4c9"] Apr 21 10:06:38.299266 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:38.299243 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wp4c9" Apr 21 10:06:38.301806 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:38.301787 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 10:06:38.302904 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:38.302881 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 10:06:38.303002 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:38.302902 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 10:06:38.303002 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:38.302886 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-wh5gv\"" Apr 21 10:06:38.303002 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:38.302889 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 10:06:38.310484 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:38.310465 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wp4c9"] Apr 21 10:06:38.463071 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:38.463044 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/39147b7c-8d16-4801-8f60-a0cc5afd65e4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wp4c9\" (UID: \"39147b7c-8d16-4801-8f60-a0cc5afd65e4\") " pod="openshift-insights/insights-runtime-extractor-wp4c9" Apr 21 10:06:38.463211 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:38.463079 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/39147b7c-8d16-4801-8f60-a0cc5afd65e4-data-volume\") pod \"insights-runtime-extractor-wp4c9\" (UID: \"39147b7c-8d16-4801-8f60-a0cc5afd65e4\") " pod="openshift-insights/insights-runtime-extractor-wp4c9" Apr 21 10:06:38.463211 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:38.463112 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/39147b7c-8d16-4801-8f60-a0cc5afd65e4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wp4c9\" (UID: \"39147b7c-8d16-4801-8f60-a0cc5afd65e4\") " pod="openshift-insights/insights-runtime-extractor-wp4c9" Apr 21 10:06:38.463211 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:38.463189 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/39147b7c-8d16-4801-8f60-a0cc5afd65e4-crio-socket\") pod \"insights-runtime-extractor-wp4c9\" (UID: \"39147b7c-8d16-4801-8f60-a0cc5afd65e4\") " pod="openshift-insights/insights-runtime-extractor-wp4c9" Apr 21 10:06:38.463352 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:38.463216 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntn47\" (UniqueName: \"kubernetes.io/projected/39147b7c-8d16-4801-8f60-a0cc5afd65e4-kube-api-access-ntn47\") pod \"insights-runtime-extractor-wp4c9\" (UID: \"39147b7c-8d16-4801-8f60-a0cc5afd65e4\") " pod="openshift-insights/insights-runtime-extractor-wp4c9" Apr 21 10:06:38.563956 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:38.563905 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/39147b7c-8d16-4801-8f60-a0cc5afd65e4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wp4c9\" (UID: \"39147b7c-8d16-4801-8f60-a0cc5afd65e4\") " 
pod="openshift-insights/insights-runtime-extractor-wp4c9" Apr 21 10:06:38.563956 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:38.563931 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/39147b7c-8d16-4801-8f60-a0cc5afd65e4-data-volume\") pod \"insights-runtime-extractor-wp4c9\" (UID: \"39147b7c-8d16-4801-8f60-a0cc5afd65e4\") " pod="openshift-insights/insights-runtime-extractor-wp4c9" Apr 21 10:06:38.563956 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:38.563953 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/39147b7c-8d16-4801-8f60-a0cc5afd65e4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wp4c9\" (UID: \"39147b7c-8d16-4801-8f60-a0cc5afd65e4\") " pod="openshift-insights/insights-runtime-extractor-wp4c9" Apr 21 10:06:38.564092 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:38.563979 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/39147b7c-8d16-4801-8f60-a0cc5afd65e4-crio-socket\") pod \"insights-runtime-extractor-wp4c9\" (UID: \"39147b7c-8d16-4801-8f60-a0cc5afd65e4\") " pod="openshift-insights/insights-runtime-extractor-wp4c9" Apr 21 10:06:38.564092 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:38.563996 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntn47\" (UniqueName: \"kubernetes.io/projected/39147b7c-8d16-4801-8f60-a0cc5afd65e4-kube-api-access-ntn47\") pod \"insights-runtime-extractor-wp4c9\" (UID: \"39147b7c-8d16-4801-8f60-a0cc5afd65e4\") " pod="openshift-insights/insights-runtime-extractor-wp4c9" Apr 21 10:06:38.564092 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:38.564058 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/39147b7c-8d16-4801-8f60-a0cc5afd65e4-crio-socket\") pod \"insights-runtime-extractor-wp4c9\" (UID: \"39147b7c-8d16-4801-8f60-a0cc5afd65e4\") " pod="openshift-insights/insights-runtime-extractor-wp4c9" Apr 21 10:06:38.564308 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:38.564289 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/39147b7c-8d16-4801-8f60-a0cc5afd65e4-data-volume\") pod \"insights-runtime-extractor-wp4c9\" (UID: \"39147b7c-8d16-4801-8f60-a0cc5afd65e4\") " pod="openshift-insights/insights-runtime-extractor-wp4c9" Apr 21 10:06:38.564501 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:38.564484 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/39147b7c-8d16-4801-8f60-a0cc5afd65e4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wp4c9\" (UID: \"39147b7c-8d16-4801-8f60-a0cc5afd65e4\") " pod="openshift-insights/insights-runtime-extractor-wp4c9" Apr 21 10:06:38.566316 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:38.566301 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/39147b7c-8d16-4801-8f60-a0cc5afd65e4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wp4c9\" (UID: \"39147b7c-8d16-4801-8f60-a0cc5afd65e4\") " pod="openshift-insights/insights-runtime-extractor-wp4c9" Apr 21 10:06:38.571619 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:38.571592 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntn47\" (UniqueName: \"kubernetes.io/projected/39147b7c-8d16-4801-8f60-a0cc5afd65e4-kube-api-access-ntn47\") pod \"insights-runtime-extractor-wp4c9\" (UID: \"39147b7c-8d16-4801-8f60-a0cc5afd65e4\") " pod="openshift-insights/insights-runtime-extractor-wp4c9" Apr 21 10:06:38.608356 ip-10-0-140-231 kubenswrapper[2567]: 
I0421 10:06:38.608334 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wp4c9"
Apr 21 10:06:38.719242 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:38.719216 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wp4c9"]
Apr 21 10:06:38.722097 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:06:38.722067 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39147b7c_8d16_4801_8f60_a0cc5afd65e4.slice/crio-8e608261952908814829890cc02ba163254c2b136f4197ae356323c8f35b91a4 WatchSource:0}: Error finding container 8e608261952908814829890cc02ba163254c2b136f4197ae356323c8f35b91a4: Status 404 returned error can't find the container with id 8e608261952908814829890cc02ba163254c2b136f4197ae356323c8f35b91a4
Apr 21 10:06:39.217155 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:39.217122 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wp4c9" event={"ID":"39147b7c-8d16-4801-8f60-a0cc5afd65e4","Type":"ContainerStarted","Data":"a22d0a00bdbff0573970abd026b7ea8fc34024164751f585bf90b6d5664df33a"}
Apr 21 10:06:39.217344 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:39.217182 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wp4c9" event={"ID":"39147b7c-8d16-4801-8f60-a0cc5afd65e4","Type":"ContainerStarted","Data":"8e608261952908814829890cc02ba163254c2b136f4197ae356323c8f35b91a4"}
Apr 21 10:06:40.221243 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:40.221204 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wp4c9" event={"ID":"39147b7c-8d16-4801-8f60-a0cc5afd65e4","Type":"ContainerStarted","Data":"2b102ce2c3974cfa607cf98cce834640161cffa8284574e7dad57916cdb57301"}
Apr 21 10:06:41.225210 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:41.225163 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wp4c9" event={"ID":"39147b7c-8d16-4801-8f60-a0cc5afd65e4","Type":"ContainerStarted","Data":"5e44d874067c7ff1cafcfa3751d383db9664dcaa759e3d0e32dbe3b93ee2ccc1"}
Apr 21 10:06:41.240905 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:41.240819 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-wp4c9" podStartSLOduration=1.018232195 podStartE2EDuration="3.240802374s" podCreationTimestamp="2026-04-21 10:06:38 +0000 UTC" firstStartedPulling="2026-04-21 10:06:38.778925068 +0000 UTC m=+176.624120553" lastFinishedPulling="2026-04-21 10:06:41.001495247 +0000 UTC m=+178.846690732" observedRunningTime="2026-04-21 10:06:41.240704077 +0000 UTC m=+179.085899583" watchObservedRunningTime="2026-04-21 10:06:41.240802374 +0000 UTC m=+179.085997880"
Apr 21 10:06:44.901604 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:44.901566 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-868f6794d6-kdp9r"]
Apr 21 10:06:44.904721 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:44.904701 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-868f6794d6-kdp9r"
Apr 21 10:06:44.908430 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:44.908403 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 21 10:06:44.908606 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:44.908500 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 21 10:06:44.908606 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:44.908527 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 21 10:06:44.908606 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:44.908520 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 21 10:06:44.908606 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:44.908544 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 21 10:06:44.908606 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:44.908575 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 21 10:06:44.908606 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:44.908544 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 21 10:06:44.908887 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:44.908520 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-2kpnx\""
Apr 21 10:06:44.914158 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:44.914133 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-868f6794d6-kdp9r"]
Apr 21 10:06:45.010273 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:45.010239 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c186129b-d8c2-40ef-93c9-1eae2c22123f-console-config\") pod \"console-868f6794d6-kdp9r\" (UID: \"c186129b-d8c2-40ef-93c9-1eae2c22123f\") " pod="openshift-console/console-868f6794d6-kdp9r"
Apr 21 10:06:45.010392 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:45.010279 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c186129b-d8c2-40ef-93c9-1eae2c22123f-oauth-serving-cert\") pod \"console-868f6794d6-kdp9r\" (UID: \"c186129b-d8c2-40ef-93c9-1eae2c22123f\") " pod="openshift-console/console-868f6794d6-kdp9r"
Apr 21 10:06:45.010392 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:45.010328 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dctz2\" (UniqueName: \"kubernetes.io/projected/c186129b-d8c2-40ef-93c9-1eae2c22123f-kube-api-access-dctz2\") pod \"console-868f6794d6-kdp9r\" (UID: \"c186129b-d8c2-40ef-93c9-1eae2c22123f\") " pod="openshift-console/console-868f6794d6-kdp9r"
Apr 21 10:06:45.010392 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:45.010363 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c186129b-d8c2-40ef-93c9-1eae2c22123f-console-oauth-config\") pod \"console-868f6794d6-kdp9r\" (UID: \"c186129b-d8c2-40ef-93c9-1eae2c22123f\") " pod="openshift-console/console-868f6794d6-kdp9r"
Apr 21 10:06:45.010510 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:45.010432 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c186129b-d8c2-40ef-93c9-1eae2c22123f-console-serving-cert\") pod \"console-868f6794d6-kdp9r\" (UID: \"c186129b-d8c2-40ef-93c9-1eae2c22123f\") " pod="openshift-console/console-868f6794d6-kdp9r"
Apr 21 10:06:45.010510 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:45.010452 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c186129b-d8c2-40ef-93c9-1eae2c22123f-service-ca\") pod \"console-868f6794d6-kdp9r\" (UID: \"c186129b-d8c2-40ef-93c9-1eae2c22123f\") " pod="openshift-console/console-868f6794d6-kdp9r"
Apr 21 10:06:45.110990 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:45.110964 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c186129b-d8c2-40ef-93c9-1eae2c22123f-console-serving-cert\") pod \"console-868f6794d6-kdp9r\" (UID: \"c186129b-d8c2-40ef-93c9-1eae2c22123f\") " pod="openshift-console/console-868f6794d6-kdp9r"
Apr 21 10:06:45.111112 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:45.111000 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c186129b-d8c2-40ef-93c9-1eae2c22123f-service-ca\") pod \"console-868f6794d6-kdp9r\" (UID: \"c186129b-d8c2-40ef-93c9-1eae2c22123f\") " pod="openshift-console/console-868f6794d6-kdp9r"
Apr 21 10:06:45.111112 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:45.111064 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c186129b-d8c2-40ef-93c9-1eae2c22123f-console-config\") pod \"console-868f6794d6-kdp9r\" (UID: \"c186129b-d8c2-40ef-93c9-1eae2c22123f\") " pod="openshift-console/console-868f6794d6-kdp9r"
Apr 21 10:06:45.111112 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:45.111088 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c186129b-d8c2-40ef-93c9-1eae2c22123f-oauth-serving-cert\") pod \"console-868f6794d6-kdp9r\" (UID: \"c186129b-d8c2-40ef-93c9-1eae2c22123f\") " pod="openshift-console/console-868f6794d6-kdp9r"
Apr 21 10:06:45.111112 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:45.111105 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dctz2\" (UniqueName: \"kubernetes.io/projected/c186129b-d8c2-40ef-93c9-1eae2c22123f-kube-api-access-dctz2\") pod \"console-868f6794d6-kdp9r\" (UID: \"c186129b-d8c2-40ef-93c9-1eae2c22123f\") " pod="openshift-console/console-868f6794d6-kdp9r"
Apr 21 10:06:45.111349 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:45.111131 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c186129b-d8c2-40ef-93c9-1eae2c22123f-console-oauth-config\") pod \"console-868f6794d6-kdp9r\" (UID: \"c186129b-d8c2-40ef-93c9-1eae2c22123f\") " pod="openshift-console/console-868f6794d6-kdp9r"
Apr 21 10:06:45.111881 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:45.111855 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c186129b-d8c2-40ef-93c9-1eae2c22123f-service-ca\") pod \"console-868f6794d6-kdp9r\" (UID: \"c186129b-d8c2-40ef-93c9-1eae2c22123f\") " pod="openshift-console/console-868f6794d6-kdp9r"
Apr 21 10:06:45.111990 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:45.111966 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c186129b-d8c2-40ef-93c9-1eae2c22123f-console-config\") pod \"console-868f6794d6-kdp9r\" (UID: \"c186129b-d8c2-40ef-93c9-1eae2c22123f\") " pod="openshift-console/console-868f6794d6-kdp9r"
Apr 21 10:06:45.112031 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:45.111966 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c186129b-d8c2-40ef-93c9-1eae2c22123f-oauth-serving-cert\") pod \"console-868f6794d6-kdp9r\" (UID: \"c186129b-d8c2-40ef-93c9-1eae2c22123f\") " pod="openshift-console/console-868f6794d6-kdp9r"
Apr 21 10:06:45.113713 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:45.113686 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c186129b-d8c2-40ef-93c9-1eae2c22123f-console-serving-cert\") pod \"console-868f6794d6-kdp9r\" (UID: \"c186129b-d8c2-40ef-93c9-1eae2c22123f\") " pod="openshift-console/console-868f6794d6-kdp9r"
Apr 21 10:06:45.113799 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:45.113755 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c186129b-d8c2-40ef-93c9-1eae2c22123f-console-oauth-config\") pod \"console-868f6794d6-kdp9r\" (UID: \"c186129b-d8c2-40ef-93c9-1eae2c22123f\") " pod="openshift-console/console-868f6794d6-kdp9r"
Apr 21 10:06:45.119284 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:45.119252 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dctz2\" (UniqueName: \"kubernetes.io/projected/c186129b-d8c2-40ef-93c9-1eae2c22123f-kube-api-access-dctz2\") pod \"console-868f6794d6-kdp9r\" (UID: \"c186129b-d8c2-40ef-93c9-1eae2c22123f\") " pod="openshift-console/console-868f6794d6-kdp9r"
Apr 21 10:06:45.215421 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:45.215395 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-868f6794d6-kdp9r"
Apr 21 10:06:45.336066 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:45.336039 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-868f6794d6-kdp9r"]
Apr 21 10:06:45.338898 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:06:45.338869 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc186129b_d8c2_40ef_93c9_1eae2c22123f.slice/crio-8ea2d0d4b5cc6ad67211fb23ac09cdb3c4400970b1071907c428fec958927fe9 WatchSource:0}: Error finding container 8ea2d0d4b5cc6ad67211fb23ac09cdb3c4400970b1071907c428fec958927fe9: Status 404 returned error can't find the container with id 8ea2d0d4b5cc6ad67211fb23ac09cdb3c4400970b1071907c428fec958927fe9
Apr 21 10:06:46.241105 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:46.241072 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-868f6794d6-kdp9r" event={"ID":"c186129b-d8c2-40ef-93c9-1eae2c22123f","Type":"ContainerStarted","Data":"8ea2d0d4b5cc6ad67211fb23ac09cdb3c4400970b1071907c428fec958927fe9"}
Apr 21 10:06:49.250139 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:49.250099 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-868f6794d6-kdp9r" event={"ID":"c186129b-d8c2-40ef-93c9-1eae2c22123f","Type":"ContainerStarted","Data":"57e900a07b300895474909068a2e444ee0cd9a317d1402245f7c1c013f0ce7d1"}
Apr 21 10:06:49.266827 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:49.266771 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-868f6794d6-kdp9r" podStartSLOduration=2.342842026 podStartE2EDuration="5.266756115s" podCreationTimestamp="2026-04-21 10:06:44 +0000 UTC" firstStartedPulling="2026-04-21 10:06:45.340691801 +0000 UTC m=+183.185887299" lastFinishedPulling="2026-04-21 10:06:48.264605903 +0000 UTC m=+186.109801388" observedRunningTime="2026-04-21 10:06:49.26611071 +0000 UTC m=+187.111306216" watchObservedRunningTime="2026-04-21 10:06:49.266756115 +0000 UTC m=+187.111951623"
Apr 21 10:06:55.215808 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:55.215769 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-868f6794d6-kdp9r"
Apr 21 10:06:55.215808 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:55.215819 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-868f6794d6-kdp9r"
Apr 21 10:06:55.220767 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:55.220742 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-868f6794d6-kdp9r"
Apr 21 10:06:55.267770 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:55.267747 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-868f6794d6-kdp9r"
Apr 21 10:06:58.468617 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:58.468586 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-gxg5w"]
Apr 21 10:06:58.471024 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:58.471008 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-gxg5w"
Apr 21 10:06:58.473908 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:58.473884 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 21 10:06:58.474023 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:58.473933 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 21 10:06:58.474023 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:58.473963 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-kftw8\""
Apr 21 10:06:58.474023 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:58.473963 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 21 10:06:58.474871 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:58.474854 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 21 10:06:58.474987 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:58.474971 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 21 10:06:58.482280 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:58.482259 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-gxg5w"]
Apr 21 10:06:58.503263 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:58.503236 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fff653d0-ff47-486a-af54-9c141f939ade-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-gxg5w\" (UID: \"fff653d0-ff47-486a-af54-9c141f939ade\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gxg5w"
Apr 21 10:06:58.503349 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:58.503270 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz4rt\" (UniqueName: \"kubernetes.io/projected/fff653d0-ff47-486a-af54-9c141f939ade-kube-api-access-qz4rt\") pod \"prometheus-operator-5676c8c784-gxg5w\" (UID: \"fff653d0-ff47-486a-af54-9c141f939ade\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gxg5w"
Apr 21 10:06:58.503421 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:58.503362 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/fff653d0-ff47-486a-af54-9c141f939ade-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gxg5w\" (UID: \"fff653d0-ff47-486a-af54-9c141f939ade\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gxg5w"
Apr 21 10:06:58.503421 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:58.503408 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fff653d0-ff47-486a-af54-9c141f939ade-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-gxg5w\" (UID: \"fff653d0-ff47-486a-af54-9c141f939ade\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gxg5w"
Apr 21 10:06:58.607931 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:58.607905 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fff653d0-ff47-486a-af54-9c141f939ade-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-gxg5w\" (UID: \"fff653d0-ff47-486a-af54-9c141f939ade\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gxg5w"
Apr 21 10:06:58.608011 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:58.607979 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qz4rt\" (UniqueName: \"kubernetes.io/projected/fff653d0-ff47-486a-af54-9c141f939ade-kube-api-access-qz4rt\") pod \"prometheus-operator-5676c8c784-gxg5w\" (UID: \"fff653d0-ff47-486a-af54-9c141f939ade\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gxg5w"
Apr 21 10:06:58.608087 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:58.608074 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/fff653d0-ff47-486a-af54-9c141f939ade-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gxg5w\" (UID: \"fff653d0-ff47-486a-af54-9c141f939ade\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gxg5w"
Apr 21 10:06:58.608123 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:58.608113 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fff653d0-ff47-486a-af54-9c141f939ade-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-gxg5w\" (UID: \"fff653d0-ff47-486a-af54-9c141f939ade\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gxg5w"
Apr 21 10:06:58.608640 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:06:58.608613 2567 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 21 10:06:58.608744 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:06:58.608693 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fff653d0-ff47-486a-af54-9c141f939ade-prometheus-operator-tls podName:fff653d0-ff47-486a-af54-9c141f939ade nodeName:}" failed. No retries permitted until 2026-04-21 10:06:59.108673805 +0000 UTC m=+196.953869288 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/fff653d0-ff47-486a-af54-9c141f939ade-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-gxg5w" (UID: "fff653d0-ff47-486a-af54-9c141f939ade") : secret "prometheus-operator-tls" not found
Apr 21 10:06:58.609187 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:58.609146 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fff653d0-ff47-486a-af54-9c141f939ade-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-gxg5w\" (UID: \"fff653d0-ff47-486a-af54-9c141f939ade\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gxg5w"
Apr 21 10:06:58.610648 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:58.610628 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fff653d0-ff47-486a-af54-9c141f939ade-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-gxg5w\" (UID: \"fff653d0-ff47-486a-af54-9c141f939ade\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gxg5w"
Apr 21 10:06:58.617953 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:58.617928 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz4rt\" (UniqueName: \"kubernetes.io/projected/fff653d0-ff47-486a-af54-9c141f939ade-kube-api-access-qz4rt\") pod \"prometheus-operator-5676c8c784-gxg5w\" (UID: \"fff653d0-ff47-486a-af54-9c141f939ade\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gxg5w"
Apr 21 10:06:59.112403 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:59.112319 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/fff653d0-ff47-486a-af54-9c141f939ade-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gxg5w\" (UID: \"fff653d0-ff47-486a-af54-9c141f939ade\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gxg5w"
Apr 21 10:06:59.114761 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:59.114735 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/fff653d0-ff47-486a-af54-9c141f939ade-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gxg5w\" (UID: \"fff653d0-ff47-486a-af54-9c141f939ade\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gxg5w"
Apr 21 10:06:59.379700 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:59.379633 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-gxg5w"
Apr 21 10:06:59.501407 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:06:59.501379 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-gxg5w"]
Apr 21 10:06:59.504893 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:06:59.504866 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfff653d0_ff47_486a_af54_9c141f939ade.slice/crio-7e14848f32fa804d7c3b23ac80f7e50c9f2d712a6c5510de42038538e58ba49a WatchSource:0}: Error finding container 7e14848f32fa804d7c3b23ac80f7e50c9f2d712a6c5510de42038538e58ba49a: Status 404 returned error can't find the container with id 7e14848f32fa804d7c3b23ac80f7e50c9f2d712a6c5510de42038538e58ba49a
Apr 21 10:07:00.277457 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:00.277414 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-gxg5w" event={"ID":"fff653d0-ff47-486a-af54-9c141f939ade","Type":"ContainerStarted","Data":"7e14848f32fa804d7c3b23ac80f7e50c9f2d712a6c5510de42038538e58ba49a"}
Apr 21 10:07:00.730897 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:00.730851 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-8477bf8587-zfnhn"]
Apr 21 10:07:00.731361 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:07:00.731055 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-8477bf8587-zfnhn" podUID="dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55"
Apr 21 10:07:01.281489 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:01.281458 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8477bf8587-zfnhn"
Apr 21 10:07:01.281489 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:01.281462 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-gxg5w" event={"ID":"fff653d0-ff47-486a-af54-9c141f939ade","Type":"ContainerStarted","Data":"da5a7ea2b4319a59fc308353721cd5cb97a3392d69c19deb54ffa977bdf48626"}
Apr 21 10:07:01.281693 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:01.281498 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-gxg5w" event={"ID":"fff653d0-ff47-486a-af54-9c141f939ade","Type":"ContainerStarted","Data":"1341ea15594ffdca4979534a9cb41c7ad9f86f6aea143531c1ebe12cc26ff0f5"}
Apr 21 10:07:01.285760 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:01.285736 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8477bf8587-zfnhn"
Apr 21 10:07:01.307674 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:01.307635 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-gxg5w" podStartSLOduration=2.310333777 podStartE2EDuration="3.307621795s" podCreationTimestamp="2026-04-21 10:06:58 +0000 UTC" firstStartedPulling="2026-04-21 10:06:59.50677103 +0000 UTC m=+197.351966518" lastFinishedPulling="2026-04-21 10:07:00.504059048 +0000 UTC m=+198.349254536" observedRunningTime="2026-04-21 10:07:01.305849509 +0000 UTC m=+199.151045014" watchObservedRunningTime="2026-04-21 10:07:01.307621795 +0000 UTC m=+199.152817301"
Apr 21 10:07:01.328847 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:01.328823 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-trusted-ca\") pod \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") "
Apr 21 10:07:01.328962 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:01.328864 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-installation-pull-secrets\") pod \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") "
Apr 21 10:07:01.328962 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:01.328906 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-ca-trust-extracted\") pod \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") "
Apr 21 10:07:01.328962 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:01.328941 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-image-registry-private-configuration\") pod \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") "
Apr 21 10:07:01.329126 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:01.328980 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-bound-sa-token\") pod \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") "
Apr 21 10:07:01.329126 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:01.329009 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trw66\" (UniqueName: \"kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-kube-api-access-trw66\") pod \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") "
Apr 21 10:07:01.329126 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:01.329067 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-certificates\") pod \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\" (UID: \"dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55\") "
Apr 21 10:07:01.329288 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:01.329150 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55" (UID: "dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 10:07:01.329288 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:01.329237 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55" (UID: "dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:07:01.329426 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:01.329406 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55" (UID: "dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:07:01.329539 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:01.329524 2567 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-certificates\") on node \"ip-10-0-140-231.ec2.internal\" DevicePath \"\""
Apr 21 10:07:01.329585 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:01.329544 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-trusted-ca\") on node \"ip-10-0-140-231.ec2.internal\" DevicePath \"\""
Apr 21 10:07:01.329585 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:01.329556 2567 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-ca-trust-extracted\") on node \"ip-10-0-140-231.ec2.internal\" DevicePath \"\""
Apr 21 10:07:01.331489 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:01.331462 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-kube-api-access-trw66" (OuterVolumeSpecName: "kube-api-access-trw66") pod "dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55" (UID: "dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55"). InnerVolumeSpecName "kube-api-access-trw66". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 10:07:01.331489 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:01.331478 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55" (UID: "dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:07:01.331637 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:01.331506 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55" (UID: "dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 10:07:01.331637 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:01.331583 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55" (UID: "dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:07:01.430481 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:01.430460 2567 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-image-registry-private-configuration\") on node \"ip-10-0-140-231.ec2.internal\" DevicePath \"\""
Apr 21 10:07:01.430481 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:01.430480 2567 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-bound-sa-token\") on node \"ip-10-0-140-231.ec2.internal\" DevicePath \"\""
Apr 21 10:07:01.430605 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:01.430490 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-trw66\" (UniqueName: \"kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-kube-api-access-trw66\") on node \"ip-10-0-140-231.ec2.internal\" DevicePath \"\""
Apr 21 10:07:01.430605 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:01.430499 2567 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-installation-pull-secrets\") on node \"ip-10-0-140-231.ec2.internal\" DevicePath \"\""
Apr 21 10:07:02.284056 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.284023 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8477bf8587-zfnhn"
Apr 21 10:07:02.318480 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.318451 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-8477bf8587-zfnhn"]
Apr 21 10:07:02.322160 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.322137 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-8477bf8587-zfnhn"]
Apr 21 10:07:02.435822 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.435798 2567 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55-registry-tls\") on node \"ip-10-0-140-231.ec2.internal\" DevicePath \"\""
Apr 21 10:07:02.713277 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.713246 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55" path="/var/lib/kubelet/pods/dd824aa1-2cfa-4e5a-8a4e-c56b004dcd55/volumes"
Apr 21 10:07:02.850886 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.850859 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-n842d"]
Apr 21 10:07:02.855233 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.855214 2567 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n842d" Apr 21 10:07:02.862378 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.862357 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 21 10:07:02.862517 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.862360 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-h4zm7\"" Apr 21 10:07:02.862644 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.862623 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 21 10:07:02.875301 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.875277 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-n842d"] Apr 21 10:07:02.910563 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.910543 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-4chhb"] Apr 21 10:07:02.913516 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.913493 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-4chhb" Apr 21 10:07:02.917096 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.917077 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 21 10:07:02.917362 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.917344 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-ph7ls\"" Apr 21 10:07:02.917438 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.917401 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 21 10:07:02.917961 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.917946 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 21 10:07:02.935076 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.935053 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-4chhb"] Apr 21 10:07:02.939385 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.939362 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhbkx\" (UniqueName: \"kubernetes.io/projected/852ca942-92ad-4073-8c15-89263d3beac6-kube-api-access-xhbkx\") pod \"kube-state-metrics-69db897b98-4chhb\" (UID: \"852ca942-92ad-4073-8c15-89263d3beac6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4chhb" Apr 21 10:07:02.939458 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.939432 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/852ca942-92ad-4073-8c15-89263d3beac6-volume-directive-shadow\") pod 
\"kube-state-metrics-69db897b98-4chhb\" (UID: \"852ca942-92ad-4073-8c15-89263d3beac6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4chhb" Apr 21 10:07:02.939504 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.939461 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/452d9e5b-6dd7-474f-94d3-175d4f3f8fcf-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-n842d\" (UID: \"452d9e5b-6dd7-474f-94d3-175d4f3f8fcf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n842d" Apr 21 10:07:02.939581 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.939504 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/852ca942-92ad-4073-8c15-89263d3beac6-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-4chhb\" (UID: \"852ca942-92ad-4073-8c15-89263d3beac6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4chhb" Apr 21 10:07:02.939581 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.939553 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/852ca942-92ad-4073-8c15-89263d3beac6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-4chhb\" (UID: \"852ca942-92ad-4073-8c15-89263d3beac6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4chhb" Apr 21 10:07:02.939652 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.939615 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/452d9e5b-6dd7-474f-94d3-175d4f3f8fcf-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-n842d\" (UID: 
\"452d9e5b-6dd7-474f-94d3-175d4f3f8fcf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n842d" Apr 21 10:07:02.939690 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.939652 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/852ca942-92ad-4073-8c15-89263d3beac6-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-4chhb\" (UID: \"852ca942-92ad-4073-8c15-89263d3beac6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4chhb" Apr 21 10:07:02.939724 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.939684 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/452d9e5b-6dd7-474f-94d3-175d4f3f8fcf-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-n842d\" (UID: \"452d9e5b-6dd7-474f-94d3-175d4f3f8fcf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n842d" Apr 21 10:07:02.939765 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.939722 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn4q2\" (UniqueName: \"kubernetes.io/projected/452d9e5b-6dd7-474f-94d3-175d4f3f8fcf-kube-api-access-gn4q2\") pod \"openshift-state-metrics-9d44df66c-n842d\" (UID: \"452d9e5b-6dd7-474f-94d3-175d4f3f8fcf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n842d" Apr 21 10:07:02.939765 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.939751 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/852ca942-92ad-4073-8c15-89263d3beac6-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-4chhb\" (UID: 
\"852ca942-92ad-4073-8c15-89263d3beac6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4chhb" Apr 21 10:07:02.955928 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.955908 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-pjrhj"] Apr 21 10:07:02.958850 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.958835 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 10:07:02.961026 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.961010 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 10:07:02.961285 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.961270 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-wtmfz\"" Apr 21 10:07:02.961557 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.961534 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 10:07:02.961797 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:02.961777 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 10:07:03.040555 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.040526 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/05e986e2-925b-4f4c-a251-7c43d0600377-sys\") pod \"node-exporter-pjrhj\" (UID: \"05e986e2-925b-4f4c-a251-7c43d0600377\") " pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 10:07:03.040648 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.040562 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" 
(UniqueName: \"kubernetes.io/secret/05e986e2-925b-4f4c-a251-7c43d0600377-node-exporter-tls\") pod \"node-exporter-pjrhj\" (UID: \"05e986e2-925b-4f4c-a251-7c43d0600377\") " pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 10:07:03.040648 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.040593 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/852ca942-92ad-4073-8c15-89263d3beac6-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-4chhb\" (UID: \"852ca942-92ad-4073-8c15-89263d3beac6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4chhb" Apr 21 10:07:03.040648 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.040610 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/452d9e5b-6dd7-474f-94d3-175d4f3f8fcf-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-n842d\" (UID: \"452d9e5b-6dd7-474f-94d3-175d4f3f8fcf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n842d" Apr 21 10:07:03.040648 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.040628 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/05e986e2-925b-4f4c-a251-7c43d0600377-root\") pod \"node-exporter-pjrhj\" (UID: \"05e986e2-925b-4f4c-a251-7c43d0600377\") " pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 10:07:03.040770 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.040715 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww7gb\" (UniqueName: \"kubernetes.io/projected/05e986e2-925b-4f4c-a251-7c43d0600377-kube-api-access-ww7gb\") pod \"node-exporter-pjrhj\" (UID: \"05e986e2-925b-4f4c-a251-7c43d0600377\") " pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 10:07:03.040770 
ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.040731 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/05e986e2-925b-4f4c-a251-7c43d0600377-node-exporter-accelerators-collector-config\") pod \"node-exporter-pjrhj\" (UID: \"05e986e2-925b-4f4c-a251-7c43d0600377\") " pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 10:07:03.040770 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:07:03.040741 2567 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 21 10:07:03.040770 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.040757 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/852ca942-92ad-4073-8c15-89263d3beac6-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-4chhb\" (UID: \"852ca942-92ad-4073-8c15-89263d3beac6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4chhb" Apr 21 10:07:03.040902 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.040778 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/05e986e2-925b-4f4c-a251-7c43d0600377-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pjrhj\" (UID: \"05e986e2-925b-4f4c-a251-7c43d0600377\") " pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 10:07:03.040902 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:07:03.040792 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/452d9e5b-6dd7-474f-94d3-175d4f3f8fcf-openshift-state-metrics-tls podName:452d9e5b-6dd7-474f-94d3-175d4f3f8fcf nodeName:}" failed. No retries permitted until 2026-04-21 10:07:03.540773511 +0000 UTC m=+201.385968994 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/452d9e5b-6dd7-474f-94d3-175d4f3f8fcf-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-n842d" (UID: "452d9e5b-6dd7-474f-94d3-175d4f3f8fcf") : secret "openshift-state-metrics-tls" not found Apr 21 10:07:03.040902 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.040842 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/852ca942-92ad-4073-8c15-89263d3beac6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-4chhb\" (UID: \"852ca942-92ad-4073-8c15-89263d3beac6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4chhb" Apr 21 10:07:03.040902 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.040870 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/05e986e2-925b-4f4c-a251-7c43d0600377-node-exporter-textfile\") pod \"node-exporter-pjrhj\" (UID: \"05e986e2-925b-4f4c-a251-7c43d0600377\") " pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 10:07:03.040902 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:07:03.040880 2567 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 21 10:07:03.040902 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.040883 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/852ca942-92ad-4073-8c15-89263d3beac6-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-4chhb\" (UID: \"852ca942-92ad-4073-8c15-89263d3beac6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4chhb" Apr 21 10:07:03.041079 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.040910 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/452d9e5b-6dd7-474f-94d3-175d4f3f8fcf-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-n842d\" (UID: \"452d9e5b-6dd7-474f-94d3-175d4f3f8fcf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n842d" Apr 21 10:07:03.041079 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:07:03.040934 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/852ca942-92ad-4073-8c15-89263d3beac6-kube-state-metrics-tls podName:852ca942-92ad-4073-8c15-89263d3beac6 nodeName:}" failed. No retries permitted until 2026-04-21 10:07:03.54092235 +0000 UTC m=+201.386117834 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/852ca942-92ad-4073-8c15-89263d3beac6-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-4chhb" (UID: "852ca942-92ad-4073-8c15-89263d3beac6") : secret "kube-state-metrics-tls" not found Apr 21 10:07:03.041079 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.040962 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/852ca942-92ad-4073-8c15-89263d3beac6-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-4chhb\" (UID: \"852ca942-92ad-4073-8c15-89263d3beac6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4chhb" Apr 21 10:07:03.041079 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.040998 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/05e986e2-925b-4f4c-a251-7c43d0600377-node-exporter-wtmp\") pod \"node-exporter-pjrhj\" (UID: \"05e986e2-925b-4f4c-a251-7c43d0600377\") " pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 10:07:03.041079 ip-10-0-140-231 kubenswrapper[2567]: I0421 
10:07:03.041026 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/452d9e5b-6dd7-474f-94d3-175d4f3f8fcf-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-n842d\" (UID: \"452d9e5b-6dd7-474f-94d3-175d4f3f8fcf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n842d" Apr 21 10:07:03.041288 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.041138 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05e986e2-925b-4f4c-a251-7c43d0600377-metrics-client-ca\") pod \"node-exporter-pjrhj\" (UID: \"05e986e2-925b-4f4c-a251-7c43d0600377\") " pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 10:07:03.041288 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.041186 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gn4q2\" (UniqueName: \"kubernetes.io/projected/452d9e5b-6dd7-474f-94d3-175d4f3f8fcf-kube-api-access-gn4q2\") pod \"openshift-state-metrics-9d44df66c-n842d\" (UID: \"452d9e5b-6dd7-474f-94d3-175d4f3f8fcf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n842d" Apr 21 10:07:03.041288 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.041234 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/852ca942-92ad-4073-8c15-89263d3beac6-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-4chhb\" (UID: \"852ca942-92ad-4073-8c15-89263d3beac6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4chhb" Apr 21 10:07:03.041288 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.041280 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xhbkx\" (UniqueName: \"kubernetes.io/projected/852ca942-92ad-4073-8c15-89263d3beac6-kube-api-access-xhbkx\") pod \"kube-state-metrics-69db897b98-4chhb\" (UID: \"852ca942-92ad-4073-8c15-89263d3beac6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4chhb" Apr 21 10:07:03.041604 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.041584 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/452d9e5b-6dd7-474f-94d3-175d4f3f8fcf-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-n842d\" (UID: \"452d9e5b-6dd7-474f-94d3-175d4f3f8fcf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n842d" Apr 21 10:07:03.041670 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.041618 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/852ca942-92ad-4073-8c15-89263d3beac6-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-4chhb\" (UID: \"852ca942-92ad-4073-8c15-89263d3beac6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4chhb" Apr 21 10:07:03.041828 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.041812 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/852ca942-92ad-4073-8c15-89263d3beac6-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-4chhb\" (UID: \"852ca942-92ad-4073-8c15-89263d3beac6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4chhb" Apr 21 10:07:03.043700 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.043679 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/452d9e5b-6dd7-474f-94d3-175d4f3f8fcf-openshift-state-metrics-kube-rbac-proxy-config\") pod 
\"openshift-state-metrics-9d44df66c-n842d\" (UID: \"452d9e5b-6dd7-474f-94d3-175d4f3f8fcf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n842d" Apr 21 10:07:03.043777 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.043733 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/852ca942-92ad-4073-8c15-89263d3beac6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-4chhb\" (UID: \"852ca942-92ad-4073-8c15-89263d3beac6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4chhb" Apr 21 10:07:03.068556 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.068526 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhbkx\" (UniqueName: \"kubernetes.io/projected/852ca942-92ad-4073-8c15-89263d3beac6-kube-api-access-xhbkx\") pod \"kube-state-metrics-69db897b98-4chhb\" (UID: \"852ca942-92ad-4073-8c15-89263d3beac6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4chhb" Apr 21 10:07:03.073715 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.073692 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn4q2\" (UniqueName: \"kubernetes.io/projected/452d9e5b-6dd7-474f-94d3-175d4f3f8fcf-kube-api-access-gn4q2\") pod \"openshift-state-metrics-9d44df66c-n842d\" (UID: \"452d9e5b-6dd7-474f-94d3-175d4f3f8fcf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n842d" Apr 21 10:07:03.141710 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.141687 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/05e986e2-925b-4f4c-a251-7c43d0600377-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pjrhj\" (UID: \"05e986e2-925b-4f4c-a251-7c43d0600377\") " pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 10:07:03.141793 
ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.141724 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/05e986e2-925b-4f4c-a251-7c43d0600377-node-exporter-textfile\") pod \"node-exporter-pjrhj\" (UID: \"05e986e2-925b-4f4c-a251-7c43d0600377\") " pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 10:07:03.141793 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.141752 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/05e986e2-925b-4f4c-a251-7c43d0600377-node-exporter-wtmp\") pod \"node-exporter-pjrhj\" (UID: \"05e986e2-925b-4f4c-a251-7c43d0600377\") " pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 10:07:03.141793 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.141789 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05e986e2-925b-4f4c-a251-7c43d0600377-metrics-client-ca\") pod \"node-exporter-pjrhj\" (UID: \"05e986e2-925b-4f4c-a251-7c43d0600377\") " pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 10:07:03.141930 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.141831 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/05e986e2-925b-4f4c-a251-7c43d0600377-sys\") pod \"node-exporter-pjrhj\" (UID: \"05e986e2-925b-4f4c-a251-7c43d0600377\") " pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 10:07:03.141930 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.141849 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/05e986e2-925b-4f4c-a251-7c43d0600377-node-exporter-tls\") pod \"node-exporter-pjrhj\" (UID: \"05e986e2-925b-4f4c-a251-7c43d0600377\") " pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 
10:07:03.141930 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.141900 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/05e986e2-925b-4f4c-a251-7c43d0600377-root\") pod \"node-exporter-pjrhj\" (UID: \"05e986e2-925b-4f4c-a251-7c43d0600377\") " pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 10:07:03.141930 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.141911 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/05e986e2-925b-4f4c-a251-7c43d0600377-sys\") pod \"node-exporter-pjrhj\" (UID: \"05e986e2-925b-4f4c-a251-7c43d0600377\") " pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 10:07:03.141930 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.141923 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ww7gb\" (UniqueName: \"kubernetes.io/projected/05e986e2-925b-4f4c-a251-7c43d0600377-kube-api-access-ww7gb\") pod \"node-exporter-pjrhj\" (UID: \"05e986e2-925b-4f4c-a251-7c43d0600377\") " pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 10:07:03.142193 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.141934 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/05e986e2-925b-4f4c-a251-7c43d0600377-node-exporter-wtmp\") pod \"node-exporter-pjrhj\" (UID: \"05e986e2-925b-4f4c-a251-7c43d0600377\") " pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 10:07:03.142193 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.141982 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/05e986e2-925b-4f4c-a251-7c43d0600377-root\") pod \"node-exporter-pjrhj\" (UID: \"05e986e2-925b-4f4c-a251-7c43d0600377\") " pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 10:07:03.142193 ip-10-0-140-231 kubenswrapper[2567]: I0421 
10:07:03.141983 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/05e986e2-925b-4f4c-a251-7c43d0600377-node-exporter-accelerators-collector-config\") pod \"node-exporter-pjrhj\" (UID: \"05e986e2-925b-4f4c-a251-7c43d0600377\") " pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 10:07:03.142193 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.142019 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/05e986e2-925b-4f4c-a251-7c43d0600377-node-exporter-textfile\") pod \"node-exporter-pjrhj\" (UID: \"05e986e2-925b-4f4c-a251-7c43d0600377\") " pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 10:07:03.142193 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:07:03.142023 2567 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 21 10:07:03.142193 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:07:03.142091 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05e986e2-925b-4f4c-a251-7c43d0600377-node-exporter-tls podName:05e986e2-925b-4f4c-a251-7c43d0600377 nodeName:}" failed. No retries permitted until 2026-04-21 10:07:03.642080769 +0000 UTC m=+201.487276253 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/05e986e2-925b-4f4c-a251-7c43d0600377-node-exporter-tls") pod "node-exporter-pjrhj" (UID: "05e986e2-925b-4f4c-a251-7c43d0600377") : secret "node-exporter-tls" not found Apr 21 10:07:03.142494 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.142396 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/05e986e2-925b-4f4c-a251-7c43d0600377-node-exporter-accelerators-collector-config\") pod \"node-exporter-pjrhj\" (UID: \"05e986e2-925b-4f4c-a251-7c43d0600377\") " pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 10:07:03.142494 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.142455 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05e986e2-925b-4f4c-a251-7c43d0600377-metrics-client-ca\") pod \"node-exporter-pjrhj\" (UID: \"05e986e2-925b-4f4c-a251-7c43d0600377\") " pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 10:07:03.143696 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.143677 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/05e986e2-925b-4f4c-a251-7c43d0600377-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pjrhj\" (UID: \"05e986e2-925b-4f4c-a251-7c43d0600377\") " pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 10:07:03.150584 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.150564 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww7gb\" (UniqueName: \"kubernetes.io/projected/05e986e2-925b-4f4c-a251-7c43d0600377-kube-api-access-ww7gb\") pod \"node-exporter-pjrhj\" (UID: \"05e986e2-925b-4f4c-a251-7c43d0600377\") " pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 10:07:03.544518 ip-10-0-140-231 
kubenswrapper[2567]: I0421 10:07:03.544495 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/452d9e5b-6dd7-474f-94d3-175d4f3f8fcf-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-n842d\" (UID: \"452d9e5b-6dd7-474f-94d3-175d4f3f8fcf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n842d" Apr 21 10:07:03.544903 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.544543 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/852ca942-92ad-4073-8c15-89263d3beac6-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-4chhb\" (UID: \"852ca942-92ad-4073-8c15-89263d3beac6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4chhb" Apr 21 10:07:03.546680 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.546655 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/852ca942-92ad-4073-8c15-89263d3beac6-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-4chhb\" (UID: \"852ca942-92ad-4073-8c15-89263d3beac6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4chhb" Apr 21 10:07:03.546784 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.546754 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/452d9e5b-6dd7-474f-94d3-175d4f3f8fcf-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-n842d\" (UID: \"452d9e5b-6dd7-474f-94d3-175d4f3f8fcf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n842d" Apr 21 10:07:03.645197 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.645158 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/05e986e2-925b-4f4c-a251-7c43d0600377-node-exporter-tls\") pod \"node-exporter-pjrhj\" (UID: \"05e986e2-925b-4f4c-a251-7c43d0600377\") " pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 10:07:03.647014 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.646997 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/05e986e2-925b-4f4c-a251-7c43d0600377-node-exporter-tls\") pod \"node-exporter-pjrhj\" (UID: \"05e986e2-925b-4f4c-a251-7c43d0600377\") " pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 10:07:03.764534 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.764507 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n842d" Apr 21 10:07:03.822338 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.821468 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-4chhb" Apr 21 10:07:03.867938 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.867905 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-pjrhj" Apr 21 10:07:03.878599 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:07:03.878560 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05e986e2_925b_4f4c_a251_7c43d0600377.slice/crio-bc50a855e7a04f6872bebe9372c0e121e038e2eabc86c4358c449feb7ecef8b1 WatchSource:0}: Error finding container bc50a855e7a04f6872bebe9372c0e121e038e2eabc86c4358c449feb7ecef8b1: Status 404 returned error can't find the container with id bc50a855e7a04f6872bebe9372c0e121e038e2eabc86c4358c449feb7ecef8b1 Apr 21 10:07:03.891016 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.890996 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-n842d"] Apr 21 10:07:03.893241 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:07:03.893214 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod452d9e5b_6dd7_474f_94d3_175d4f3f8fcf.slice/crio-ee5d76f23bc69affe650969190a3ef161ff5151598d6d88d9d96b7c489b104be WatchSource:0}: Error finding container ee5d76f23bc69affe650969190a3ef161ff5151598d6d88d9d96b7c489b104be: Status 404 returned error can't find the container with id ee5d76f23bc69affe650969190a3ef161ff5151598d6d88d9d96b7c489b104be Apr 21 10:07:03.952304 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:03.952282 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-4chhb"] Apr 21 10:07:03.954690 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:07:03.954662 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod852ca942_92ad_4073_8c15_89263d3beac6.slice/crio-c0e5b49071ad7fd9d20b4885b14e942aa903d3865100cdfe4de1a0e8ee715f3b WatchSource:0}: Error finding container 
c0e5b49071ad7fd9d20b4885b14e942aa903d3865100cdfe4de1a0e8ee715f3b: Status 404 returned error can't find the container with id c0e5b49071ad7fd9d20b4885b14e942aa903d3865100cdfe4de1a0e8ee715f3b Apr 21 10:07:04.019609 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.019587 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 10:07:04.023201 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.023180 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.028496 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.028223 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 21 10:07:04.028496 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.028356 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 21 10:07:04.028746 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.028537 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 21 10:07:04.028746 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.028611 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 21 10:07:04.028746 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.028541 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-h7khb\"" Apr 21 10:07:04.028897 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.028859 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 21 10:07:04.029287 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.028980 2567 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 21 10:07:04.029287 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.029028 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 21 10:07:04.029287 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.029038 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 21 10:07:04.029645 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.029315 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 21 10:07:04.039665 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.039630 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 10:07:04.049442 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.049420 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/24b93fc8-5b72-4c3a-9ce5-31878a449724-config-out\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.049529 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.049447 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/24b93fc8-5b72-4c3a-9ce5-31878a449724-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.049529 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.049469 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/24b93fc8-5b72-4c3a-9ce5-31878a449724-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.049529 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.049498 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24b93fc8-5b72-4c3a-9ce5-31878a449724-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.049529 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.049527 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/24b93fc8-5b72-4c3a-9ce5-31878a449724-web-config\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.049742 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.049549 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/24b93fc8-5b72-4c3a-9ce5-31878a449724-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.049742 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.049566 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/24b93fc8-5b72-4c3a-9ce5-31878a449724-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: 
\"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.049742 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.049632 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/24b93fc8-5b72-4c3a-9ce5-31878a449724-tls-assets\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.049742 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.049658 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/24b93fc8-5b72-4c3a-9ce5-31878a449724-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.049742 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.049687 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/24b93fc8-5b72-4c3a-9ce5-31878a449724-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.049742 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.049708 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/24b93fc8-5b72-4c3a-9ce5-31878a449724-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.049742 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.049731 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/24b93fc8-5b72-4c3a-9ce5-31878a449724-config-volume\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.050123 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.049782 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82qpb\" (UniqueName: \"kubernetes.io/projected/24b93fc8-5b72-4c3a-9ce5-31878a449724-kube-api-access-82qpb\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.150708 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.150641 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/24b93fc8-5b72-4c3a-9ce5-31878a449724-config-out\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.150708 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.150669 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/24b93fc8-5b72-4c3a-9ce5-31878a449724-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.150708 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.150688 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/24b93fc8-5b72-4c3a-9ce5-31878a449724-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.150708 ip-10-0-140-231 
kubenswrapper[2567]: I0421 10:07:04.150705 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24b93fc8-5b72-4c3a-9ce5-31878a449724-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.150954 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.150731 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/24b93fc8-5b72-4c3a-9ce5-31878a449724-web-config\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.150954 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.150746 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/24b93fc8-5b72-4c3a-9ce5-31878a449724-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.150954 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.150772 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/24b93fc8-5b72-4c3a-9ce5-31878a449724-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.150954 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.150795 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/24b93fc8-5b72-4c3a-9ce5-31878a449724-tls-assets\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.150954 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.150811 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/24b93fc8-5b72-4c3a-9ce5-31878a449724-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.150954 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.150845 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/24b93fc8-5b72-4c3a-9ce5-31878a449724-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.151325 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.151266 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/24b93fc8-5b72-4c3a-9ce5-31878a449724-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.151325 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:07:04.151293 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/24b93fc8-5b72-4c3a-9ce5-31878a449724-alertmanager-trusted-ca-bundle podName:24b93fc8-5b72-4c3a-9ce5-31878a449724 nodeName:}" failed. No retries permitted until 2026-04-21 10:07:04.651268477 +0000 UTC m=+202.496463979 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/24b93fc8-5b72-4c3a-9ce5-31878a449724-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "24b93fc8-5b72-4c3a-9ce5-31878a449724") : configmap references non-existent config key: ca-bundle.crt Apr 21 10:07:04.152423 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.151656 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/24b93fc8-5b72-4c3a-9ce5-31878a449724-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.152423 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.151725 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/24b93fc8-5b72-4c3a-9ce5-31878a449724-config-volume\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.152423 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.151779 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82qpb\" (UniqueName: \"kubernetes.io/projected/24b93fc8-5b72-4c3a-9ce5-31878a449724-kube-api-access-82qpb\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.152423 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.151666 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/24b93fc8-5b72-4c3a-9ce5-31878a449724-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 
10:07:04.153565 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.153521 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/24b93fc8-5b72-4c3a-9ce5-31878a449724-config-out\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.153675 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.153661 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/24b93fc8-5b72-4c3a-9ce5-31878a449724-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.154204 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.154146 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/24b93fc8-5b72-4c3a-9ce5-31878a449724-web-config\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.154777 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.154752 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/24b93fc8-5b72-4c3a-9ce5-31878a449724-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.155013 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.154992 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/24b93fc8-5b72-4c3a-9ce5-31878a449724-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: 
\"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.155100 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.155050 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/24b93fc8-5b72-4c3a-9ce5-31878a449724-tls-assets\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.155100 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.155092 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/24b93fc8-5b72-4c3a-9ce5-31878a449724-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.155409 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.155386 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/24b93fc8-5b72-4c3a-9ce5-31878a449724-config-volume\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.155589 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.155567 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/24b93fc8-5b72-4c3a-9ce5-31878a449724-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.161188 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.161155 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-82qpb\" (UniqueName: \"kubernetes.io/projected/24b93fc8-5b72-4c3a-9ce5-31878a449724-kube-api-access-82qpb\") pod 
\"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.291164 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.291127 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pjrhj" event={"ID":"05e986e2-925b-4f4c-a251-7c43d0600377","Type":"ContainerStarted","Data":"bc50a855e7a04f6872bebe9372c0e121e038e2eabc86c4358c449feb7ecef8b1"} Apr 21 10:07:04.292591 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.292554 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-4chhb" event={"ID":"852ca942-92ad-4073-8c15-89263d3beac6","Type":"ContainerStarted","Data":"c0e5b49071ad7fd9d20b4885b14e942aa903d3865100cdfe4de1a0e8ee715f3b"} Apr 21 10:07:04.294294 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.294269 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n842d" event={"ID":"452d9e5b-6dd7-474f-94d3-175d4f3f8fcf","Type":"ContainerStarted","Data":"374de92d2cc07cd30f7648482034fe160a1371611862097a5d33c476f1eb276a"} Apr 21 10:07:04.294395 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.294301 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n842d" event={"ID":"452d9e5b-6dd7-474f-94d3-175d4f3f8fcf","Type":"ContainerStarted","Data":"d43db619353f2e82602f066fb7dfa35c2eedbd02c3a18164e188411b553638d3"} Apr 21 10:07:04.294395 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.294315 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n842d" event={"ID":"452d9e5b-6dd7-474f-94d3-175d4f3f8fcf","Type":"ContainerStarted","Data":"ee5d76f23bc69affe650969190a3ef161ff5151598d6d88d9d96b7c489b104be"} Apr 21 10:07:04.655451 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.655428 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24b93fc8-5b72-4c3a-9ce5-31878a449724-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.656324 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.656304 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24b93fc8-5b72-4c3a-9ce5-31878a449724-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"24b93fc8-5b72-4c3a-9ce5-31878a449724\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:04.797263 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.796676 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-868f6794d6-kdp9r"] Apr 21 10:07:04.940402 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:04.940358 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:07:05.298933 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:05.298856 2567 generic.go:358] "Generic (PLEG): container finished" podID="05e986e2-925b-4f4c-a251-7c43d0600377" containerID="3d83d05ac15f8831d6f6083662bb61205bf39ad8f4a65acfe50b012d407214d9" exitCode=0 Apr 21 10:07:05.298933 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:05.298902 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pjrhj" event={"ID":"05e986e2-925b-4f4c-a251-7c43d0600377","Type":"ContainerDied","Data":"3d83d05ac15f8831d6f6083662bb61205bf39ad8f4a65acfe50b012d407214d9"} Apr 21 10:07:05.625901 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:05.625794 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 10:07:05.630049 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:07:05.630012 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24b93fc8_5b72_4c3a_9ce5_31878a449724.slice/crio-9f0a22f4693e04404a2c482e873f0570173ce818ca44dff551aa42b0b7215c87 WatchSource:0}: Error finding container 9f0a22f4693e04404a2c482e873f0570173ce818ca44dff551aa42b0b7215c87: Status 404 returned error can't find the container with id 9f0a22f4693e04404a2c482e873f0570173ce818ca44dff551aa42b0b7215c87 Apr 21 10:07:06.307642 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:06.307606 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pjrhj" event={"ID":"05e986e2-925b-4f4c-a251-7c43d0600377","Type":"ContainerStarted","Data":"11b82bc4fe9ad5b8253794e6ab33f1aa77ee9a95c789eeb7ed8dfdee6c8943ea"} Apr 21 10:07:06.308097 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:06.307650 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pjrhj" 
event={"ID":"05e986e2-925b-4f4c-a251-7c43d0600377","Type":"ContainerStarted","Data":"06d29a979a2ff7867a0471d6556a6688e76ca8aae825d42fc527ae71ee6dba31"} Apr 21 10:07:06.309699 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:06.309672 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-4chhb" event={"ID":"852ca942-92ad-4073-8c15-89263d3beac6","Type":"ContainerStarted","Data":"0c0cde863508cca1a26b065e3272923de7559a639d5ac403f356ed7b383ee791"} Apr 21 10:07:06.309803 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:06.309706 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-4chhb" event={"ID":"852ca942-92ad-4073-8c15-89263d3beac6","Type":"ContainerStarted","Data":"dbfa10872434e0746e9ab0af8b4df56e8da007079064697ddfb676ee8a533fc2"} Apr 21 10:07:06.309803 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:06.309721 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-4chhb" event={"ID":"852ca942-92ad-4073-8c15-89263d3beac6","Type":"ContainerStarted","Data":"45f7ec277cdb2de6608ab5512e78037181c7b85c54a6da734963a9d8211e051d"} Apr 21 10:07:06.310780 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:06.310742 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"24b93fc8-5b72-4c3a-9ce5-31878a449724","Type":"ContainerStarted","Data":"9f0a22f4693e04404a2c482e873f0570173ce818ca44dff551aa42b0b7215c87"} Apr 21 10:07:06.312722 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:06.312700 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n842d" event={"ID":"452d9e5b-6dd7-474f-94d3-175d4f3f8fcf","Type":"ContainerStarted","Data":"bc726cdf84ad9dd705ede9afe5aa2399635c88defb3772c7fbaa363ef4b9d5d7"} Apr 21 10:07:06.325050 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:06.325008 2567 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-pjrhj" podStartSLOduration=3.646250806 podStartE2EDuration="4.324998387s" podCreationTimestamp="2026-04-21 10:07:02 +0000 UTC" firstStartedPulling="2026-04-21 10:07:03.880657607 +0000 UTC m=+201.725853091" lastFinishedPulling="2026-04-21 10:07:04.559405188 +0000 UTC m=+202.404600672" observedRunningTime="2026-04-21 10:07:06.323977676 +0000 UTC m=+204.169173183" watchObservedRunningTime="2026-04-21 10:07:06.324998387 +0000 UTC m=+204.170193893" Apr 21 10:07:06.343160 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:06.343116 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-4chhb" podStartSLOduration=2.812513931 podStartE2EDuration="4.343104092s" podCreationTimestamp="2026-04-21 10:07:02 +0000 UTC" firstStartedPulling="2026-04-21 10:07:03.957127374 +0000 UTC m=+201.802322858" lastFinishedPulling="2026-04-21 10:07:05.487717529 +0000 UTC m=+203.332913019" observedRunningTime="2026-04-21 10:07:06.342376273 +0000 UTC m=+204.187571780" watchObservedRunningTime="2026-04-21 10:07:06.343104092 +0000 UTC m=+204.188299576" Apr 21 10:07:06.359352 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:06.359311 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-n842d" podStartSLOduration=2.907536426 podStartE2EDuration="4.359301832s" podCreationTimestamp="2026-04-21 10:07:02 +0000 UTC" firstStartedPulling="2026-04-21 10:07:04.032372841 +0000 UTC m=+201.877568325" lastFinishedPulling="2026-04-21 10:07:05.484138232 +0000 UTC m=+203.329333731" observedRunningTime="2026-04-21 10:07:06.358341629 +0000 UTC m=+204.203537137" watchObservedRunningTime="2026-04-21 10:07:06.359301832 +0000 UTC m=+204.204497338" Apr 21 10:07:07.317209 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.317157 2567 generic.go:358] "Generic (PLEG): 
container finished" podID="24b93fc8-5b72-4c3a-9ce5-31878a449724" containerID="966936c4bcbfcc3ae016b5ee955581cb2053c2ba8dc9b71e43f14f6314e84b52" exitCode=0 Apr 21 10:07:07.317670 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.317206 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"24b93fc8-5b72-4c3a-9ce5-31878a449724","Type":"ContainerDied","Data":"966936c4bcbfcc3ae016b5ee955581cb2053c2ba8dc9b71e43f14f6314e84b52"} Apr 21 10:07:07.464096 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.464066 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-8577976cdf-42wrd"] Apr 21 10:07:07.467266 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.467242 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-8577976cdf-42wrd" Apr 21 10:07:07.471151 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.471132 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-1vn5jnnt0vaon\"" Apr 21 10:07:07.471447 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.471429 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-fcm5g\"" Apr 21 10:07:07.471535 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.471512 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 21 10:07:07.471881 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.471856 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 21 10:07:07.471985 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.471967 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" 
Apr 21 10:07:07.472227 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.472212 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 21 10:07:07.478575 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.478554 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-8577976cdf-42wrd"] Apr 21 10:07:07.581646 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.581596 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ae78a88-873e-47c5-96e2-71e175ad6366-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8577976cdf-42wrd\" (UID: \"7ae78a88-873e-47c5-96e2-71e175ad6366\") " pod="openshift-monitoring/metrics-server-8577976cdf-42wrd" Apr 21 10:07:07.581646 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.581635 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kffdr\" (UniqueName: \"kubernetes.io/projected/7ae78a88-873e-47c5-96e2-71e175ad6366-kube-api-access-kffdr\") pod \"metrics-server-8577976cdf-42wrd\" (UID: \"7ae78a88-873e-47c5-96e2-71e175ad6366\") " pod="openshift-monitoring/metrics-server-8577976cdf-42wrd" Apr 21 10:07:07.581774 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.581657 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae78a88-873e-47c5-96e2-71e175ad6366-client-ca-bundle\") pod \"metrics-server-8577976cdf-42wrd\" (UID: \"7ae78a88-873e-47c5-96e2-71e175ad6366\") " pod="openshift-monitoring/metrics-server-8577976cdf-42wrd" Apr 21 10:07:07.581774 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.581696 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" 
(UniqueName: \"kubernetes.io/empty-dir/7ae78a88-873e-47c5-96e2-71e175ad6366-audit-log\") pod \"metrics-server-8577976cdf-42wrd\" (UID: \"7ae78a88-873e-47c5-96e2-71e175ad6366\") " pod="openshift-monitoring/metrics-server-8577976cdf-42wrd" Apr 21 10:07:07.581774 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.581714 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/7ae78a88-873e-47c5-96e2-71e175ad6366-metrics-server-audit-profiles\") pod \"metrics-server-8577976cdf-42wrd\" (UID: \"7ae78a88-873e-47c5-96e2-71e175ad6366\") " pod="openshift-monitoring/metrics-server-8577976cdf-42wrd" Apr 21 10:07:07.581774 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.581739 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/7ae78a88-873e-47c5-96e2-71e175ad6366-secret-metrics-server-tls\") pod \"metrics-server-8577976cdf-42wrd\" (UID: \"7ae78a88-873e-47c5-96e2-71e175ad6366\") " pod="openshift-monitoring/metrics-server-8577976cdf-42wrd" Apr 21 10:07:07.581920 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.581786 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/7ae78a88-873e-47c5-96e2-71e175ad6366-secret-metrics-server-client-certs\") pod \"metrics-server-8577976cdf-42wrd\" (UID: \"7ae78a88-873e-47c5-96e2-71e175ad6366\") " pod="openshift-monitoring/metrics-server-8577976cdf-42wrd" Apr 21 10:07:07.683058 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.683030 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ae78a88-873e-47c5-96e2-71e175ad6366-configmap-kubelet-serving-ca-bundle\") pod 
\"metrics-server-8577976cdf-42wrd\" (UID: \"7ae78a88-873e-47c5-96e2-71e175ad6366\") " pod="openshift-monitoring/metrics-server-8577976cdf-42wrd" Apr 21 10:07:07.683204 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.683078 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kffdr\" (UniqueName: \"kubernetes.io/projected/7ae78a88-873e-47c5-96e2-71e175ad6366-kube-api-access-kffdr\") pod \"metrics-server-8577976cdf-42wrd\" (UID: \"7ae78a88-873e-47c5-96e2-71e175ad6366\") " pod="openshift-monitoring/metrics-server-8577976cdf-42wrd" Apr 21 10:07:07.683204 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.683113 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae78a88-873e-47c5-96e2-71e175ad6366-client-ca-bundle\") pod \"metrics-server-8577976cdf-42wrd\" (UID: \"7ae78a88-873e-47c5-96e2-71e175ad6366\") " pod="openshift-monitoring/metrics-server-8577976cdf-42wrd" Apr 21 10:07:07.683204 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.683139 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/7ae78a88-873e-47c5-96e2-71e175ad6366-audit-log\") pod \"metrics-server-8577976cdf-42wrd\" (UID: \"7ae78a88-873e-47c5-96e2-71e175ad6366\") " pod="openshift-monitoring/metrics-server-8577976cdf-42wrd" Apr 21 10:07:07.683204 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.683164 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/7ae78a88-873e-47c5-96e2-71e175ad6366-metrics-server-audit-profiles\") pod \"metrics-server-8577976cdf-42wrd\" (UID: \"7ae78a88-873e-47c5-96e2-71e175ad6366\") " pod="openshift-monitoring/metrics-server-8577976cdf-42wrd" Apr 21 10:07:07.683409 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.683215 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/7ae78a88-873e-47c5-96e2-71e175ad6366-secret-metrics-server-tls\") pod \"metrics-server-8577976cdf-42wrd\" (UID: \"7ae78a88-873e-47c5-96e2-71e175ad6366\") " pod="openshift-monitoring/metrics-server-8577976cdf-42wrd" Apr 21 10:07:07.683409 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.683257 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/7ae78a88-873e-47c5-96e2-71e175ad6366-secret-metrics-server-client-certs\") pod \"metrics-server-8577976cdf-42wrd\" (UID: \"7ae78a88-873e-47c5-96e2-71e175ad6366\") " pod="openshift-monitoring/metrics-server-8577976cdf-42wrd" Apr 21 10:07:07.683526 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.683507 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/7ae78a88-873e-47c5-96e2-71e175ad6366-audit-log\") pod \"metrics-server-8577976cdf-42wrd\" (UID: \"7ae78a88-873e-47c5-96e2-71e175ad6366\") " pod="openshift-monitoring/metrics-server-8577976cdf-42wrd" Apr 21 10:07:07.683830 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.683804 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ae78a88-873e-47c5-96e2-71e175ad6366-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8577976cdf-42wrd\" (UID: \"7ae78a88-873e-47c5-96e2-71e175ad6366\") " pod="openshift-monitoring/metrics-server-8577976cdf-42wrd" Apr 21 10:07:07.684057 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.684037 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/7ae78a88-873e-47c5-96e2-71e175ad6366-metrics-server-audit-profiles\") pod \"metrics-server-8577976cdf-42wrd\" 
(UID: \"7ae78a88-873e-47c5-96e2-71e175ad6366\") " pod="openshift-monitoring/metrics-server-8577976cdf-42wrd" Apr 21 10:07:07.685694 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.685672 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/7ae78a88-873e-47c5-96e2-71e175ad6366-secret-metrics-server-client-certs\") pod \"metrics-server-8577976cdf-42wrd\" (UID: \"7ae78a88-873e-47c5-96e2-71e175ad6366\") " pod="openshift-monitoring/metrics-server-8577976cdf-42wrd" Apr 21 10:07:07.685765 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.685744 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/7ae78a88-873e-47c5-96e2-71e175ad6366-secret-metrics-server-tls\") pod \"metrics-server-8577976cdf-42wrd\" (UID: \"7ae78a88-873e-47c5-96e2-71e175ad6366\") " pod="openshift-monitoring/metrics-server-8577976cdf-42wrd" Apr 21 10:07:07.685801 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.685788 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae78a88-873e-47c5-96e2-71e175ad6366-client-ca-bundle\") pod \"metrics-server-8577976cdf-42wrd\" (UID: \"7ae78a88-873e-47c5-96e2-71e175ad6366\") " pod="openshift-monitoring/metrics-server-8577976cdf-42wrd" Apr 21 10:07:07.698378 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.698360 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kffdr\" (UniqueName: \"kubernetes.io/projected/7ae78a88-873e-47c5-96e2-71e175ad6366-kube-api-access-kffdr\") pod \"metrics-server-8577976cdf-42wrd\" (UID: \"7ae78a88-873e-47c5-96e2-71e175ad6366\") " pod="openshift-monitoring/metrics-server-8577976cdf-42wrd" Apr 21 10:07:07.776074 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.776045 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-8577976cdf-42wrd" Apr 21 10:07:07.929865 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:07.929827 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-8577976cdf-42wrd"] Apr 21 10:07:07.933744 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:07:07.933719 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ae78a88_873e_47c5_96e2_71e175ad6366.slice/crio-3330d654b92e6bb59e5f4d11dd8b744745bae7e33a4c6bb6711294c36682abd2 WatchSource:0}: Error finding container 3330d654b92e6bb59e5f4d11dd8b744745bae7e33a4c6bb6711294c36682abd2: Status 404 returned error can't find the container with id 3330d654b92e6bb59e5f4d11dd8b744745bae7e33a4c6bb6711294c36682abd2 Apr 21 10:07:08.322737 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:08.322694 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-8577976cdf-42wrd" event={"ID":"7ae78a88-873e-47c5-96e2-71e175ad6366","Type":"ContainerStarted","Data":"3330d654b92e6bb59e5f4d11dd8b744745bae7e33a4c6bb6711294c36682abd2"} Apr 21 10:07:09.331224 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:09.331159 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"24b93fc8-5b72-4c3a-9ce5-31878a449724","Type":"ContainerStarted","Data":"d7c4b82abf360b5d6e1f1b85952add55ab31fdb3f450dd8618b0cd3e5575e6c2"} Apr 21 10:07:09.331224 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:09.331230 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"24b93fc8-5b72-4c3a-9ce5-31878a449724","Type":"ContainerStarted","Data":"2d08f3d01bdbf2c86962f62bfbe2c698a0a341f29f7be5f7ef757e1d08607dda"} Apr 21 10:07:09.331647 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:09.331243 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"24b93fc8-5b72-4c3a-9ce5-31878a449724","Type":"ContainerStarted","Data":"9fee14a1ba726dc0c198b7d61f26eeb7643950c510a675cb7a309f9c75f03c30"} Apr 21 10:07:09.331647 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:09.331256 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"24b93fc8-5b72-4c3a-9ce5-31878a449724","Type":"ContainerStarted","Data":"39daddaa92d819a72eb196fc5f6bd72f1f079a11ee01723b44a4ba287b11b8ca"} Apr 21 10:07:09.331647 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:09.331271 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"24b93fc8-5b72-4c3a-9ce5-31878a449724","Type":"ContainerStarted","Data":"5fb87e8bcef9727ab208fa54ed2a9efa8460dcfcad1f8f001953897a4439b482"} Apr 21 10:07:10.335590 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:10.335496 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-8577976cdf-42wrd" event={"ID":"7ae78a88-873e-47c5-96e2-71e175ad6366","Type":"ContainerStarted","Data":"ebc532641ca39ff0de85b8f14ab381d671bc19677fde30e93e5b5a1df9ad6462"} Apr 21 10:07:10.338750 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:10.338722 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"24b93fc8-5b72-4c3a-9ce5-31878a449724","Type":"ContainerStarted","Data":"f7386ac292700e581b93a608308045da2ffdd327f768ababc2ecfd21d935caf0"} Apr 21 10:07:10.354850 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:10.354803 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-8577976cdf-42wrd" podStartSLOduration=1.615181975 podStartE2EDuration="3.35478757s" podCreationTimestamp="2026-04-21 10:07:07 +0000 UTC" firstStartedPulling="2026-04-21 10:07:07.935922238 +0000 UTC m=+205.781117728" lastFinishedPulling="2026-04-21 
10:07:09.675527824 +0000 UTC m=+207.520723323" observedRunningTime="2026-04-21 10:07:10.352811626 +0000 UTC m=+208.198007133" watchObservedRunningTime="2026-04-21 10:07:10.35478757 +0000 UTC m=+208.199983076" Apr 21 10:07:10.376927 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:10.376853 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.014046646 podStartE2EDuration="7.376836076s" podCreationTimestamp="2026-04-21 10:07:03 +0000 UTC" firstStartedPulling="2026-04-21 10:07:05.631992526 +0000 UTC m=+203.477188020" lastFinishedPulling="2026-04-21 10:07:09.994781963 +0000 UTC m=+207.839977450" observedRunningTime="2026-04-21 10:07:10.375518332 +0000 UTC m=+208.220713869" watchObservedRunningTime="2026-04-21 10:07:10.376836076 +0000 UTC m=+208.222031583" Apr 21 10:07:13.785522 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:13.785486 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7c466798d7-gf2vj"] Apr 21 10:07:13.790213 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:13.790193 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c466798d7-gf2vj" Apr 21 10:07:13.797544 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:13.797519 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 21 10:07:13.807232 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:13.807213 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c466798d7-gf2vj"] Apr 21 10:07:13.837010 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:13.836990 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-console-oauth-config\") pod \"console-7c466798d7-gf2vj\" (UID: \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\") " pod="openshift-console/console-7c466798d7-gf2vj" Apr 21 10:07:13.837101 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:13.837021 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-service-ca\") pod \"console-7c466798d7-gf2vj\" (UID: \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\") " pod="openshift-console/console-7c466798d7-gf2vj" Apr 21 10:07:13.837101 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:13.837055 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-trusted-ca-bundle\") pod \"console-7c466798d7-gf2vj\" (UID: \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\") " pod="openshift-console/console-7c466798d7-gf2vj" Apr 21 10:07:13.837191 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:13.837101 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-oauth-serving-cert\") pod \"console-7c466798d7-gf2vj\" (UID: \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\") " pod="openshift-console/console-7c466798d7-gf2vj" Apr 21 10:07:13.837232 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:13.837214 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbn2f\" (UniqueName: \"kubernetes.io/projected/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-kube-api-access-fbn2f\") pod \"console-7c466798d7-gf2vj\" (UID: \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\") " pod="openshift-console/console-7c466798d7-gf2vj" Apr 21 10:07:13.837277 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:13.837262 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-console-serving-cert\") pod \"console-7c466798d7-gf2vj\" (UID: \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\") " pod="openshift-console/console-7c466798d7-gf2vj" Apr 21 10:07:13.837323 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:13.837305 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-console-config\") pod \"console-7c466798d7-gf2vj\" (UID: \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\") " pod="openshift-console/console-7c466798d7-gf2vj" Apr 21 10:07:13.937735 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:13.937704 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbn2f\" (UniqueName: \"kubernetes.io/projected/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-kube-api-access-fbn2f\") pod \"console-7c466798d7-gf2vj\" (UID: \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\") " pod="openshift-console/console-7c466798d7-gf2vj" Apr 21 10:07:13.937829 ip-10-0-140-231 
kubenswrapper[2567]: I0421 10:07:13.937759 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-console-serving-cert\") pod \"console-7c466798d7-gf2vj\" (UID: \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\") " pod="openshift-console/console-7c466798d7-gf2vj" Apr 21 10:07:13.937829 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:13.937791 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-console-config\") pod \"console-7c466798d7-gf2vj\" (UID: \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\") " pod="openshift-console/console-7c466798d7-gf2vj" Apr 21 10:07:13.937928 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:13.937832 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-console-oauth-config\") pod \"console-7c466798d7-gf2vj\" (UID: \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\") " pod="openshift-console/console-7c466798d7-gf2vj" Apr 21 10:07:13.937928 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:13.937862 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-service-ca\") pod \"console-7c466798d7-gf2vj\" (UID: \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\") " pod="openshift-console/console-7c466798d7-gf2vj" Apr 21 10:07:13.937928 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:13.937900 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-trusted-ca-bundle\") pod \"console-7c466798d7-gf2vj\" (UID: \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\") " 
pod="openshift-console/console-7c466798d7-gf2vj" Apr 21 10:07:13.938076 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:13.937926 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-oauth-serving-cert\") pod \"console-7c466798d7-gf2vj\" (UID: \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\") " pod="openshift-console/console-7c466798d7-gf2vj" Apr 21 10:07:13.938649 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:13.938620 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-console-config\") pod \"console-7c466798d7-gf2vj\" (UID: \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\") " pod="openshift-console/console-7c466798d7-gf2vj" Apr 21 10:07:13.938734 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:13.938682 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-service-ca\") pod \"console-7c466798d7-gf2vj\" (UID: \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\") " pod="openshift-console/console-7c466798d7-gf2vj" Apr 21 10:07:13.938818 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:13.938798 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-oauth-serving-cert\") pod \"console-7c466798d7-gf2vj\" (UID: \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\") " pod="openshift-console/console-7c466798d7-gf2vj" Apr 21 10:07:13.939080 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:13.939058 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-trusted-ca-bundle\") pod \"console-7c466798d7-gf2vj\" (UID: 
\"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\") " pod="openshift-console/console-7c466798d7-gf2vj" Apr 21 10:07:13.940333 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:13.940312 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-console-serving-cert\") pod \"console-7c466798d7-gf2vj\" (UID: \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\") " pod="openshift-console/console-7c466798d7-gf2vj" Apr 21 10:07:13.940413 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:13.940316 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-console-oauth-config\") pod \"console-7c466798d7-gf2vj\" (UID: \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\") " pod="openshift-console/console-7c466798d7-gf2vj" Apr 21 10:07:13.945511 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:13.945490 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbn2f\" (UniqueName: \"kubernetes.io/projected/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-kube-api-access-fbn2f\") pod \"console-7c466798d7-gf2vj\" (UID: \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\") " pod="openshift-console/console-7c466798d7-gf2vj" Apr 21 10:07:14.100151 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:14.100071 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c466798d7-gf2vj" Apr 21 10:07:14.232441 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:14.232417 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c466798d7-gf2vj"] Apr 21 10:07:14.236565 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:07:14.236521 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode26e59ec_1ac5_4b9e_8ede_30f1031a42b3.slice/crio-534d7540f05ee391bdd43f39c242e181c8099f3b24126904df01be8934663384 WatchSource:0}: Error finding container 534d7540f05ee391bdd43f39c242e181c8099f3b24126904df01be8934663384: Status 404 returned error can't find the container with id 534d7540f05ee391bdd43f39c242e181c8099f3b24126904df01be8934663384 Apr 21 10:07:14.353377 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:14.353302 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c466798d7-gf2vj" event={"ID":"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3","Type":"ContainerStarted","Data":"948181c650c266db540b7195e6d1114da96c3b52b73046f34e05254fd83a242b"} Apr 21 10:07:14.353377 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:14.353340 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c466798d7-gf2vj" event={"ID":"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3","Type":"ContainerStarted","Data":"534d7540f05ee391bdd43f39c242e181c8099f3b24126904df01be8934663384"} Apr 21 10:07:14.369767 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:14.369710 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7c466798d7-gf2vj" podStartSLOduration=1.369693512 podStartE2EDuration="1.369693512s" podCreationTimestamp="2026-04-21 10:07:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:07:14.36925357 +0000 UTC 
m=+212.214449075" watchObservedRunningTime="2026-04-21 10:07:14.369693512 +0000 UTC m=+212.214889019" Apr 21 10:07:24.100872 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:24.100830 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7c466798d7-gf2vj" Apr 21 10:07:24.100872 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:24.100881 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7c466798d7-gf2vj" Apr 21 10:07:24.105616 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:24.105595 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7c466798d7-gf2vj" Apr 21 10:07:24.386515 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:24.386442 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7c466798d7-gf2vj" Apr 21 10:07:27.776479 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:27.776447 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-8577976cdf-42wrd" Apr 21 10:07:27.776479 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:27.776480 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-8577976cdf-42wrd" Apr 21 10:07:29.822342 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:29.822290 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-868f6794d6-kdp9r" podUID="c186129b-d8c2-40ef-93c9-1eae2c22123f" containerName="console" containerID="cri-o://57e900a07b300895474909068a2e444ee0cd9a317d1402245f7c1c013f0ce7d1" gracePeriod=15 Apr 21 10:07:30.052869 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.052850 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-868f6794d6-kdp9r_c186129b-d8c2-40ef-93c9-1eae2c22123f/console/0.log" Apr 21 
10:07:30.052964 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.052916 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-868f6794d6-kdp9r"
Apr 21 10:07:30.166384 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.166362 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c186129b-d8c2-40ef-93c9-1eae2c22123f-service-ca\") pod \"c186129b-d8c2-40ef-93c9-1eae2c22123f\" (UID: \"c186129b-d8c2-40ef-93c9-1eae2c22123f\") "
Apr 21 10:07:30.166480 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.166400 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c186129b-d8c2-40ef-93c9-1eae2c22123f-console-config\") pod \"c186129b-d8c2-40ef-93c9-1eae2c22123f\" (UID: \"c186129b-d8c2-40ef-93c9-1eae2c22123f\") "
Apr 21 10:07:30.166480 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.166474 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c186129b-d8c2-40ef-93c9-1eae2c22123f-console-serving-cert\") pod \"c186129b-d8c2-40ef-93c9-1eae2c22123f\" (UID: \"c186129b-d8c2-40ef-93c9-1eae2c22123f\") "
Apr 21 10:07:30.166550 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.166505 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c186129b-d8c2-40ef-93c9-1eae2c22123f-oauth-serving-cert\") pod \"c186129b-d8c2-40ef-93c9-1eae2c22123f\" (UID: \"c186129b-d8c2-40ef-93c9-1eae2c22123f\") "
Apr 21 10:07:30.166550 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.166539 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dctz2\" (UniqueName: \"kubernetes.io/projected/c186129b-d8c2-40ef-93c9-1eae2c22123f-kube-api-access-dctz2\") pod \"c186129b-d8c2-40ef-93c9-1eae2c22123f\" (UID: \"c186129b-d8c2-40ef-93c9-1eae2c22123f\") "
Apr 21 10:07:30.166610 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.166573 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c186129b-d8c2-40ef-93c9-1eae2c22123f-console-oauth-config\") pod \"c186129b-d8c2-40ef-93c9-1eae2c22123f\" (UID: \"c186129b-d8c2-40ef-93c9-1eae2c22123f\") "
Apr 21 10:07:30.166873 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.166853 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c186129b-d8c2-40ef-93c9-1eae2c22123f-service-ca" (OuterVolumeSpecName: "service-ca") pod "c186129b-d8c2-40ef-93c9-1eae2c22123f" (UID: "c186129b-d8c2-40ef-93c9-1eae2c22123f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:07:30.166956 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.166886 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c186129b-d8c2-40ef-93c9-1eae2c22123f-console-config" (OuterVolumeSpecName: "console-config") pod "c186129b-d8c2-40ef-93c9-1eae2c22123f" (UID: "c186129b-d8c2-40ef-93c9-1eae2c22123f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:07:30.166956 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.166913 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c186129b-d8c2-40ef-93c9-1eae2c22123f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c186129b-d8c2-40ef-93c9-1eae2c22123f" (UID: "c186129b-d8c2-40ef-93c9-1eae2c22123f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:07:30.168681 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.168652 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c186129b-d8c2-40ef-93c9-1eae2c22123f-kube-api-access-dctz2" (OuterVolumeSpecName: "kube-api-access-dctz2") pod "c186129b-d8c2-40ef-93c9-1eae2c22123f" (UID: "c186129b-d8c2-40ef-93c9-1eae2c22123f"). InnerVolumeSpecName "kube-api-access-dctz2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 10:07:30.168774 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.168754 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c186129b-d8c2-40ef-93c9-1eae2c22123f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c186129b-d8c2-40ef-93c9-1eae2c22123f" (UID: "c186129b-d8c2-40ef-93c9-1eae2c22123f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:07:30.168855 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.168821 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c186129b-d8c2-40ef-93c9-1eae2c22123f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c186129b-d8c2-40ef-93c9-1eae2c22123f" (UID: "c186129b-d8c2-40ef-93c9-1eae2c22123f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:07:30.267447 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.267422 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c186129b-d8c2-40ef-93c9-1eae2c22123f-console-serving-cert\") on node \"ip-10-0-140-231.ec2.internal\" DevicePath \"\""
Apr 21 10:07:30.267447 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.267449 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c186129b-d8c2-40ef-93c9-1eae2c22123f-oauth-serving-cert\") on node \"ip-10-0-140-231.ec2.internal\" DevicePath \"\""
Apr 21 10:07:30.267588 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.267463 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dctz2\" (UniqueName: \"kubernetes.io/projected/c186129b-d8c2-40ef-93c9-1eae2c22123f-kube-api-access-dctz2\") on node \"ip-10-0-140-231.ec2.internal\" DevicePath \"\""
Apr 21 10:07:30.267588 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.267475 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c186129b-d8c2-40ef-93c9-1eae2c22123f-console-oauth-config\") on node \"ip-10-0-140-231.ec2.internal\" DevicePath \"\""
Apr 21 10:07:30.267588 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.267488 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c186129b-d8c2-40ef-93c9-1eae2c22123f-service-ca\") on node \"ip-10-0-140-231.ec2.internal\" DevicePath \"\""
Apr 21 10:07:30.267588 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.267501 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c186129b-d8c2-40ef-93c9-1eae2c22123f-console-config\") on node \"ip-10-0-140-231.ec2.internal\" DevicePath \"\""
Apr 21 10:07:30.402176 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.402154 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-868f6794d6-kdp9r_c186129b-d8c2-40ef-93c9-1eae2c22123f/console/0.log"
Apr 21 10:07:30.402283 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.402223 2567 generic.go:358] "Generic (PLEG): container finished" podID="c186129b-d8c2-40ef-93c9-1eae2c22123f" containerID="57e900a07b300895474909068a2e444ee0cd9a317d1402245f7c1c013f0ce7d1" exitCode=2
Apr 21 10:07:30.402348 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.402284 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-868f6794d6-kdp9r"
Apr 21 10:07:30.402348 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.402300 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-868f6794d6-kdp9r" event={"ID":"c186129b-d8c2-40ef-93c9-1eae2c22123f","Type":"ContainerDied","Data":"57e900a07b300895474909068a2e444ee0cd9a317d1402245f7c1c013f0ce7d1"}
Apr 21 10:07:30.402348 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.402337 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-868f6794d6-kdp9r" event={"ID":"c186129b-d8c2-40ef-93c9-1eae2c22123f","Type":"ContainerDied","Data":"8ea2d0d4b5cc6ad67211fb23ac09cdb3c4400970b1071907c428fec958927fe9"}
Apr 21 10:07:30.402490 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.402374 2567 scope.go:117] "RemoveContainer" containerID="57e900a07b300895474909068a2e444ee0cd9a317d1402245f7c1c013f0ce7d1"
Apr 21 10:07:30.410290 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.410265 2567 scope.go:117] "RemoveContainer" containerID="57e900a07b300895474909068a2e444ee0cd9a317d1402245f7c1c013f0ce7d1"
Apr 21 10:07:30.410555 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:07:30.410534 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57e900a07b300895474909068a2e444ee0cd9a317d1402245f7c1c013f0ce7d1\": container with ID starting with 57e900a07b300895474909068a2e444ee0cd9a317d1402245f7c1c013f0ce7d1 not found: ID does not exist" containerID="57e900a07b300895474909068a2e444ee0cd9a317d1402245f7c1c013f0ce7d1"
Apr 21 10:07:30.410626 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.410562 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57e900a07b300895474909068a2e444ee0cd9a317d1402245f7c1c013f0ce7d1"} err="failed to get container status \"57e900a07b300895474909068a2e444ee0cd9a317d1402245f7c1c013f0ce7d1\": rpc error: code = NotFound desc = could not find container \"57e900a07b300895474909068a2e444ee0cd9a317d1402245f7c1c013f0ce7d1\": container with ID starting with 57e900a07b300895474909068a2e444ee0cd9a317d1402245f7c1c013f0ce7d1 not found: ID does not exist"
Apr 21 10:07:30.421330 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.421275 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-868f6794d6-kdp9r"]
Apr 21 10:07:30.424871 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.424848 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-868f6794d6-kdp9r"]
Apr 21 10:07:30.714608 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:30.714550 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c186129b-d8c2-40ef-93c9-1eae2c22123f" path="/var/lib/kubelet/pods/c186129b-d8c2-40ef-93c9-1eae2c22123f/volumes"
Apr 21 10:07:47.782057 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:47.782024 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-8577976cdf-42wrd"
Apr 21 10:07:47.786004 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:47.785977 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-8577976cdf-42wrd"
Apr 21 10:07:54.643106 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:54.643027 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs\") pod \"network-metrics-daemon-nwsw4\" (UID: \"dbb00fc1-1258-4254-a360-3c350554925b\") " pod="openshift-multus/network-metrics-daemon-nwsw4"
Apr 21 10:07:54.645460 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:54.645431 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbb00fc1-1258-4254-a360-3c350554925b-metrics-certs\") pod \"network-metrics-daemon-nwsw4\" (UID: \"dbb00fc1-1258-4254-a360-3c350554925b\") " pod="openshift-multus/network-metrics-daemon-nwsw4"
Apr 21 10:07:54.915067 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:54.915036 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7zbjr\""
Apr 21 10:07:54.922932 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:54.922905 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nwsw4"
Apr 21 10:07:55.045347 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:55.045303 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nwsw4"]
Apr 21 10:07:55.049431 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:07:55.049407 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbb00fc1_1258_4254_a360_3c350554925b.slice/crio-c6be3c8febeb2f3b4b3dd4b4b8e694fe52fc5625f5e270a0ba318f93cc21fa79 WatchSource:0}: Error finding container c6be3c8febeb2f3b4b3dd4b4b8e694fe52fc5625f5e270a0ba318f93cc21fa79: Status 404 returned error can't find the container with id c6be3c8febeb2f3b4b3dd4b4b8e694fe52fc5625f5e270a0ba318f93cc21fa79
Apr 21 10:07:55.488683 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:55.488646 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nwsw4" event={"ID":"dbb00fc1-1258-4254-a360-3c350554925b","Type":"ContainerStarted","Data":"c6be3c8febeb2f3b4b3dd4b4b8e694fe52fc5625f5e270a0ba318f93cc21fa79"}
Apr 21 10:07:56.493025 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:56.492992 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nwsw4" event={"ID":"dbb00fc1-1258-4254-a360-3c350554925b","Type":"ContainerStarted","Data":"79646c4391d0147605c1ef73c10c7862590eabdd06a734ad84e7f8422e77b83b"}
Apr 21 10:07:56.493025 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:56.493026 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nwsw4" event={"ID":"dbb00fc1-1258-4254-a360-3c350554925b","Type":"ContainerStarted","Data":"a7c6cc9441a6bddf6285194a452351339979b4c7ccd7cf17d0fdb4c68a793a03"}
Apr 21 10:07:56.509500 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:07:56.509434 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-nwsw4" podStartSLOduration=252.466575855 podStartE2EDuration="4m13.509416497s" podCreationTimestamp="2026-04-21 10:03:43 +0000 UTC" firstStartedPulling="2026-04-21 10:07:55.051149887 +0000 UTC m=+252.896345371" lastFinishedPulling="2026-04-21 10:07:56.093990514 +0000 UTC m=+253.939186013" observedRunningTime="2026-04-21 10:07:56.507549981 +0000 UTC m=+254.352745682" watchObservedRunningTime="2026-04-21 10:07:56.509416497 +0000 UTC m=+254.354612003"
Apr 21 10:08:22.157358 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:08:22.157319 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-sznwb" podUID="a413cb28-d70b-44b6-a527-03a5247fa66a"
Apr 21 10:08:22.576509 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:22.576434 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sznwb"
Apr 21 10:08:26.062478 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:26.062432 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert\") pod \"ingress-canary-v9jx5\" (UID: \"2857b675-4470-427f-a3d7-94390418dee9\") " pod="openshift-ingress-canary/ingress-canary-v9jx5"
Apr 21 10:08:26.062478 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:26.062492 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls\") pod \"dns-default-sznwb\" (UID: \"a413cb28-d70b-44b6-a527-03a5247fa66a\") " pod="openshift-dns/dns-default-sznwb"
Apr 21 10:08:26.065059 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:26.065037 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a413cb28-d70b-44b6-a527-03a5247fa66a-metrics-tls\") pod \"dns-default-sznwb\" (UID: \"a413cb28-d70b-44b6-a527-03a5247fa66a\") " pod="openshift-dns/dns-default-sznwb"
Apr 21 10:08:26.065420 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:26.065394 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2857b675-4470-427f-a3d7-94390418dee9-cert\") pod \"ingress-canary-v9jx5\" (UID: \"2857b675-4470-427f-a3d7-94390418dee9\") " pod="openshift-ingress-canary/ingress-canary-v9jx5"
Apr 21 10:08:26.113238 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:26.113213 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9st8k\""
Apr 21 10:08:26.120732 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:26.120711 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-v9jx5"
Apr 21 10:08:26.180369 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:26.180333 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-g7j6z\""
Apr 21 10:08:26.188564 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:26.188530 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sznwb"
Apr 21 10:08:26.240194 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:26.239339 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-v9jx5"]
Apr 21 10:08:26.243523 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:08:26.243499 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2857b675_4470_427f_a3d7_94390418dee9.slice/crio-d15c2e0c330071665869b6a621395dbc6dfc95ce8b1254c2c90f5c41a1469494 WatchSource:0}: Error finding container d15c2e0c330071665869b6a621395dbc6dfc95ce8b1254c2c90f5c41a1469494: Status 404 returned error can't find the container with id d15c2e0c330071665869b6a621395dbc6dfc95ce8b1254c2c90f5c41a1469494
Apr 21 10:08:26.311151 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:26.311125 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sznwb"]
Apr 21 10:08:26.313838 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:08:26.313777 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda413cb28_d70b_44b6_a527_03a5247fa66a.slice/crio-7ecac8003b85c73048a5b219882b3bd9cecff284413558e2542a98c49914488f WatchSource:0}: Error finding container 7ecac8003b85c73048a5b219882b3bd9cecff284413558e2542a98c49914488f: Status 404 returned error can't find the container with id 7ecac8003b85c73048a5b219882b3bd9cecff284413558e2542a98c49914488f
Apr 21 10:08:26.588075 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:26.588010 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sznwb" event={"ID":"a413cb28-d70b-44b6-a527-03a5247fa66a","Type":"ContainerStarted","Data":"7ecac8003b85c73048a5b219882b3bd9cecff284413558e2542a98c49914488f"}
Apr 21 10:08:26.589046 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:26.589023 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-v9jx5" event={"ID":"2857b675-4470-427f-a3d7-94390418dee9","Type":"ContainerStarted","Data":"d15c2e0c330071665869b6a621395dbc6dfc95ce8b1254c2c90f5c41a1469494"}
Apr 21 10:08:28.598436 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:28.598359 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sznwb" event={"ID":"a413cb28-d70b-44b6-a527-03a5247fa66a","Type":"ContainerStarted","Data":"344030b0f05c8f79b8196b59e7cf49de5f8b085c73857fed1ef917ec4c8531f4"}
Apr 21 10:08:28.598436 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:28.598406 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sznwb" event={"ID":"a413cb28-d70b-44b6-a527-03a5247fa66a","Type":"ContainerStarted","Data":"ae75b02203660a295f116b4d5af65334c44757515144cce6e8bbbee6731e828b"}
Apr 21 10:08:28.598856 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:28.598595 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-sznwb"
Apr 21 10:08:28.599764 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:28.599740 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-v9jx5" event={"ID":"2857b675-4470-427f-a3d7-94390418dee9","Type":"ContainerStarted","Data":"1a34ce2eea4810634a9ca3ab17dc943b2d90a7716a2cab8843e867379491531d"}
Apr 21 10:08:28.617455 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:28.617409 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-sznwb" podStartSLOduration=251.693654862 podStartE2EDuration="4m13.617395064s" podCreationTimestamp="2026-04-21 10:04:15 +0000 UTC" firstStartedPulling="2026-04-21 10:08:26.315559028 +0000 UTC m=+284.160754514" lastFinishedPulling="2026-04-21 10:08:28.239299228 +0000 UTC m=+286.084494716" observedRunningTime="2026-04-21 10:08:28.615529284 +0000 UTC m=+286.460724791" watchObservedRunningTime="2026-04-21 10:08:28.617395064 +0000 UTC m=+286.462590567"
Apr 21 10:08:28.631814 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:28.631778 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-v9jx5" podStartSLOduration=251.633900718 podStartE2EDuration="4m13.631760997s" podCreationTimestamp="2026-04-21 10:04:15 +0000 UTC" firstStartedPulling="2026-04-21 10:08:26.245522588 +0000 UTC m=+284.090718072" lastFinishedPulling="2026-04-21 10:08:28.243382866 +0000 UTC m=+286.088578351" observedRunningTime="2026-04-21 10:08:28.63048924 +0000 UTC m=+286.475684748" watchObservedRunningTime="2026-04-21 10:08:28.631760997 +0000 UTC m=+286.476956503"
Apr 21 10:08:32.031013 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:32.030972 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7c466798d7-gf2vj"]
Apr 21 10:08:38.605538 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:38.605505 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-sznwb"
Apr 21 10:08:42.639660 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:42.639630 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tqctk_2bdbce72-3342-4c5d-9e2f-6757d506d268/ovn-acl-logging/0.log"
Apr 21 10:08:42.640206 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:42.639672 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tqctk_2bdbce72-3342-4c5d-9e2f-6757d506d268/ovn-acl-logging/0.log"
Apr 21 10:08:42.646484 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:42.646460 2567 kubelet.go:1628] "Image garbage collection succeeded"
Apr 21 10:08:57.049983 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.049919 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7c466798d7-gf2vj" podUID="e26e59ec-1ac5-4b9e-8ede-30f1031a42b3" containerName="console" containerID="cri-o://948181c650c266db540b7195e6d1114da96c3b52b73046f34e05254fd83a242b" gracePeriod=15
Apr 21 10:08:57.291355 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.291333 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7c466798d7-gf2vj_e26e59ec-1ac5-4b9e-8ede-30f1031a42b3/console/0.log"
Apr 21 10:08:57.291473 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.291405 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c466798d7-gf2vj"
Apr 21 10:08:57.376466 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.376410 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-trusted-ca-bundle\") pod \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\" (UID: \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\") "
Apr 21 10:08:57.376466 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.376447 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-console-serving-cert\") pod \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\" (UID: \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\") "
Apr 21 10:08:57.376618 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.376482 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-console-oauth-config\") pod \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\" (UID: \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\") "
Apr 21 10:08:57.376618 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.376499 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-oauth-serving-cert\") pod \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\" (UID: \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\") "
Apr 21 10:08:57.376695 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.376671 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbn2f\" (UniqueName: \"kubernetes.io/projected/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-kube-api-access-fbn2f\") pod \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\" (UID: \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\") "
Apr 21 10:08:57.376760 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.376744 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-service-ca\") pod \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\" (UID: \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\") "
Apr 21 10:08:57.376827 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.376754 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e26e59ec-1ac5-4b9e-8ede-30f1031a42b3" (UID: "e26e59ec-1ac5-4b9e-8ede-30f1031a42b3"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:08:57.376827 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.376773 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-console-config\") pod \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\" (UID: \"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3\") "
Apr 21 10:08:57.376930 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.376825 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e26e59ec-1ac5-4b9e-8ede-30f1031a42b3" (UID: "e26e59ec-1ac5-4b9e-8ede-30f1031a42b3"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:08:57.377051 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.377026 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-oauth-serving-cert\") on node \"ip-10-0-140-231.ec2.internal\" DevicePath \"\""
Apr 21 10:08:57.377131 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.377051 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-trusted-ca-bundle\") on node \"ip-10-0-140-231.ec2.internal\" DevicePath \"\""
Apr 21 10:08:57.377131 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.377047 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-service-ca" (OuterVolumeSpecName: "service-ca") pod "e26e59ec-1ac5-4b9e-8ede-30f1031a42b3" (UID: "e26e59ec-1ac5-4b9e-8ede-30f1031a42b3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:08:57.377259 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.377195 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-console-config" (OuterVolumeSpecName: "console-config") pod "e26e59ec-1ac5-4b9e-8ede-30f1031a42b3" (UID: "e26e59ec-1ac5-4b9e-8ede-30f1031a42b3"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:08:57.378509 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.378485 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e26e59ec-1ac5-4b9e-8ede-30f1031a42b3" (UID: "e26e59ec-1ac5-4b9e-8ede-30f1031a42b3"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:08:57.379007 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.378988 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e26e59ec-1ac5-4b9e-8ede-30f1031a42b3" (UID: "e26e59ec-1ac5-4b9e-8ede-30f1031a42b3"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:08:57.379007 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.378998 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-kube-api-access-fbn2f" (OuterVolumeSpecName: "kube-api-access-fbn2f") pod "e26e59ec-1ac5-4b9e-8ede-30f1031a42b3" (UID: "e26e59ec-1ac5-4b9e-8ede-30f1031a42b3"). InnerVolumeSpecName "kube-api-access-fbn2f". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 10:08:57.477501 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.477471 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fbn2f\" (UniqueName: \"kubernetes.io/projected/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-kube-api-access-fbn2f\") on node \"ip-10-0-140-231.ec2.internal\" DevicePath \"\""
Apr 21 10:08:57.477501 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.477502 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-service-ca\") on node \"ip-10-0-140-231.ec2.internal\" DevicePath \"\""
Apr 21 10:08:57.477621 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.477515 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-console-config\") on node \"ip-10-0-140-231.ec2.internal\" DevicePath \"\""
Apr 21 10:08:57.477621 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.477530 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-console-serving-cert\") on node \"ip-10-0-140-231.ec2.internal\" DevicePath \"\""
Apr 21 10:08:57.477621 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.477539 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3-console-oauth-config\") on node \"ip-10-0-140-231.ec2.internal\" DevicePath \"\""
Apr 21 10:08:57.694932 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.694911 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7c466798d7-gf2vj_e26e59ec-1ac5-4b9e-8ede-30f1031a42b3/console/0.log"
Apr 21 10:08:57.695051 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.694948 2567 generic.go:358] "Generic (PLEG): container finished" podID="e26e59ec-1ac5-4b9e-8ede-30f1031a42b3" containerID="948181c650c266db540b7195e6d1114da96c3b52b73046f34e05254fd83a242b" exitCode=2
Apr 21 10:08:57.695051 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.694987 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c466798d7-gf2vj" event={"ID":"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3","Type":"ContainerDied","Data":"948181c650c266db540b7195e6d1114da96c3b52b73046f34e05254fd83a242b"}
Apr 21 10:08:57.695051 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.695017 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c466798d7-gf2vj"
Apr 21 10:08:57.695051 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.695042 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c466798d7-gf2vj" event={"ID":"e26e59ec-1ac5-4b9e-8ede-30f1031a42b3","Type":"ContainerDied","Data":"534d7540f05ee391bdd43f39c242e181c8099f3b24126904df01be8934663384"}
Apr 21 10:08:57.695287 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.695066 2567 scope.go:117] "RemoveContainer" containerID="948181c650c266db540b7195e6d1114da96c3b52b73046f34e05254fd83a242b"
Apr 21 10:08:57.703779 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.703760 2567 scope.go:117] "RemoveContainer" containerID="948181c650c266db540b7195e6d1114da96c3b52b73046f34e05254fd83a242b"
Apr 21 10:08:57.704057 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:08:57.704037 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"948181c650c266db540b7195e6d1114da96c3b52b73046f34e05254fd83a242b\": container with ID starting with 948181c650c266db540b7195e6d1114da96c3b52b73046f34e05254fd83a242b not found: ID does not exist" containerID="948181c650c266db540b7195e6d1114da96c3b52b73046f34e05254fd83a242b"
Apr 21 10:08:57.704117 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.704071 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"948181c650c266db540b7195e6d1114da96c3b52b73046f34e05254fd83a242b"} err="failed to get container status \"948181c650c266db540b7195e6d1114da96c3b52b73046f34e05254fd83a242b\": rpc error: code = NotFound desc = could not find container \"948181c650c266db540b7195e6d1114da96c3b52b73046f34e05254fd83a242b\": container with ID starting with 948181c650c266db540b7195e6d1114da96c3b52b73046f34e05254fd83a242b not found: ID does not exist"
Apr 21 10:08:57.715518 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.715495 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7c466798d7-gf2vj"]
Apr 21 10:08:57.718870 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:57.718847 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7c466798d7-gf2vj"]
Apr 21 10:08:58.713369 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:08:58.713329 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e26e59ec-1ac5-4b9e-8ede-30f1031a42b3" path="/var/lib/kubelet/pods/e26e59ec-1ac5-4b9e-8ede-30f1031a42b3/volumes"
Apr 21 10:09:43.416125 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:43.416083 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-46d9m"]
Apr 21 10:09:43.416772 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:43.416743 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e26e59ec-1ac5-4b9e-8ede-30f1031a42b3" containerName="console"
Apr 21 10:09:43.416898 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:43.416780 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="e26e59ec-1ac5-4b9e-8ede-30f1031a42b3" containerName="console"
Apr 21 10:09:43.416898 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:43.416809 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c186129b-d8c2-40ef-93c9-1eae2c22123f" containerName="console"
Apr 21 10:09:43.416898 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:43.416818 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="c186129b-d8c2-40ef-93c9-1eae2c22123f" containerName="console"
Apr 21 10:09:43.417047 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:43.416896 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="c186129b-d8c2-40ef-93c9-1eae2c22123f" containerName="console"
Apr 21 10:09:43.417047 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:43.416922 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="e26e59ec-1ac5-4b9e-8ede-30f1031a42b3" containerName="console"
Apr 21 10:09:43.421312 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:43.421291 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-46d9m"
Apr 21 10:09:43.423661 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:43.423633 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 21 10:09:43.423778 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:43.423677 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-2zrll\""
Apr 21 10:09:43.423778 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:43.423694 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 21 10:09:43.423778 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:43.423678 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 21 10:09:43.427808 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:43.427689 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-46d9m"]
Apr 21
10:09:43.486602 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:43.486572 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fztbr\" (UniqueName: \"kubernetes.io/projected/0722c466-7d3c-4340-b49c-b9ef431a3528-kube-api-access-fztbr\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-46d9m\" (UID: \"0722c466-7d3c-4340-b49c-b9ef431a3528\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-46d9m" Apr 21 10:09:43.486717 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:43.486638 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/0722c466-7d3c-4340-b49c-b9ef431a3528-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-46d9m\" (UID: \"0722c466-7d3c-4340-b49c-b9ef431a3528\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-46d9m" Apr 21 10:09:43.586953 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:43.586925 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fztbr\" (UniqueName: \"kubernetes.io/projected/0722c466-7d3c-4340-b49c-b9ef431a3528-kube-api-access-fztbr\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-46d9m\" (UID: \"0722c466-7d3c-4340-b49c-b9ef431a3528\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-46d9m" Apr 21 10:09:43.587056 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:43.586986 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/0722c466-7d3c-4340-b49c-b9ef431a3528-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-46d9m\" (UID: \"0722c466-7d3c-4340-b49c-b9ef431a3528\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-46d9m" Apr 21 10:09:43.589402 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:43.589376 2567 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/0722c466-7d3c-4340-b49c-b9ef431a3528-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-46d9m\" (UID: \"0722c466-7d3c-4340-b49c-b9ef431a3528\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-46d9m" Apr 21 10:09:43.596317 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:43.596295 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fztbr\" (UniqueName: \"kubernetes.io/projected/0722c466-7d3c-4340-b49c-b9ef431a3528-kube-api-access-fztbr\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-46d9m\" (UID: \"0722c466-7d3c-4340-b49c-b9ef431a3528\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-46d9m" Apr 21 10:09:43.732699 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:43.732645 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-46d9m" Apr 21 10:09:43.854657 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:43.854629 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-46d9m"] Apr 21 10:09:43.856312 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:09:43.856287 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0722c466_7d3c_4340_b49c_b9ef431a3528.slice/crio-51a54c7e7319c46b12f5efea1bce9e9d986221c303ab7607992010cea85c855e WatchSource:0}: Error finding container 51a54c7e7319c46b12f5efea1bce9e9d986221c303ab7607992010cea85c855e: Status 404 returned error can't find the container with id 51a54c7e7319c46b12f5efea1bce9e9d986221c303ab7607992010cea85c855e Apr 21 10:09:43.857896 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:43.857878 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 10:09:44.825140 
ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:44.825099 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-46d9m" event={"ID":"0722c466-7d3c-4340-b49c-b9ef431a3528","Type":"ContainerStarted","Data":"51a54c7e7319c46b12f5efea1bce9e9d986221c303ab7607992010cea85c855e"} Apr 21 10:09:47.836358 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:47.836325 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-46d9m" event={"ID":"0722c466-7d3c-4340-b49c-b9ef431a3528","Type":"ContainerStarted","Data":"d36d63e04a259bb541c3cb1069d4275953d6d85058ae3ecf47fbe3ce173c0183"} Apr 21 10:09:47.836767 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:47.836438 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-46d9m" Apr 21 10:09:47.886962 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:47.886529 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-46d9m" podStartSLOduration=1.468486692 podStartE2EDuration="4.886503995s" podCreationTimestamp="2026-04-21 10:09:43 +0000 UTC" firstStartedPulling="2026-04-21 10:09:43.858036621 +0000 UTC m=+361.703232109" lastFinishedPulling="2026-04-21 10:09:47.276053923 +0000 UTC m=+365.121249412" observedRunningTime="2026-04-21 10:09:47.883148093 +0000 UTC m=+365.728343601" watchObservedRunningTime="2026-04-21 10:09:47.886503995 +0000 UTC m=+365.731699502" Apr 21 10:09:47.970444 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:47.970416 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-lkqdc"] Apr 21 10:09:47.972329 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:47.972314 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-lkqdc" Apr 21 10:09:47.974666 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:47.974648 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 21 10:09:47.975497 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:47.975473 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-ggxhf\"" Apr 21 10:09:47.975587 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:47.975550 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 21 10:09:47.980867 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:47.980847 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-lkqdc"] Apr 21 10:09:48.019059 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:48.019035 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/36bf0ccb-b37a-4885-821d-54597afb4102-certificates\") pod \"keda-operator-ffbb595cb-lkqdc\" (UID: \"36bf0ccb-b37a-4885-821d-54597afb4102\") " pod="openshift-keda/keda-operator-ffbb595cb-lkqdc" Apr 21 10:09:48.019146 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:48.019095 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/36bf0ccb-b37a-4885-821d-54597afb4102-cabundle0\") pod \"keda-operator-ffbb595cb-lkqdc\" (UID: \"36bf0ccb-b37a-4885-821d-54597afb4102\") " pod="openshift-keda/keda-operator-ffbb595cb-lkqdc" Apr 21 10:09:48.019146 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:48.019126 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kvr6\" (UniqueName: 
\"kubernetes.io/projected/36bf0ccb-b37a-4885-821d-54597afb4102-kube-api-access-6kvr6\") pod \"keda-operator-ffbb595cb-lkqdc\" (UID: \"36bf0ccb-b37a-4885-821d-54597afb4102\") " pod="openshift-keda/keda-operator-ffbb595cb-lkqdc" Apr 21 10:09:48.120319 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:48.120254 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6kvr6\" (UniqueName: \"kubernetes.io/projected/36bf0ccb-b37a-4885-821d-54597afb4102-kube-api-access-6kvr6\") pod \"keda-operator-ffbb595cb-lkqdc\" (UID: \"36bf0ccb-b37a-4885-821d-54597afb4102\") " pod="openshift-keda/keda-operator-ffbb595cb-lkqdc" Apr 21 10:09:48.120319 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:48.120297 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/36bf0ccb-b37a-4885-821d-54597afb4102-certificates\") pod \"keda-operator-ffbb595cb-lkqdc\" (UID: \"36bf0ccb-b37a-4885-821d-54597afb4102\") " pod="openshift-keda/keda-operator-ffbb595cb-lkqdc" Apr 21 10:09:48.120475 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:48.120342 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/36bf0ccb-b37a-4885-821d-54597afb4102-cabundle0\") pod \"keda-operator-ffbb595cb-lkqdc\" (UID: \"36bf0ccb-b37a-4885-821d-54597afb4102\") " pod="openshift-keda/keda-operator-ffbb595cb-lkqdc" Apr 21 10:09:48.120475 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:09:48.120461 2567 secret.go:281] references non-existent secret key: ca.crt Apr 21 10:09:48.120597 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:09:48.120478 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 21 10:09:48.120597 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:09:48.120489 2567 projected.go:194] Error preparing data for projected volume certificates for pod 
openshift-keda/keda-operator-ffbb595cb-lkqdc: references non-existent secret key: ca.crt Apr 21 10:09:48.120597 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:09:48.120561 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/36bf0ccb-b37a-4885-821d-54597afb4102-certificates podName:36bf0ccb-b37a-4885-821d-54597afb4102 nodeName:}" failed. No retries permitted until 2026-04-21 10:09:48.620535154 +0000 UTC m=+366.465730638 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/36bf0ccb-b37a-4885-821d-54597afb4102-certificates") pod "keda-operator-ffbb595cb-lkqdc" (UID: "36bf0ccb-b37a-4885-821d-54597afb4102") : references non-existent secret key: ca.crt Apr 21 10:09:48.121053 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:48.121027 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/36bf0ccb-b37a-4885-821d-54597afb4102-cabundle0\") pod \"keda-operator-ffbb595cb-lkqdc\" (UID: \"36bf0ccb-b37a-4885-821d-54597afb4102\") " pod="openshift-keda/keda-operator-ffbb595cb-lkqdc" Apr 21 10:09:48.133930 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:48.133904 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kvr6\" (UniqueName: \"kubernetes.io/projected/36bf0ccb-b37a-4885-821d-54597afb4102-kube-api-access-6kvr6\") pod \"keda-operator-ffbb595cb-lkqdc\" (UID: \"36bf0ccb-b37a-4885-821d-54597afb4102\") " pod="openshift-keda/keda-operator-ffbb595cb-lkqdc" Apr 21 10:09:48.283416 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:48.283394 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-gg98g"] Apr 21 10:09:48.285408 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:48.285394 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gg98g" Apr 21 10:09:48.292276 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:48.292258 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 21 10:09:48.310151 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:48.310130 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-gg98g"] Apr 21 10:09:48.322014 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:48.321990 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgzh6\" (UniqueName: \"kubernetes.io/projected/73fef25e-49ba-4852-aef3-27e4b5b22d4b-kube-api-access-dgzh6\") pod \"keda-metrics-apiserver-7c9f485588-gg98g\" (UID: \"73fef25e-49ba-4852-aef3-27e4b5b22d4b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gg98g" Apr 21 10:09:48.322117 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:48.322041 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/73fef25e-49ba-4852-aef3-27e4b5b22d4b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-gg98g\" (UID: \"73fef25e-49ba-4852-aef3-27e4b5b22d4b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gg98g" Apr 21 10:09:48.322117 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:48.322070 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/73fef25e-49ba-4852-aef3-27e4b5b22d4b-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-gg98g\" (UID: \"73fef25e-49ba-4852-aef3-27e4b5b22d4b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gg98g" Apr 21 10:09:48.422907 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:48.422881 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dgzh6\" (UniqueName: \"kubernetes.io/projected/73fef25e-49ba-4852-aef3-27e4b5b22d4b-kube-api-access-dgzh6\") pod \"keda-metrics-apiserver-7c9f485588-gg98g\" (UID: \"73fef25e-49ba-4852-aef3-27e4b5b22d4b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gg98g" Apr 21 10:09:48.423007 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:48.422939 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/73fef25e-49ba-4852-aef3-27e4b5b22d4b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-gg98g\" (UID: \"73fef25e-49ba-4852-aef3-27e4b5b22d4b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gg98g" Apr 21 10:09:48.423007 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:48.422966 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/73fef25e-49ba-4852-aef3-27e4b5b22d4b-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-gg98g\" (UID: \"73fef25e-49ba-4852-aef3-27e4b5b22d4b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gg98g" Apr 21 10:09:48.423127 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:09:48.423111 2567 secret.go:281] references non-existent secret key: tls.crt Apr 21 10:09:48.423183 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:09:48.423135 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 21 10:09:48.423183 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:09:48.423153 2567 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 21 10:09:48.423259 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:09:48.423185 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-gg98g: [references 
non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 21 10:09:48.423259 ip-10-0-140-231 kubenswrapper[2567]: E0421 10:09:48.423245 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73fef25e-49ba-4852-aef3-27e4b5b22d4b-certificates podName:73fef25e-49ba-4852-aef3-27e4b5b22d4b nodeName:}" failed. No retries permitted until 2026-04-21 10:09:48.923227176 +0000 UTC m=+366.768422665 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/73fef25e-49ba-4852-aef3-27e4b5b22d4b-certificates") pod "keda-metrics-apiserver-7c9f485588-gg98g" (UID: "73fef25e-49ba-4852-aef3-27e4b5b22d4b") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 21 10:09:48.423364 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:48.423347 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/73fef25e-49ba-4852-aef3-27e4b5b22d4b-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-gg98g\" (UID: \"73fef25e-49ba-4852-aef3-27e4b5b22d4b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gg98g" Apr 21 10:09:48.433964 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:48.433939 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgzh6\" (UniqueName: \"kubernetes.io/projected/73fef25e-49ba-4852-aef3-27e4b5b22d4b-kube-api-access-dgzh6\") pod \"keda-metrics-apiserver-7c9f485588-gg98g\" (UID: \"73fef25e-49ba-4852-aef3-27e4b5b22d4b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gg98g" Apr 21 10:09:48.624886 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:48.624855 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/36bf0ccb-b37a-4885-821d-54597afb4102-certificates\") pod \"keda-operator-ffbb595cb-lkqdc\" (UID: 
\"36bf0ccb-b37a-4885-821d-54597afb4102\") " pod="openshift-keda/keda-operator-ffbb595cb-lkqdc" Apr 21 10:09:48.627186 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:48.627147 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/36bf0ccb-b37a-4885-821d-54597afb4102-certificates\") pod \"keda-operator-ffbb595cb-lkqdc\" (UID: \"36bf0ccb-b37a-4885-821d-54597afb4102\") " pod="openshift-keda/keda-operator-ffbb595cb-lkqdc" Apr 21 10:09:48.882777 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:48.882681 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-lkqdc" Apr 21 10:09:48.927361 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:48.927322 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/73fef25e-49ba-4852-aef3-27e4b5b22d4b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-gg98g\" (UID: \"73fef25e-49ba-4852-aef3-27e4b5b22d4b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gg98g" Apr 21 10:09:48.932409 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:48.932345 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/73fef25e-49ba-4852-aef3-27e4b5b22d4b-certificates\") pod \"keda-metrics-apiserver-7c9f485588-gg98g\" (UID: \"73fef25e-49ba-4852-aef3-27e4b5b22d4b\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gg98g" Apr 21 10:09:49.033791 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:49.033768 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-lkqdc"] Apr 21 10:09:49.035902 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:09:49.035872 2567 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36bf0ccb_b37a_4885_821d_54597afb4102.slice/crio-a54fd08c1e9e00d5af81ce8b9db972701bd1576e394aef83b89dfc165ee43454 WatchSource:0}: Error finding container a54fd08c1e9e00d5af81ce8b9db972701bd1576e394aef83b89dfc165ee43454: Status 404 returned error can't find the container with id a54fd08c1e9e00d5af81ce8b9db972701bd1576e394aef83b89dfc165ee43454 Apr 21 10:09:49.195692 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:49.195670 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gg98g" Apr 21 10:09:49.311469 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:49.311447 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-gg98g"] Apr 21 10:09:49.313083 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:09:49.313052 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73fef25e_49ba_4852_aef3_27e4b5b22d4b.slice/crio-67446ddbcbb27c0e18da457758a393637fea5331c484cc0b491dc7723985d396 WatchSource:0}: Error finding container 67446ddbcbb27c0e18da457758a393637fea5331c484cc0b491dc7723985d396: Status 404 returned error can't find the container with id 67446ddbcbb27c0e18da457758a393637fea5331c484cc0b491dc7723985d396 Apr 21 10:09:49.844339 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:49.844297 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-lkqdc" event={"ID":"36bf0ccb-b37a-4885-821d-54597afb4102","Type":"ContainerStarted","Data":"a54fd08c1e9e00d5af81ce8b9db972701bd1576e394aef83b89dfc165ee43454"} Apr 21 10:09:49.846211 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:49.846160 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gg98g" 
event={"ID":"73fef25e-49ba-4852-aef3-27e4b5b22d4b","Type":"ContainerStarted","Data":"67446ddbcbb27c0e18da457758a393637fea5331c484cc0b491dc7723985d396"} Apr 21 10:09:53.860784 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:53.860748 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gg98g" event={"ID":"73fef25e-49ba-4852-aef3-27e4b5b22d4b","Type":"ContainerStarted","Data":"2192c1206a1953efbb7a8f8be9231ef8ca5a50c7249f56f53364c555f4905525"} Apr 21 10:09:53.861300 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:53.860820 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gg98g" Apr 21 10:09:53.862035 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:53.862012 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-lkqdc" event={"ID":"36bf0ccb-b37a-4885-821d-54597afb4102","Type":"ContainerStarted","Data":"47b73d5e29c5ca6f77f3920a32a36e777455b7354fd92f2e4654fb4595d62d04"} Apr 21 10:09:53.862136 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:53.862121 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-lkqdc" Apr 21 10:09:53.883645 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:09:53.883605 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gg98g" podStartSLOduration=2.388929352 podStartE2EDuration="5.883596136s" podCreationTimestamp="2026-04-21 10:09:48 +0000 UTC" firstStartedPulling="2026-04-21 10:09:49.314362493 +0000 UTC m=+367.159557978" lastFinishedPulling="2026-04-21 10:09:52.809029268 +0000 UTC m=+370.654224762" observedRunningTime="2026-04-21 10:09:53.88250873 +0000 UTC m=+371.727704237" watchObservedRunningTime="2026-04-21 10:09:53.883596136 +0000 UTC m=+371.728791642" Apr 21 10:09:53.912315 ip-10-0-140-231 kubenswrapper[2567]: I0421 
10:09:53.912276 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-lkqdc" podStartSLOduration=3.137325145 podStartE2EDuration="6.912264275s" podCreationTimestamp="2026-04-21 10:09:47 +0000 UTC" firstStartedPulling="2026-04-21 10:09:49.037364802 +0000 UTC m=+366.882560286" lastFinishedPulling="2026-04-21 10:09:52.812303931 +0000 UTC m=+370.657499416" observedRunningTime="2026-04-21 10:09:53.911251811 +0000 UTC m=+371.756447318" watchObservedRunningTime="2026-04-21 10:09:53.912264275 +0000 UTC m=+371.757459780" Apr 21 10:10:04.869620 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:10:04.869587 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gg98g" Apr 21 10:10:08.841279 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:10:08.841248 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-46d9m" Apr 21 10:10:14.867836 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:10:14.867806 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-lkqdc" Apr 21 10:10:54.584074 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:10:54.583995 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-nxwp6"] Apr 21 10:10:54.587286 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:10:54.587270 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-nxwp6" Apr 21 10:10:54.589819 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:10:54.589794 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 21 10:10:54.590238 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:10:54.590221 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 21 10:10:54.590497 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:10:54.590482 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-984qw\"" Apr 21 10:10:54.598258 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:10:54.598237 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-nxwp6"] Apr 21 10:10:54.692193 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:10:54.692132 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/04d42108-edb7-4048-a9c0-de2634e6f84d-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-nxwp6\" (UID: \"04d42108-edb7-4048-a9c0-de2634e6f84d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-nxwp6" Apr 21 10:10:54.692307 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:10:54.692233 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk9jz\" (UniqueName: \"kubernetes.io/projected/04d42108-edb7-4048-a9c0-de2634e6f84d-kube-api-access-nk9jz\") pod \"cert-manager-operator-controller-manager-54b9655956-nxwp6\" (UID: \"04d42108-edb7-4048-a9c0-de2634e6f84d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-nxwp6" 
Apr 21 10:10:54.793067 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:10:54.793031 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/04d42108-edb7-4048-a9c0-de2634e6f84d-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-nxwp6\" (UID: \"04d42108-edb7-4048-a9c0-de2634e6f84d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-nxwp6"
Apr 21 10:10:54.793220 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:10:54.793095 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nk9jz\" (UniqueName: \"kubernetes.io/projected/04d42108-edb7-4048-a9c0-de2634e6f84d-kube-api-access-nk9jz\") pod \"cert-manager-operator-controller-manager-54b9655956-nxwp6\" (UID: \"04d42108-edb7-4048-a9c0-de2634e6f84d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-nxwp6"
Apr 21 10:10:54.793389 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:10:54.793369 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/04d42108-edb7-4048-a9c0-de2634e6f84d-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-nxwp6\" (UID: \"04d42108-edb7-4048-a9c0-de2634e6f84d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-nxwp6"
Apr 21 10:10:54.801662 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:10:54.801640 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk9jz\" (UniqueName: \"kubernetes.io/projected/04d42108-edb7-4048-a9c0-de2634e6f84d-kube-api-access-nk9jz\") pod \"cert-manager-operator-controller-manager-54b9655956-nxwp6\" (UID: \"04d42108-edb7-4048-a9c0-de2634e6f84d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-nxwp6"
Apr 21 10:10:54.896544 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:10:54.896483 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-nxwp6"
Apr 21 10:10:55.033419 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:10:55.033393 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-nxwp6"]
Apr 21 10:10:55.036438 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:10:55.036409 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04d42108_edb7_4048_a9c0_de2634e6f84d.slice/crio-dcd8a91035b7091c235be85a51c9d3be4e356ec906a0df5d77af021034bc9369 WatchSource:0}: Error finding container dcd8a91035b7091c235be85a51c9d3be4e356ec906a0df5d77af021034bc9369: Status 404 returned error can't find the container with id dcd8a91035b7091c235be85a51c9d3be4e356ec906a0df5d77af021034bc9369
Apr 21 10:10:55.053134 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:10:55.053112 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-nxwp6" event={"ID":"04d42108-edb7-4048-a9c0-de2634e6f84d","Type":"ContainerStarted","Data":"dcd8a91035b7091c235be85a51c9d3be4e356ec906a0df5d77af021034bc9369"}
Apr 21 10:10:58.065482 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:10:58.065438 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-nxwp6" event={"ID":"04d42108-edb7-4048-a9c0-de2634e6f84d","Type":"ContainerStarted","Data":"0268875d4163f3ca03524b3d719bdfeb1261b432c4952345f2dc75672670503a"}
Apr 21 10:10:58.086703 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:10:58.086650 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-nxwp6" podStartSLOduration=1.855217788 podStartE2EDuration="4.086629684s" podCreationTimestamp="2026-04-21 10:10:54 +0000 UTC" firstStartedPulling="2026-04-21 10:10:55.039808512 +0000 UTC m=+432.885003996" lastFinishedPulling="2026-04-21 10:10:57.271220404 +0000 UTC m=+435.116415892" observedRunningTime="2026-04-21 10:10:58.085327589 +0000 UTC m=+435.930523106" watchObservedRunningTime="2026-04-21 10:10:58.086629684 +0000 UTC m=+435.931825191"
Apr 21 10:11:16.861130 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:16.861100 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-j6xkb"]
Apr 21 10:11:16.865512 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:16.865496 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-j6xkb"
Apr 21 10:11:16.868425 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:16.868397 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-bjz94\""
Apr 21 10:11:16.868538 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:16.868447 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 21 10:11:16.869293 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:16.869256 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 21 10:11:16.873216 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:16.873196 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-j6xkb"]
Apr 21 10:11:16.962777 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:16.962756 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9dab3e7b-4b04-4a4d-8226-ec361f24b1c5-tmp\") pod \"openshift-lws-operator-bfc7f696d-j6xkb\" (UID: \"9dab3e7b-4b04-4a4d-8226-ec361f24b1c5\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-j6xkb"
Apr 21 10:11:16.962881 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:16.962801 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8fcw\" (UniqueName: \"kubernetes.io/projected/9dab3e7b-4b04-4a4d-8226-ec361f24b1c5-kube-api-access-b8fcw\") pod \"openshift-lws-operator-bfc7f696d-j6xkb\" (UID: \"9dab3e7b-4b04-4a4d-8226-ec361f24b1c5\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-j6xkb"
Apr 21 10:11:17.063577 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:17.063540 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9dab3e7b-4b04-4a4d-8226-ec361f24b1c5-tmp\") pod \"openshift-lws-operator-bfc7f696d-j6xkb\" (UID: \"9dab3e7b-4b04-4a4d-8226-ec361f24b1c5\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-j6xkb"
Apr 21 10:11:17.063695 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:17.063603 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8fcw\" (UniqueName: \"kubernetes.io/projected/9dab3e7b-4b04-4a4d-8226-ec361f24b1c5-kube-api-access-b8fcw\") pod \"openshift-lws-operator-bfc7f696d-j6xkb\" (UID: \"9dab3e7b-4b04-4a4d-8226-ec361f24b1c5\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-j6xkb"
Apr 21 10:11:17.063883 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:17.063865 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9dab3e7b-4b04-4a4d-8226-ec361f24b1c5-tmp\") pod \"openshift-lws-operator-bfc7f696d-j6xkb\" (UID: \"9dab3e7b-4b04-4a4d-8226-ec361f24b1c5\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-j6xkb"
Apr 21 10:11:17.073129 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:17.073108 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8fcw\" (UniqueName: \"kubernetes.io/projected/9dab3e7b-4b04-4a4d-8226-ec361f24b1c5-kube-api-access-b8fcw\") pod \"openshift-lws-operator-bfc7f696d-j6xkb\" (UID: \"9dab3e7b-4b04-4a4d-8226-ec361f24b1c5\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-j6xkb"
Apr 21 10:11:17.175383 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:17.175364 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-j6xkb"
Apr 21 10:11:17.295297 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:17.295274 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-j6xkb"]
Apr 21 10:11:17.297897 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:11:17.297869 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9dab3e7b_4b04_4a4d_8226_ec361f24b1c5.slice/crio-1c56254edd7fdb721217d3a4dab423ee4d15ca7d2ca4b7ed256dbd5a2792c481 WatchSource:0}: Error finding container 1c56254edd7fdb721217d3a4dab423ee4d15ca7d2ca4b7ed256dbd5a2792c481: Status 404 returned error can't find the container with id 1c56254edd7fdb721217d3a4dab423ee4d15ca7d2ca4b7ed256dbd5a2792c481
Apr 21 10:11:18.128812 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:18.128764 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-j6xkb" event={"ID":"9dab3e7b-4b04-4a4d-8226-ec361f24b1c5","Type":"ContainerStarted","Data":"1c56254edd7fdb721217d3a4dab423ee4d15ca7d2ca4b7ed256dbd5a2792c481"}
Apr 21 10:11:20.140838 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:20.140804 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-j6xkb" event={"ID":"9dab3e7b-4b04-4a4d-8226-ec361f24b1c5","Type":"ContainerStarted","Data":"4e2fcf3a0aeea84f920763cbaf2da261adfe2e5def8ad2cb75ce28a8df966c1d"}
Apr 21 10:11:20.160346 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:20.160300 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-j6xkb" podStartSLOduration=1.835003267 podStartE2EDuration="4.160285897s" podCreationTimestamp="2026-04-21 10:11:16 +0000 UTC" firstStartedPulling="2026-04-21 10:11:17.299159416 +0000 UTC m=+455.144354903" lastFinishedPulling="2026-04-21 10:11:19.624442046 +0000 UTC m=+457.469637533" observedRunningTime="2026-04-21 10:11:20.159686862 +0000 UTC m=+458.004882370" watchObservedRunningTime="2026-04-21 10:11:20.160285897 +0000 UTC m=+458.005481403"
Apr 21 10:11:50.442969 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:50.442931 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-lfmf5"]
Apr 21 10:11:50.447028 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:50.447004 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-lfmf5"
Apr 21 10:11:50.449617 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:50.449591 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Apr 21 10:11:50.449734 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:50.449649 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Apr 21 10:11:50.449734 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:50.449700 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-8stwx\""
Apr 21 10:11:50.458191 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:50.458146 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-lfmf5"]
Apr 21 10:11:50.510501 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:50.510474 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/59d1e727-361b-410c-a599-7ce374f53099-operator-config\") pod \"servicemesh-operator3-55f49c5f94-lfmf5\" (UID: \"59d1e727-361b-410c-a599-7ce374f53099\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-lfmf5"
Apr 21 10:11:50.510604 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:50.510509 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwf7s\" (UniqueName: \"kubernetes.io/projected/59d1e727-361b-410c-a599-7ce374f53099-kube-api-access-lwf7s\") pod \"servicemesh-operator3-55f49c5f94-lfmf5\" (UID: \"59d1e727-361b-410c-a599-7ce374f53099\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-lfmf5"
Apr 21 10:11:50.611575 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:50.611554 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/59d1e727-361b-410c-a599-7ce374f53099-operator-config\") pod \"servicemesh-operator3-55f49c5f94-lfmf5\" (UID: \"59d1e727-361b-410c-a599-7ce374f53099\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-lfmf5"
Apr 21 10:11:50.611685 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:50.611585 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwf7s\" (UniqueName: \"kubernetes.io/projected/59d1e727-361b-410c-a599-7ce374f53099-kube-api-access-lwf7s\") pod \"servicemesh-operator3-55f49c5f94-lfmf5\" (UID: \"59d1e727-361b-410c-a599-7ce374f53099\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-lfmf5"
Apr 21 10:11:50.614057 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:50.614039 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/59d1e727-361b-410c-a599-7ce374f53099-operator-config\") pod \"servicemesh-operator3-55f49c5f94-lfmf5\" (UID: \"59d1e727-361b-410c-a599-7ce374f53099\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-lfmf5"
Apr 21 10:11:50.621139 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:50.621116 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwf7s\" (UniqueName: \"kubernetes.io/projected/59d1e727-361b-410c-a599-7ce374f53099-kube-api-access-lwf7s\") pod \"servicemesh-operator3-55f49c5f94-lfmf5\" (UID: \"59d1e727-361b-410c-a599-7ce374f53099\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-lfmf5"
Apr 21 10:11:50.757302 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:50.757230 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-lfmf5"
Apr 21 10:11:50.890874 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:50.890853 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-lfmf5"]
Apr 21 10:11:50.893211 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:11:50.893184 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59d1e727_361b_410c_a599_7ce374f53099.slice/crio-ccdc02e822b8418507077298377436e0278012696a4d97039543393618e95f1d WatchSource:0}: Error finding container ccdc02e822b8418507077298377436e0278012696a4d97039543393618e95f1d: Status 404 returned error can't find the container with id ccdc02e822b8418507077298377436e0278012696a4d97039543393618e95f1d
Apr 21 10:11:51.240884 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:11:51.240854 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-lfmf5" event={"ID":"59d1e727-361b-410c-a599-7ce374f53099","Type":"ContainerStarted","Data":"ccdc02e822b8418507077298377436e0278012696a4d97039543393618e95f1d"}
Apr 21 10:12:07.300340 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.300293 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-lfmf5" event={"ID":"59d1e727-361b-410c-a599-7ce374f53099","Type":"ContainerStarted","Data":"fc0f22f319007959490b20162eea7c194cb67cf423aaae4f46a908dd590b75d9"}
Apr 21 10:12:07.300758 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.300418 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-lfmf5"
Apr 21 10:12:07.322983 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.322929 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-lfmf5" podStartSLOduration=1.6468673539999998 podStartE2EDuration="17.322912539s" podCreationTimestamp="2026-04-21 10:11:50 +0000 UTC" firstStartedPulling="2026-04-21 10:11:50.895605862 +0000 UTC m=+488.740801352" lastFinishedPulling="2026-04-21 10:12:06.571651054 +0000 UTC m=+504.416846537" observedRunningTime="2026-04-21 10:12:07.319213579 +0000 UTC m=+505.164409085" watchObservedRunningTime="2026-04-21 10:12:07.322912539 +0000 UTC m=+505.168108047"
Apr 21 10:12:07.526393 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.526361 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4"]
Apr 21 10:12:07.530278 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.530255 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4"
Apr 21 10:12:07.532733 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.532710 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 21 10:12:07.533108 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.533088 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 21 10:12:07.533108 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.533096 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 21 10:12:07.533466 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.533449 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 21 10:12:07.533545 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.533470 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 21 10:12:07.533545 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.533450 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 21 10:12:07.533774 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.533754 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-bn6z2\""
Apr 21 10:12:07.545698 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.545679 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4"]
Apr 21 10:12:07.643674 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.643613 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/71ff2aa4-7476-46c1-ba0a-8a36acc8bc81-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-svbn4\" (UID: \"71ff2aa4-7476-46c1-ba0a-8a36acc8bc81\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4"
Apr 21 10:12:07.643674 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.643656 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4649\" (UniqueName: \"kubernetes.io/projected/71ff2aa4-7476-46c1-ba0a-8a36acc8bc81-kube-api-access-f4649\") pod \"istiod-openshift-gateway-7cd77c7ffd-svbn4\" (UID: \"71ff2aa4-7476-46c1-ba0a-8a36acc8bc81\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4"
Apr 21 10:12:07.643809 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.643722 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/71ff2aa4-7476-46c1-ba0a-8a36acc8bc81-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-svbn4\" (UID: \"71ff2aa4-7476-46c1-ba0a-8a36acc8bc81\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4"
Apr 21 10:12:07.643809 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.643756 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/71ff2aa4-7476-46c1-ba0a-8a36acc8bc81-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-svbn4\" (UID: \"71ff2aa4-7476-46c1-ba0a-8a36acc8bc81\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4"
Apr 21 10:12:07.643809 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.643778 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/71ff2aa4-7476-46c1-ba0a-8a36acc8bc81-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-svbn4\" (UID: \"71ff2aa4-7476-46c1-ba0a-8a36acc8bc81\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4"
Apr 21 10:12:07.643906 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.643855 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/71ff2aa4-7476-46c1-ba0a-8a36acc8bc81-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-svbn4\" (UID: \"71ff2aa4-7476-46c1-ba0a-8a36acc8bc81\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4"
Apr 21 10:12:07.643906 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.643892 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/71ff2aa4-7476-46c1-ba0a-8a36acc8bc81-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-svbn4\" (UID: \"71ff2aa4-7476-46c1-ba0a-8a36acc8bc81\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4"
Apr 21 10:12:07.745182 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.745147 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/71ff2aa4-7476-46c1-ba0a-8a36acc8bc81-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-svbn4\" (UID: \"71ff2aa4-7476-46c1-ba0a-8a36acc8bc81\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4"
Apr 21 10:12:07.745343 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.745215 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/71ff2aa4-7476-46c1-ba0a-8a36acc8bc81-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-svbn4\" (UID: \"71ff2aa4-7476-46c1-ba0a-8a36acc8bc81\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4"
Apr 21 10:12:07.745343 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.745241 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/71ff2aa4-7476-46c1-ba0a-8a36acc8bc81-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-svbn4\" (UID: \"71ff2aa4-7476-46c1-ba0a-8a36acc8bc81\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4"
Apr 21 10:12:07.745343 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.745285 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/71ff2aa4-7476-46c1-ba0a-8a36acc8bc81-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-svbn4\" (UID: \"71ff2aa4-7476-46c1-ba0a-8a36acc8bc81\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4"
Apr 21 10:12:07.745343 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.745331 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/71ff2aa4-7476-46c1-ba0a-8a36acc8bc81-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-svbn4\" (UID: \"71ff2aa4-7476-46c1-ba0a-8a36acc8bc81\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4"
Apr 21 10:12:07.745529 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.745358 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/71ff2aa4-7476-46c1-ba0a-8a36acc8bc81-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-svbn4\" (UID: \"71ff2aa4-7476-46c1-ba0a-8a36acc8bc81\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4"
Apr 21 10:12:07.745529 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.745394 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f4649\" (UniqueName: \"kubernetes.io/projected/71ff2aa4-7476-46c1-ba0a-8a36acc8bc81-kube-api-access-f4649\") pod \"istiod-openshift-gateway-7cd77c7ffd-svbn4\" (UID: \"71ff2aa4-7476-46c1-ba0a-8a36acc8bc81\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4"
Apr 21 10:12:07.745977 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.745949 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/71ff2aa4-7476-46c1-ba0a-8a36acc8bc81-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-svbn4\" (UID: \"71ff2aa4-7476-46c1-ba0a-8a36acc8bc81\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4"
Apr 21 10:12:07.747578 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.747545 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/71ff2aa4-7476-46c1-ba0a-8a36acc8bc81-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-svbn4\" (UID: \"71ff2aa4-7476-46c1-ba0a-8a36acc8bc81\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4"
Apr 21 10:12:07.747827 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.747807 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/71ff2aa4-7476-46c1-ba0a-8a36acc8bc81-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-svbn4\" (UID: \"71ff2aa4-7476-46c1-ba0a-8a36acc8bc81\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4"
Apr 21 10:12:07.747910 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.747894 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/71ff2aa4-7476-46c1-ba0a-8a36acc8bc81-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-svbn4\" (UID: \"71ff2aa4-7476-46c1-ba0a-8a36acc8bc81\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4"
Apr 21 10:12:07.748124 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.748102 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/71ff2aa4-7476-46c1-ba0a-8a36acc8bc81-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-svbn4\" (UID: \"71ff2aa4-7476-46c1-ba0a-8a36acc8bc81\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4"
Apr 21 10:12:07.753037 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.753011 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/71ff2aa4-7476-46c1-ba0a-8a36acc8bc81-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-svbn4\" (UID: \"71ff2aa4-7476-46c1-ba0a-8a36acc8bc81\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4"
Apr 21 10:12:07.753649 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.753627 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4649\" (UniqueName: \"kubernetes.io/projected/71ff2aa4-7476-46c1-ba0a-8a36acc8bc81-kube-api-access-f4649\") pod \"istiod-openshift-gateway-7cd77c7ffd-svbn4\" (UID: \"71ff2aa4-7476-46c1-ba0a-8a36acc8bc81\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4"
Apr 21 10:12:07.840370 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.840349 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4"
Apr 21 10:12:07.976080 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:07.976009 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4"]
Apr 21 10:12:07.978988 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:12:07.978957 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71ff2aa4_7476_46c1_ba0a_8a36acc8bc81.slice/crio-3638ffa5b9806571a9f1c9389bce3e1b077b5e267195f74d0cea99dc410b85b3 WatchSource:0}: Error finding container 3638ffa5b9806571a9f1c9389bce3e1b077b5e267195f74d0cea99dc410b85b3: Status 404 returned error can't find the container with id 3638ffa5b9806571a9f1c9389bce3e1b077b5e267195f74d0cea99dc410b85b3
Apr 21 10:12:08.304714 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:08.304679 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4" event={"ID":"71ff2aa4-7476-46c1-ba0a-8a36acc8bc81","Type":"ContainerStarted","Data":"3638ffa5b9806571a9f1c9389bce3e1b077b5e267195f74d0cea99dc410b85b3"}
Apr 21 10:12:10.331060 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:10.331023 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 21 10:12:10.331395 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:10.331088 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 21 10:12:11.316463 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:11.316421 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4" event={"ID":"71ff2aa4-7476-46c1-ba0a-8a36acc8bc81","Type":"ContainerStarted","Data":"74fd4af6801737b77a3c0e6178ab59858bed604edaabb139f6eaf2c97813d55f"}
Apr 21 10:12:11.317483 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:11.317455 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4"
Apr 21 10:12:11.319253 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:11.319234 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4"
Apr 21 10:12:11.337757 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:11.337709 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-svbn4" podStartSLOduration=1.988314714 podStartE2EDuration="4.337695996s" podCreationTimestamp="2026-04-21 10:12:07 +0000 UTC" firstStartedPulling="2026-04-21 10:12:07.981417246 +0000 UTC m=+505.826612729" lastFinishedPulling="2026-04-21 10:12:10.330798521 +0000 UTC m=+508.175994011" observedRunningTime="2026-04-21 10:12:11.335950712 +0000 UTC m=+509.181146219" watchObservedRunningTime="2026-04-21 10:12:11.337695996 +0000 UTC m=+509.182891502"
Apr 21 10:12:13.718567 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:13.718532 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-59776c4dff-zcnwx"]
Apr 21 10:12:13.722184 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:13.722149 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59776c4dff-zcnwx"
Apr 21 10:12:13.724591 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:13.724564 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 21 10:12:13.725251 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:13.725235 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 21 10:12:13.725627 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:13.725610 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 21 10:12:13.726387 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:13.726367 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 21 10:12:13.726477 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:13.726438 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-2kpnx\""
Apr 21 10:12:13.726477 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:13.726453 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 21 10:12:13.726597 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:13.726522 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 21 10:12:13.726897 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:13.726881 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 21 10:12:13.729846 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:13.729825 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 21 10:12:13.733963 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:13.733941 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59776c4dff-zcnwx"]
Apr 21 10:12:13.905043 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:13.905014 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/52e96a96-11c1-4086-96f6-904e0bc06745-console-serving-cert\") pod \"console-59776c4dff-zcnwx\" (UID: \"52e96a96-11c1-4086-96f6-904e0bc06745\") " pod="openshift-console/console-59776c4dff-zcnwx"
Apr 21 10:12:13.905043 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:13.905044 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/52e96a96-11c1-4086-96f6-904e0bc06745-service-ca\") pod \"console-59776c4dff-zcnwx\" (UID: \"52e96a96-11c1-4086-96f6-904e0bc06745\") " pod="openshift-console/console-59776c4dff-zcnwx"
Apr 21 10:12:13.905261 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:13.905066 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52e96a96-11c1-4086-96f6-904e0bc06745-trusted-ca-bundle\") pod \"console-59776c4dff-zcnwx\" (UID: \"52e96a96-11c1-4086-96f6-904e0bc06745\") " pod="openshift-console/console-59776c4dff-zcnwx"
Apr 21 10:12:13.905261 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:13.905140 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/52e96a96-11c1-4086-96f6-904e0bc06745-console-oauth-config\") pod \"console-59776c4dff-zcnwx\" (UID: \"52e96a96-11c1-4086-96f6-904e0bc06745\") " pod="openshift-console/console-59776c4dff-zcnwx"
Apr 21 10:12:13.905261 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:13.905188 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/52e96a96-11c1-4086-96f6-904e0bc06745-oauth-serving-cert\") pod \"console-59776c4dff-zcnwx\" (UID: \"52e96a96-11c1-4086-96f6-904e0bc06745\") " pod="openshift-console/console-59776c4dff-zcnwx"
Apr 21 10:12:13.905261 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:13.905225 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/52e96a96-11c1-4086-96f6-904e0bc06745-console-config\") pod \"console-59776c4dff-zcnwx\" (UID: \"52e96a96-11c1-4086-96f6-904e0bc06745\") " pod="openshift-console/console-59776c4dff-zcnwx"
Apr 21 10:12:13.905261 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:13.905257 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv248\" (UniqueName: \"kubernetes.io/projected/52e96a96-11c1-4086-96f6-904e0bc06745-kube-api-access-fv248\") pod \"console-59776c4dff-zcnwx\" (UID: \"52e96a96-11c1-4086-96f6-904e0bc06745\") " pod="openshift-console/console-59776c4dff-zcnwx"
Apr 21 10:12:14.005710 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:14.005642 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/52e96a96-11c1-4086-96f6-904e0bc06745-console-oauth-config\") pod \"console-59776c4dff-zcnwx\" (UID: \"52e96a96-11c1-4086-96f6-904e0bc06745\") " pod="openshift-console/console-59776c4dff-zcnwx"
Apr 21 10:12:14.005710 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:14.005668 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/52e96a96-11c1-4086-96f6-904e0bc06745-oauth-serving-cert\") pod \"console-59776c4dff-zcnwx\" (UID: \"52e96a96-11c1-4086-96f6-904e0bc06745\") " 
pod="openshift-console/console-59776c4dff-zcnwx" Apr 21 10:12:14.005710 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:14.005695 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/52e96a96-11c1-4086-96f6-904e0bc06745-console-config\") pod \"console-59776c4dff-zcnwx\" (UID: \"52e96a96-11c1-4086-96f6-904e0bc06745\") " pod="openshift-console/console-59776c4dff-zcnwx" Apr 21 10:12:14.005710 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:14.005712 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fv248\" (UniqueName: \"kubernetes.io/projected/52e96a96-11c1-4086-96f6-904e0bc06745-kube-api-access-fv248\") pod \"console-59776c4dff-zcnwx\" (UID: \"52e96a96-11c1-4086-96f6-904e0bc06745\") " pod="openshift-console/console-59776c4dff-zcnwx" Apr 21 10:12:14.005992 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:14.005751 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/52e96a96-11c1-4086-96f6-904e0bc06745-console-serving-cert\") pod \"console-59776c4dff-zcnwx\" (UID: \"52e96a96-11c1-4086-96f6-904e0bc06745\") " pod="openshift-console/console-59776c4dff-zcnwx" Apr 21 10:12:14.005992 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:14.005767 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/52e96a96-11c1-4086-96f6-904e0bc06745-service-ca\") pod \"console-59776c4dff-zcnwx\" (UID: \"52e96a96-11c1-4086-96f6-904e0bc06745\") " pod="openshift-console/console-59776c4dff-zcnwx" Apr 21 10:12:14.005992 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:14.005792 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52e96a96-11c1-4086-96f6-904e0bc06745-trusted-ca-bundle\") pod 
\"console-59776c4dff-zcnwx\" (UID: \"52e96a96-11c1-4086-96f6-904e0bc06745\") " pod="openshift-console/console-59776c4dff-zcnwx" Apr 21 10:12:14.006467 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:14.006443 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/52e96a96-11c1-4086-96f6-904e0bc06745-console-config\") pod \"console-59776c4dff-zcnwx\" (UID: \"52e96a96-11c1-4086-96f6-904e0bc06745\") " pod="openshift-console/console-59776c4dff-zcnwx" Apr 21 10:12:14.006593 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:14.006552 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/52e96a96-11c1-4086-96f6-904e0bc06745-oauth-serving-cert\") pod \"console-59776c4dff-zcnwx\" (UID: \"52e96a96-11c1-4086-96f6-904e0bc06745\") " pod="openshift-console/console-59776c4dff-zcnwx" Apr 21 10:12:14.006660 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:14.006594 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/52e96a96-11c1-4086-96f6-904e0bc06745-service-ca\") pod \"console-59776c4dff-zcnwx\" (UID: \"52e96a96-11c1-4086-96f6-904e0bc06745\") " pod="openshift-console/console-59776c4dff-zcnwx" Apr 21 10:12:14.006719 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:14.006662 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52e96a96-11c1-4086-96f6-904e0bc06745-trusted-ca-bundle\") pod \"console-59776c4dff-zcnwx\" (UID: \"52e96a96-11c1-4086-96f6-904e0bc06745\") " pod="openshift-console/console-59776c4dff-zcnwx" Apr 21 10:12:14.008289 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:14.008263 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/52e96a96-11c1-4086-96f6-904e0bc06745-console-oauth-config\") pod \"console-59776c4dff-zcnwx\" (UID: \"52e96a96-11c1-4086-96f6-904e0bc06745\") " pod="openshift-console/console-59776c4dff-zcnwx" Apr 21 10:12:14.008413 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:14.008395 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/52e96a96-11c1-4086-96f6-904e0bc06745-console-serving-cert\") pod \"console-59776c4dff-zcnwx\" (UID: \"52e96a96-11c1-4086-96f6-904e0bc06745\") " pod="openshift-console/console-59776c4dff-zcnwx" Apr 21 10:12:14.013361 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:14.013340 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv248\" (UniqueName: \"kubernetes.io/projected/52e96a96-11c1-4086-96f6-904e0bc06745-kube-api-access-fv248\") pod \"console-59776c4dff-zcnwx\" (UID: \"52e96a96-11c1-4086-96f6-904e0bc06745\") " pod="openshift-console/console-59776c4dff-zcnwx" Apr 21 10:12:14.035363 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:14.035342 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59776c4dff-zcnwx" Apr 21 10:12:14.163666 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:14.163598 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59776c4dff-zcnwx"] Apr 21 10:12:14.170278 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:12:14.170243 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52e96a96_11c1_4086_96f6_904e0bc06745.slice/crio-25fc2c45054b11467ff236f642806a15606535773a8abcf246db309ce8a7ea9e WatchSource:0}: Error finding container 25fc2c45054b11467ff236f642806a15606535773a8abcf246db309ce8a7ea9e: Status 404 returned error can't find the container with id 25fc2c45054b11467ff236f642806a15606535773a8abcf246db309ce8a7ea9e Apr 21 10:12:14.329269 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:14.329180 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59776c4dff-zcnwx" event={"ID":"52e96a96-11c1-4086-96f6-904e0bc06745","Type":"ContainerStarted","Data":"f5baef91371ad906127c470d5cb7436beeef45d1293a1f51aa915bc7dfd70af1"} Apr 21 10:12:14.329269 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:14.329219 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59776c4dff-zcnwx" event={"ID":"52e96a96-11c1-4086-96f6-904e0bc06745","Type":"ContainerStarted","Data":"25fc2c45054b11467ff236f642806a15606535773a8abcf246db309ce8a7ea9e"} Apr 21 10:12:14.347252 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:14.347204 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-59776c4dff-zcnwx" podStartSLOduration=1.34719081 podStartE2EDuration="1.34719081s" podCreationTimestamp="2026-04-21 10:12:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:12:14.345195719 +0000 UTC m=+512.190391222" 
watchObservedRunningTime="2026-04-21 10:12:14.34719081 +0000 UTC m=+512.192386316" Apr 21 10:12:18.307255 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:18.307225 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-lfmf5" Apr 21 10:12:24.036482 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:24.036451 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-59776c4dff-zcnwx" Apr 21 10:12:24.036965 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:24.036491 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-59776c4dff-zcnwx" Apr 21 10:12:24.041004 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:24.040980 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-59776c4dff-zcnwx" Apr 21 10:12:24.369746 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:24.369674 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-59776c4dff-zcnwx" Apr 21 10:12:35.483882 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:35.483849 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-gslrj"] Apr 21 10:12:35.578030 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:35.578000 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-gslrj"] Apr 21 10:12:35.578201 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:35.578101 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-gslrj" Apr 21 10:12:35.581809 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:35.581787 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-4zvrb\"" Apr 21 10:12:35.582126 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:35.582104 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 21 10:12:35.582237 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:35.582104 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 21 10:12:35.667353 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:35.667324 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4qsm\" (UniqueName: \"kubernetes.io/projected/2a5f073c-57eb-4d04-8f10-84eb7a15d1bd-kube-api-access-d4qsm\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-gslrj\" (UID: \"2a5f073c-57eb-4d04-8f10-84eb7a15d1bd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-gslrj" Apr 21 10:12:35.667481 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:35.667361 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2a5f073c-57eb-4d04-8f10-84eb7a15d1bd-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-gslrj\" (UID: \"2a5f073c-57eb-4d04-8f10-84eb7a15d1bd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-gslrj" Apr 21 10:12:35.768613 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:35.768547 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4qsm\" (UniqueName: 
\"kubernetes.io/projected/2a5f073c-57eb-4d04-8f10-84eb7a15d1bd-kube-api-access-d4qsm\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-gslrj\" (UID: \"2a5f073c-57eb-4d04-8f10-84eb7a15d1bd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-gslrj" Apr 21 10:12:35.768613 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:35.768580 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2a5f073c-57eb-4d04-8f10-84eb7a15d1bd-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-gslrj\" (UID: \"2a5f073c-57eb-4d04-8f10-84eb7a15d1bd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-gslrj" Apr 21 10:12:35.768867 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:35.768850 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2a5f073c-57eb-4d04-8f10-84eb7a15d1bd-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-gslrj\" (UID: \"2a5f073c-57eb-4d04-8f10-84eb7a15d1bd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-gslrj" Apr 21 10:12:35.778866 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:35.778838 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4qsm\" (UniqueName: \"kubernetes.io/projected/2a5f073c-57eb-4d04-8f10-84eb7a15d1bd-kube-api-access-d4qsm\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-gslrj\" (UID: \"2a5f073c-57eb-4d04-8f10-84eb7a15d1bd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-gslrj" Apr 21 10:12:35.888783 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:35.888761 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-gslrj" Apr 21 10:12:36.012975 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:36.012945 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-gslrj"] Apr 21 10:12:36.016396 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:12:36.016367 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a5f073c_57eb_4d04_8f10_84eb7a15d1bd.slice/crio-46dc64cb2a5eb07315b9d036c74f298ada0ef171b2be36a287ac58c04933471d WatchSource:0}: Error finding container 46dc64cb2a5eb07315b9d036c74f298ada0ef171b2be36a287ac58c04933471d: Status 404 returned error can't find the container with id 46dc64cb2a5eb07315b9d036c74f298ada0ef171b2be36a287ac58c04933471d Apr 21 10:12:36.411298 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:36.411270 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-gslrj" event={"ID":"2a5f073c-57eb-4d04-8f10-84eb7a15d1bd","Type":"ContainerStarted","Data":"46dc64cb2a5eb07315b9d036c74f298ada0ef171b2be36a287ac58c04933471d"} Apr 21 10:12:37.522117 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:37.522083 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-zwsnj"] Apr 21 10:12:37.525384 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:37.525316 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-zwsnj" Apr 21 10:12:37.528373 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:37.528349 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 21 10:12:37.528516 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:37.528349 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-f2fpt\"" Apr 21 10:12:37.542319 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:37.542275 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-zwsnj"] Apr 21 10:12:37.581732 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:37.581656 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fptw\" (UniqueName: \"kubernetes.io/projected/22664c87-830d-4ca8-af81-ef97e250f1d0-kube-api-access-7fptw\") pod \"dns-operator-controller-manager-844548ff4c-zwsnj\" (UID: \"22664c87-830d-4ca8-af81-ef97e250f1d0\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-zwsnj" Apr 21 10:12:37.682517 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:37.682479 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fptw\" (UniqueName: \"kubernetes.io/projected/22664c87-830d-4ca8-af81-ef97e250f1d0-kube-api-access-7fptw\") pod \"dns-operator-controller-manager-844548ff4c-zwsnj\" (UID: \"22664c87-830d-4ca8-af81-ef97e250f1d0\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-zwsnj" Apr 21 10:12:37.694128 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:37.694100 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fptw\" (UniqueName: \"kubernetes.io/projected/22664c87-830d-4ca8-af81-ef97e250f1d0-kube-api-access-7fptw\") pod 
\"dns-operator-controller-manager-844548ff4c-zwsnj\" (UID: \"22664c87-830d-4ca8-af81-ef97e250f1d0\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-zwsnj" Apr 21 10:12:37.840898 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:37.840816 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-zwsnj" Apr 21 10:12:38.516312 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:38.516280 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-zwsnj"] Apr 21 10:12:38.519794 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:12:38.519759 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22664c87_830d_4ca8_af81_ef97e250f1d0.slice/crio-0f4e4f2d0b2d819244da9041da415e8deed5f6425bbd4843cb6c7c2b9bf47662 WatchSource:0}: Error finding container 0f4e4f2d0b2d819244da9041da415e8deed5f6425bbd4843cb6c7c2b9bf47662: Status 404 returned error can't find the container with id 0f4e4f2d0b2d819244da9041da415e8deed5f6425bbd4843cb6c7c2b9bf47662 Apr 21 10:12:39.425309 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:39.425265 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-zwsnj" event={"ID":"22664c87-830d-4ca8-af81-ef97e250f1d0","Type":"ContainerStarted","Data":"0f4e4f2d0b2d819244da9041da415e8deed5f6425bbd4843cb6c7c2b9bf47662"} Apr 21 10:12:40.372058 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:40.371979 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-5drwh"] Apr 21 10:12:40.375915 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:40.375890 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-5drwh" Apr 21 10:12:40.378889 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:40.378842 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-2mqqw\"" Apr 21 10:12:40.387771 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:40.387746 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-5drwh"] Apr 21 10:12:40.407560 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:40.407534 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbmfn\" (UniqueName: \"kubernetes.io/projected/7388e221-d1ce-47cc-89d5-c292c3ea48c6-kube-api-access-kbmfn\") pod \"authorino-operator-7587b89b76-5drwh\" (UID: \"7388e221-d1ce-47cc-89d5-c292c3ea48c6\") " pod="kuadrant-system/authorino-operator-7587b89b76-5drwh" Apr 21 10:12:40.433517 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:40.433485 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-gslrj" event={"ID":"2a5f073c-57eb-4d04-8f10-84eb7a15d1bd","Type":"ContainerStarted","Data":"81f5ef5be149e7abfd3ad71be358895c0399af3bfcc9b1062bdaa85727e64084"} Apr 21 10:12:40.433898 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:40.433800 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-gslrj" Apr 21 10:12:40.463525 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:40.463474 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-gslrj" podStartSLOduration=1.393170489 podStartE2EDuration="5.46345782s" podCreationTimestamp="2026-04-21 10:12:35 +0000 UTC" firstStartedPulling="2026-04-21 10:12:36.018764913 +0000 UTC m=+533.863960402" 
lastFinishedPulling="2026-04-21 10:12:40.089052245 +0000 UTC m=+537.934247733" observedRunningTime="2026-04-21 10:12:40.460570446 +0000 UTC m=+538.305765953" watchObservedRunningTime="2026-04-21 10:12:40.46345782 +0000 UTC m=+538.308653327" Apr 21 10:12:40.508195 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:40.508151 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbmfn\" (UniqueName: \"kubernetes.io/projected/7388e221-d1ce-47cc-89d5-c292c3ea48c6-kube-api-access-kbmfn\") pod \"authorino-operator-7587b89b76-5drwh\" (UID: \"7388e221-d1ce-47cc-89d5-c292c3ea48c6\") " pod="kuadrant-system/authorino-operator-7587b89b76-5drwh" Apr 21 10:12:40.524886 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:40.524849 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbmfn\" (UniqueName: \"kubernetes.io/projected/7388e221-d1ce-47cc-89d5-c292c3ea48c6-kube-api-access-kbmfn\") pod \"authorino-operator-7587b89b76-5drwh\" (UID: \"7388e221-d1ce-47cc-89d5-c292c3ea48c6\") " pod="kuadrant-system/authorino-operator-7587b89b76-5drwh" Apr 21 10:12:40.690589 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:40.690557 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-5drwh" Apr 21 10:12:40.861115 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:40.861089 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-5drwh"] Apr 21 10:12:41.438712 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:41.438673 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-5drwh" event={"ID":"7388e221-d1ce-47cc-89d5-c292c3ea48c6","Type":"ContainerStarted","Data":"e02d5e4f0a088ab45e59b6c1b566010ee67eec0bb13a286ba86280984d5b11a8"} Apr 21 10:12:42.445588 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:42.445546 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-zwsnj" event={"ID":"22664c87-830d-4ca8-af81-ef97e250f1d0","Type":"ContainerStarted","Data":"1e5506d7aeb2f64112141299b71c43a696ce4974fc214139b14a56cbd2d7f4b9"} Apr 21 10:12:42.446033 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:42.445689 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-zwsnj" Apr 21 10:12:42.478005 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:42.477960 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-zwsnj" podStartSLOduration=1.9552949339999999 podStartE2EDuration="5.477945237s" podCreationTimestamp="2026-04-21 10:12:37 +0000 UTC" firstStartedPulling="2026-04-21 10:12:38.522342713 +0000 UTC m=+536.367538202" lastFinishedPulling="2026-04-21 10:12:42.044993018 +0000 UTC m=+539.890188505" observedRunningTime="2026-04-21 10:12:42.477410343 +0000 UTC m=+540.322605851" watchObservedRunningTime="2026-04-21 10:12:42.477945237 +0000 UTC m=+540.323140743" Apr 21 10:12:43.456988 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:43.456951 2567 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-5drwh" event={"ID":"7388e221-d1ce-47cc-89d5-c292c3ea48c6","Type":"ContainerStarted","Data":"f952c9e5e69dc15f7ed0e6e11f155b3f9b86f15ab5cdac0bf0f369457a82575a"} Apr 21 10:12:43.457409 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:43.457267 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-5drwh" Apr 21 10:12:43.482229 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:43.482162 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-5drwh" podStartSLOduration=1.564521181 podStartE2EDuration="3.482148777s" podCreationTimestamp="2026-04-21 10:12:40 +0000 UTC" firstStartedPulling="2026-04-21 10:12:40.86808249 +0000 UTC m=+538.713277987" lastFinishedPulling="2026-04-21 10:12:42.785710099 +0000 UTC m=+540.630905583" observedRunningTime="2026-04-21 10:12:43.480637669 +0000 UTC m=+541.325833177" watchObservedRunningTime="2026-04-21 10:12:43.482148777 +0000 UTC m=+541.327344282" Apr 21 10:12:51.441081 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:51.441052 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-gslrj" Apr 21 10:12:53.459267 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:53.459235 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-zwsnj" Apr 21 10:12:54.462857 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:12:54.462822 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-5drwh" Apr 21 10:13:42.672882 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:13:42.672851 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tqctk_2bdbce72-3342-4c5d-9e2f-6757d506d268/ovn-acl-logging/0.log"
Apr 21 10:13:42.673966 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:13:42.673942 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tqctk_2bdbce72-3342-4c5d-9e2f-6757d506d268/ovn-acl-logging/0.log"
Apr 21 10:18:42.703581 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:18:42.703553 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tqctk_2bdbce72-3342-4c5d-9e2f-6757d506d268/ovn-acl-logging/0.log"
Apr 21 10:18:42.704790 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:18:42.704764 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tqctk_2bdbce72-3342-4c5d-9e2f-6757d506d268/ovn-acl-logging/0.log"
Apr 21 10:23:42.728380 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:23:42.728356 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tqctk_2bdbce72-3342-4c5d-9e2f-6757d506d268/ovn-acl-logging/0.log"
Apr 21 10:23:42.730328 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:23:42.730306 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tqctk_2bdbce72-3342-4c5d-9e2f-6757d506d268/ovn-acl-logging/0.log"
Apr 21 10:23:44.348780 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:23:44.348753 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-gslrj_2a5f073c-57eb-4d04-8f10-84eb7a15d1bd/manager/0.log"
Apr 21 10:24:01.887760 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:01.887731 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-7cd77c7ffd-svbn4_71ff2aa4-7476-46c1-ba0a-8a36acc8bc81/discovery/0.log"
Apr 21 10:24:02.523016 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:02.522983 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-5drwh_7388e221-d1ce-47cc-89d5-c292c3ea48c6/manager/0.log"
Apr 21 10:24:02.535975 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:02.535947 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-zwsnj_22664c87-830d-4ca8-af81-ef97e250f1d0/manager/0.log"
Apr 21 10:24:02.580644 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:02.580620 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-gslrj_2a5f073c-57eb-4d04-8f10-84eb7a15d1bd/manager/0.log"
Apr 21 10:24:06.786459 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:06.786422 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f55k6/must-gather-4czfd"]
Apr 21 10:24:06.789903 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:06.789884 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f55k6/must-gather-4czfd"
Apr 21 10:24:06.792255 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:06.792233 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-f55k6\"/\"kube-root-ca.crt\""
Apr 21 10:24:06.793140 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:06.793123 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-f55k6\"/\"default-dockercfg-nx89h\""
Apr 21 10:24:06.793242 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:06.793224 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-f55k6\"/\"openshift-service-ca.crt\""
Apr 21 10:24:06.796776 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:06.796751 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f55k6/must-gather-4czfd"]
Apr 21 10:24:06.838745 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:06.838721 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2eff8400-dd80-4927-b447-eab619b9c0b6-must-gather-output\") pod \"must-gather-4czfd\" (UID: \"2eff8400-dd80-4927-b447-eab619b9c0b6\") " pod="openshift-must-gather-f55k6/must-gather-4czfd"
Apr 21 10:24:06.838843 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:06.838768 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brxmx\" (UniqueName: \"kubernetes.io/projected/2eff8400-dd80-4927-b447-eab619b9c0b6-kube-api-access-brxmx\") pod \"must-gather-4czfd\" (UID: \"2eff8400-dd80-4927-b447-eab619b9c0b6\") " pod="openshift-must-gather-f55k6/must-gather-4czfd"
Apr 21 10:24:06.939493 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:06.939470 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2eff8400-dd80-4927-b447-eab619b9c0b6-must-gather-output\") pod \"must-gather-4czfd\" (UID: \"2eff8400-dd80-4927-b447-eab619b9c0b6\") " pod="openshift-must-gather-f55k6/must-gather-4czfd"
Apr 21 10:24:06.939669 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:06.939514 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brxmx\" (UniqueName: \"kubernetes.io/projected/2eff8400-dd80-4927-b447-eab619b9c0b6-kube-api-access-brxmx\") pod \"must-gather-4czfd\" (UID: \"2eff8400-dd80-4927-b447-eab619b9c0b6\") " pod="openshift-must-gather-f55k6/must-gather-4czfd"
Apr 21 10:24:06.939802 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:06.939785 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2eff8400-dd80-4927-b447-eab619b9c0b6-must-gather-output\") pod \"must-gather-4czfd\" (UID: \"2eff8400-dd80-4927-b447-eab619b9c0b6\") " pod="openshift-must-gather-f55k6/must-gather-4czfd"
Apr 21 10:24:06.947893 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:06.947875 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brxmx\" (UniqueName: \"kubernetes.io/projected/2eff8400-dd80-4927-b447-eab619b9c0b6-kube-api-access-brxmx\") pod \"must-gather-4czfd\" (UID: \"2eff8400-dd80-4927-b447-eab619b9c0b6\") " pod="openshift-must-gather-f55k6/must-gather-4czfd"
Apr 21 10:24:07.101077 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:07.101020 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f55k6/must-gather-4czfd"
Apr 21 10:24:07.221520 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:07.221475 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f55k6/must-gather-4czfd"]
Apr 21 10:24:07.223661 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:24:07.223631 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2eff8400_dd80_4927_b447_eab619b9c0b6.slice/crio-1b602fb3e1da27699dc8454b6959a996878cb7bd3e5d8ee1da35dbc862297f09 WatchSource:0}: Error finding container 1b602fb3e1da27699dc8454b6959a996878cb7bd3e5d8ee1da35dbc862297f09: Status 404 returned error can't find the container with id 1b602fb3e1da27699dc8454b6959a996878cb7bd3e5d8ee1da35dbc862297f09
Apr 21 10:24:07.225558 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:07.225542 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 10:24:07.927818 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:07.927781 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f55k6/must-gather-4czfd" event={"ID":"2eff8400-dd80-4927-b447-eab619b9c0b6","Type":"ContainerStarted","Data":"1b602fb3e1da27699dc8454b6959a996878cb7bd3e5d8ee1da35dbc862297f09"}
Apr 21 10:24:08.934075 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:08.933839 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f55k6/must-gather-4czfd" event={"ID":"2eff8400-dd80-4927-b447-eab619b9c0b6","Type":"ContainerStarted","Data":"b4aabcce7211d14ebc20b759e01ed1480be76173ed780f59ca1ce15fdd5a3a2a"}
Apr 21 10:24:08.934075 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:08.933877 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f55k6/must-gather-4czfd" event={"ID":"2eff8400-dd80-4927-b447-eab619b9c0b6","Type":"ContainerStarted","Data":"7d23725ca72f86f79bb7438abd998c548e8891c3a91851f0698813b255026de9"}
Apr 21 10:24:08.949874 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:08.949823 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f55k6/must-gather-4czfd" podStartSLOduration=2.103694582 podStartE2EDuration="2.949807902s" podCreationTimestamp="2026-04-21 10:24:06 +0000 UTC" firstStartedPulling="2026-04-21 10:24:07.22573481 +0000 UTC m=+1225.070930297" lastFinishedPulling="2026-04-21 10:24:08.071848118 +0000 UTC m=+1225.917043617" observedRunningTime="2026-04-21 10:24:08.948460477 +0000 UTC m=+1226.793655981" watchObservedRunningTime="2026-04-21 10:24:08.949807902 +0000 UTC m=+1226.795003409"
Apr 21 10:24:09.567846 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:09.567798 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-mg976_79560f0b-3eeb-4ad8-9eca-ac25ba5bf424/global-pull-secret-syncer/0.log"
Apr 21 10:24:09.625974 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:09.625942 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-hk9xb_a892f931-ea13-4698-b14e-4a1b739f586c/konnectivity-agent/0.log"
Apr 21 10:24:09.721121 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:09.721096 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-231.ec2.internal_3d4599cdcd53cd71a013b6e6939ddb90/haproxy/0.log"
Apr 21 10:24:12.890876 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:12.890847 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-5drwh_7388e221-d1ce-47cc-89d5-c292c3ea48c6/manager/0.log"
Apr 21 10:24:12.925036 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:12.925000 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-zwsnj_22664c87-830d-4ca8-af81-ef97e250f1d0/manager/0.log"
Apr 21 10:24:12.986672 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:12.986646 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-gslrj_2a5f073c-57eb-4d04-8f10-84eb7a15d1bd/manager/0.log"
Apr 21 10:24:14.214488 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:14.214460 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_24b93fc8-5b72-4c3a-9ce5-31878a449724/alertmanager/0.log"
Apr 21 10:24:14.237048 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:14.237021 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_24b93fc8-5b72-4c3a-9ce5-31878a449724/config-reloader/0.log"
Apr 21 10:24:14.262381 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:14.262350 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_24b93fc8-5b72-4c3a-9ce5-31878a449724/kube-rbac-proxy-web/0.log"
Apr 21 10:24:14.292529 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:14.292466 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_24b93fc8-5b72-4c3a-9ce5-31878a449724/kube-rbac-proxy/0.log"
Apr 21 10:24:14.314355 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:14.314326 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_24b93fc8-5b72-4c3a-9ce5-31878a449724/kube-rbac-proxy-metric/0.log"
Apr 21 10:24:14.339438 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:14.339412 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_24b93fc8-5b72-4c3a-9ce5-31878a449724/prom-label-proxy/0.log"
Apr 21 10:24:14.360704 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:14.360677 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_24b93fc8-5b72-4c3a-9ce5-31878a449724/init-config-reloader/0.log"
Apr 21 10:24:14.422692 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:14.422657 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-4chhb_852ca942-92ad-4073-8c15-89263d3beac6/kube-state-metrics/0.log"
Apr 21 10:24:14.443811 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:14.443770 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-4chhb_852ca942-92ad-4073-8c15-89263d3beac6/kube-rbac-proxy-main/0.log"
Apr 21 10:24:14.464372 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:14.464317 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-4chhb_852ca942-92ad-4073-8c15-89263d3beac6/kube-rbac-proxy-self/0.log"
Apr 21 10:24:14.492968 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:14.492886 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-8577976cdf-42wrd_7ae78a88-873e-47c5-96e2-71e175ad6366/metrics-server/0.log"
Apr 21 10:24:14.695431 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:14.695401 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pjrhj_05e986e2-925b-4f4c-a251-7c43d0600377/node-exporter/0.log"
Apr 21 10:24:14.718082 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:14.718052 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pjrhj_05e986e2-925b-4f4c-a251-7c43d0600377/kube-rbac-proxy/0.log"
Apr 21 10:24:14.741367 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:14.741343 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pjrhj_05e986e2-925b-4f4c-a251-7c43d0600377/init-textfile/0.log"
Apr 21 10:24:14.780939 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:14.780808 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-n842d_452d9e5b-6dd7-474f-94d3-175d4f3f8fcf/kube-rbac-proxy-main/0.log"
Apr 21 10:24:14.805702 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:14.805668 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-n842d_452d9e5b-6dd7-474f-94d3-175d4f3f8fcf/kube-rbac-proxy-self/0.log"
Apr 21 10:24:14.831698 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:14.831668 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-n842d_452d9e5b-6dd7-474f-94d3-175d4f3f8fcf/openshift-state-metrics/0.log"
Apr 21 10:24:15.025850 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:15.025820 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-gxg5w_fff653d0-ff47-486a-af54-9c141f939ade/prometheus-operator/0.log"
Apr 21 10:24:15.045800 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:15.045722 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-gxg5w_fff653d0-ff47-486a-af54-9c141f939ade/kube-rbac-proxy/0.log"
Apr 21 10:24:17.593591 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:17.593563 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59776c4dff-zcnwx_52e96a96-11c1-4086-96f6-904e0bc06745/console/0.log"
Apr 21 10:24:18.203822 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:18.203788 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f55k6/perf-node-gather-daemonset-gwtx9"]
Apr 21 10:24:18.210969 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:18.210942 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-gwtx9"
Apr 21 10:24:18.213803 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:18.213774 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f55k6/perf-node-gather-daemonset-gwtx9"]
Apr 21 10:24:18.352821 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:18.352780 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d5ab4dba-9375-46df-b85b-89c9556440ec-podres\") pod \"perf-node-gather-daemonset-gwtx9\" (UID: \"d5ab4dba-9375-46df-b85b-89c9556440ec\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-gwtx9"
Apr 21 10:24:18.352821 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:18.352816 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d5ab4dba-9375-46df-b85b-89c9556440ec-sys\") pod \"perf-node-gather-daemonset-gwtx9\" (UID: \"d5ab4dba-9375-46df-b85b-89c9556440ec\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-gwtx9"
Apr 21 10:24:18.353032 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:18.352853 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d5ab4dba-9375-46df-b85b-89c9556440ec-lib-modules\") pod \"perf-node-gather-daemonset-gwtx9\" (UID: \"d5ab4dba-9375-46df-b85b-89c9556440ec\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-gwtx9"
Apr 21 10:24:18.353032 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:18.352922 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d5ab4dba-9375-46df-b85b-89c9556440ec-proc\") pod \"perf-node-gather-daemonset-gwtx9\" (UID: \"d5ab4dba-9375-46df-b85b-89c9556440ec\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-gwtx9"
Apr 21 10:24:18.353032 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:18.352939 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq8mr\" (UniqueName: \"kubernetes.io/projected/d5ab4dba-9375-46df-b85b-89c9556440ec-kube-api-access-mq8mr\") pod \"perf-node-gather-daemonset-gwtx9\" (UID: \"d5ab4dba-9375-46df-b85b-89c9556440ec\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-gwtx9"
Apr 21 10:24:18.454393 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:18.454320 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d5ab4dba-9375-46df-b85b-89c9556440ec-proc\") pod \"perf-node-gather-daemonset-gwtx9\" (UID: \"d5ab4dba-9375-46df-b85b-89c9556440ec\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-gwtx9"
Apr 21 10:24:18.454393 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:18.454355 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mq8mr\" (UniqueName: \"kubernetes.io/projected/d5ab4dba-9375-46df-b85b-89c9556440ec-kube-api-access-mq8mr\") pod \"perf-node-gather-daemonset-gwtx9\" (UID: \"d5ab4dba-9375-46df-b85b-89c9556440ec\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-gwtx9"
Apr 21 10:24:18.454393 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:18.454379 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d5ab4dba-9375-46df-b85b-89c9556440ec-podres\") pod \"perf-node-gather-daemonset-gwtx9\" (UID: \"d5ab4dba-9375-46df-b85b-89c9556440ec\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-gwtx9"
Apr 21 10:24:18.454393 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:18.454399 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d5ab4dba-9375-46df-b85b-89c9556440ec-sys\") pod \"perf-node-gather-daemonset-gwtx9\" (UID: \"d5ab4dba-9375-46df-b85b-89c9556440ec\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-gwtx9"
Apr 21 10:24:18.454652 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:18.454433 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d5ab4dba-9375-46df-b85b-89c9556440ec-lib-modules\") pod \"perf-node-gather-daemonset-gwtx9\" (UID: \"d5ab4dba-9375-46df-b85b-89c9556440ec\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-gwtx9"
Apr 21 10:24:18.454652 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:18.454455 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d5ab4dba-9375-46df-b85b-89c9556440ec-proc\") pod \"perf-node-gather-daemonset-gwtx9\" (UID: \"d5ab4dba-9375-46df-b85b-89c9556440ec\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-gwtx9"
Apr 21 10:24:18.454652 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:18.454520 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d5ab4dba-9375-46df-b85b-89c9556440ec-sys\") pod \"perf-node-gather-daemonset-gwtx9\" (UID: \"d5ab4dba-9375-46df-b85b-89c9556440ec\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-gwtx9"
Apr 21 10:24:18.454652 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:18.454560 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d5ab4dba-9375-46df-b85b-89c9556440ec-lib-modules\") pod \"perf-node-gather-daemonset-gwtx9\" (UID: \"d5ab4dba-9375-46df-b85b-89c9556440ec\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-gwtx9"
Apr 21 10:24:18.454652 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:18.454560 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d5ab4dba-9375-46df-b85b-89c9556440ec-podres\") pod \"perf-node-gather-daemonset-gwtx9\" (UID: \"d5ab4dba-9375-46df-b85b-89c9556440ec\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-gwtx9"
Apr 21 10:24:18.462440 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:18.462410 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq8mr\" (UniqueName: \"kubernetes.io/projected/d5ab4dba-9375-46df-b85b-89c9556440ec-kube-api-access-mq8mr\") pod \"perf-node-gather-daemonset-gwtx9\" (UID: \"d5ab4dba-9375-46df-b85b-89c9556440ec\") " pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-gwtx9"
Apr 21 10:24:18.525943 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:18.525910 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-gwtx9"
Apr 21 10:24:18.656156 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:18.654338 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f55k6/perf-node-gather-daemonset-gwtx9"]
Apr 21 10:24:18.659001 ip-10-0-140-231 kubenswrapper[2567]: W0421 10:24:18.658954 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd5ab4dba_9375_46df_b85b_89c9556440ec.slice/crio-26bb45755dd52e4200bb95c0452044cb0401b49f4aed95addbb68ee42c0eaf72 WatchSource:0}: Error finding container 26bb45755dd52e4200bb95c0452044cb0401b49f4aed95addbb68ee42c0eaf72: Status 404 returned error can't find the container with id 26bb45755dd52e4200bb95c0452044cb0401b49f4aed95addbb68ee42c0eaf72
Apr 21 10:24:18.907436 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:18.907408 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-sznwb_a413cb28-d70b-44b6-a527-03a5247fa66a/dns/0.log"
Apr 21 10:24:18.926990 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:18.926969 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-sznwb_a413cb28-d70b-44b6-a527-03a5247fa66a/kube-rbac-proxy/0.log"
Apr 21 10:24:18.981556 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:18.981491 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-gwtx9" event={"ID":"d5ab4dba-9375-46df-b85b-89c9556440ec","Type":"ContainerStarted","Data":"95ed3f102e55697ca038d61916fd43070c9d6a90deec6d805f0cee02b52e9164"}
Apr 21 10:24:18.981556 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:18.981524 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-gwtx9" event={"ID":"d5ab4dba-9375-46df-b85b-89c9556440ec","Type":"ContainerStarted","Data":"26bb45755dd52e4200bb95c0452044cb0401b49f4aed95addbb68ee42c0eaf72"}
Apr 21 10:24:18.981667 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:18.981560 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-gwtx9"
Apr 21 10:24:18.990426 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:18.990407 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-s97qn_22eb8c40-ac2f-42ed-897c-2c7e11b8588c/dns-node-resolver/0.log"
Apr 21 10:24:18.998518 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:18.998481 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-gwtx9" podStartSLOduration=0.998470099 podStartE2EDuration="998.470099ms" podCreationTimestamp="2026-04-21 10:24:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:24:18.996439138 +0000 UTC m=+1236.841634644" watchObservedRunningTime="2026-04-21 10:24:18.998470099 +0000 UTC m=+1236.843665604"
Apr 21 10:24:19.466304 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:19.466273 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wgkxz_f5719a9a-0eff-48f5-b634-e4d0a7216828/node-ca/0.log"
Apr 21 10:24:20.185100 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:20.185029 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-7cd77c7ffd-svbn4_71ff2aa4-7476-46c1-ba0a-8a36acc8bc81/discovery/0.log"
Apr 21 10:24:20.661919 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:20.661892 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-v9jx5_2857b675-4470-427f-a3d7-94390418dee9/serve-healthcheck-canary/0.log"
Apr 21 10:24:21.204864 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:21.204840 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wp4c9_39147b7c-8d16-4801-8f60-a0cc5afd65e4/kube-rbac-proxy/0.log"
Apr 21 10:24:21.224241 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:21.224216 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wp4c9_39147b7c-8d16-4801-8f60-a0cc5afd65e4/exporter/0.log"
Apr 21 10:24:21.243745 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:21.243724 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wp4c9_39147b7c-8d16-4801-8f60-a0cc5afd65e4/extractor/0.log"
Apr 21 10:24:23.309430 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:23.309399 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-j6xkb_9dab3e7b-4b04-4a4d-8226-ec361f24b1c5/openshift-lws-operator/0.log"
Apr 21 10:24:24.994960 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:24.994932 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-f55k6/perf-node-gather-daemonset-gwtx9"
Apr 21 10:24:26.810065 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:26.810037 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-nxvlf_8a442b1e-fc34-4b46-94e0-354888afd597/migrator/0.log"
Apr 21 10:24:26.828597 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:26.828575 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-nxvlf_8a442b1e-fc34-4b46-94e0-354888afd597/graceful-termination/0.log"
Apr 21 10:24:28.349306 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:28.349280 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rnvsp_1ba26a6f-298a-4d59-9fa5-4f65cc1729c9/kube-multus-additional-cni-plugins/0.log"
Apr 21 10:24:28.369283 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:28.369264 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rnvsp_1ba26a6f-298a-4d59-9fa5-4f65cc1729c9/egress-router-binary-copy/0.log"
Apr 21 10:24:28.388398 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:28.388382 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rnvsp_1ba26a6f-298a-4d59-9fa5-4f65cc1729c9/cni-plugins/0.log"
Apr 21 10:24:28.408897 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:28.408880 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rnvsp_1ba26a6f-298a-4d59-9fa5-4f65cc1729c9/bond-cni-plugin/0.log"
Apr 21 10:24:28.428332 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:28.428318 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rnvsp_1ba26a6f-298a-4d59-9fa5-4f65cc1729c9/routeoverride-cni/0.log"
Apr 21 10:24:28.448251 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:28.448233 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rnvsp_1ba26a6f-298a-4d59-9fa5-4f65cc1729c9/whereabouts-cni-bincopy/0.log"
Apr 21 10:24:28.468365 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:28.468346 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rnvsp_1ba26a6f-298a-4d59-9fa5-4f65cc1729c9/whereabouts-cni/0.log"
Apr 21 10:24:28.838784 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:28.838759 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bnl68_f531049d-f18b-4c01-9df7-a6c394430f98/kube-multus/0.log"
Apr 21 10:24:28.974001 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:28.973967 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nwsw4_dbb00fc1-1258-4254-a360-3c350554925b/network-metrics-daemon/0.log"
Apr 21 10:24:28.993218 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:28.993159 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nwsw4_dbb00fc1-1258-4254-a360-3c350554925b/kube-rbac-proxy/0.log"
Apr 21 10:24:30.169306 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:30.169256 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tqctk_2bdbce72-3342-4c5d-9e2f-6757d506d268/ovn-controller/0.log"
Apr 21 10:24:30.186506 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:30.186485 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tqctk_2bdbce72-3342-4c5d-9e2f-6757d506d268/ovn-acl-logging/0.log"
Apr 21 10:24:30.192490 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:30.192466 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tqctk_2bdbce72-3342-4c5d-9e2f-6757d506d268/ovn-acl-logging/1.log"
Apr 21 10:24:30.211340 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:30.211314 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tqctk_2bdbce72-3342-4c5d-9e2f-6757d506d268/kube-rbac-proxy-node/0.log"
Apr 21 10:24:30.230809 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:30.230783 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tqctk_2bdbce72-3342-4c5d-9e2f-6757d506d268/kube-rbac-proxy-ovn-metrics/0.log"
Apr 21 10:24:30.247309 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:30.247286 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tqctk_2bdbce72-3342-4c5d-9e2f-6757d506d268/northd/0.log"
Apr 21 10:24:30.267453 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:30.267426 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tqctk_2bdbce72-3342-4c5d-9e2f-6757d506d268/nbdb/0.log"
Apr 21 10:24:30.287138 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:30.287112 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tqctk_2bdbce72-3342-4c5d-9e2f-6757d506d268/sbdb/0.log"
Apr 21 10:24:30.385744 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:30.385712 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tqctk_2bdbce72-3342-4c5d-9e2f-6757d506d268/ovnkube-controller/0.log"
Apr 21 10:24:31.652393 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:31.652361 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-svqbc_592d16c7-dbbe-4301-a523-7a9d396a1b51/check-endpoints/0.log"
Apr 21 10:24:31.696969 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:31.696946 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-jzrcz_5f39fe1c-70f8-4445-8cfd-646cb496d498/network-check-target-container/0.log"
Apr 21 10:24:32.678236 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:32.678211 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-jzz4r_76f0dead-e62b-424c-a40e-6f82d52ba722/iptables-alerter/0.log"
Apr 21 10:24:33.376114 ip-10-0-140-231 kubenswrapper[2567]: I0421 10:24:33.376086 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-jj5fc_7db2250c-0db7-4d5b-8890-c0dcb9a1171d/tuned/0.log"