Apr 20 23:09:53.093194 ip-10-0-137-139 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 23:09:53.093205 ip-10-0-137-139 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 23:09:53.093214 ip-10-0-137-139 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 23:09:53.093525 ip-10-0-137-139 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 23:10:03.181344 ip-10-0-137-139 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 23:10:03.181365 ip-10-0-137-139 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 7fadf92e40154101ae905763d65eb9cc --
Apr 20 23:11:59.903437 ip-10-0-137-139 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 23:12:00.445018 ip-10-0-137-139 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 23:12:00.445018 ip-10-0-137-139 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 23:12:00.445018 ip-10-0-137-139 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 23:12:00.445018 ip-10-0-137-139 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 23:12:00.445018 ip-10-0-137-139 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 23:12:00.446943 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.446850    2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 23:12:00.453384 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453369    2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 23:12:00.453384 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453384    2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 23:12:00.453446 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453388    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 23:12:00.453446 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453392    2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 23:12:00.453446 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453395    2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 23:12:00.453446 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453398    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 23:12:00.453446 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453401    2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 23:12:00.453446 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453404    2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 23:12:00.453446 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453407    2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 23:12:00.453446 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453411    2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 23:12:00.453446 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453415    2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 23:12:00.453446 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453419    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 23:12:00.453446 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453422    2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 23:12:00.453446 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453424    2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 23:12:00.453446 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453428    2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 23:12:00.453446 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453431    2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 23:12:00.453446 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453434    2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 23:12:00.453446 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453437    2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 23:12:00.453446 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453440    2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 23:12:00.453446 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453442    2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 23:12:00.453446 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453445    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 23:12:00.453904 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453448    2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 23:12:00.453904 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453451    2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 23:12:00.453904 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453453    2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 23:12:00.453904 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453456    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 23:12:00.453904 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453459    2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 23:12:00.453904 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453462    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 23:12:00.453904 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453465    2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 23:12:00.453904 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453467    2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 23:12:00.453904 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453470    2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 23:12:00.453904 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453472    2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 23:12:00.453904 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453474    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 23:12:00.453904 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453477    2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 23:12:00.453904 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453480    2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 23:12:00.453904 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453482    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 23:12:00.453904 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453485    2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 23:12:00.453904 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453488    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 23:12:00.453904 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453491    2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 23:12:00.453904 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453494    2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 23:12:00.453904 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453496    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 23:12:00.454382 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453499    2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 23:12:00.454382 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453501    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 23:12:00.454382 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453504    2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 23:12:00.454382 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453506    2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 23:12:00.454382 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453508    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 23:12:00.454382 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453511    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 23:12:00.454382 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453513    2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 23:12:00.454382 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453516    2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 23:12:00.454382 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453518    2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 23:12:00.454382 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453522    2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 23:12:00.454382 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453524    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 23:12:00.454382 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453527    2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 23:12:00.454382 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453530    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 23:12:00.454382 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453533    2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 23:12:00.454382 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453537    2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 23:12:00.454382 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453540    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 23:12:00.454382 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453542    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 23:12:00.454382 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453545    2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 23:12:00.454382 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453549    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 23:12:00.454382 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453553    2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 23:12:00.454902 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453557    2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 23:12:00.454902 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453560    2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 23:12:00.454902 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453563    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 23:12:00.454902 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453565    2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 23:12:00.454902 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453568    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 23:12:00.454902 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453570    2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 23:12:00.454902 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453579    2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 23:12:00.454902 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453582    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 23:12:00.454902 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453585    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 23:12:00.454902 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453587    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 23:12:00.454902 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453592    2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 23:12:00.454902 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453595    2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 23:12:00.454902 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453598    2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 23:12:00.454902 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453601    2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 23:12:00.454902 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453603    2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 23:12:00.454902 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453606    2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 23:12:00.454902 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453608    2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 23:12:00.454902 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453611    2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 23:12:00.454902 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453613    2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 23:12:00.454902 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453616    2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 23:12:00.455389 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453618    2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 23:12:00.455389 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453621    2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 23:12:00.455389 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453623    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 23:12:00.455389 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453626    2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 23:12:00.455389 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453629    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 23:12:00.455389 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.453631    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 23:12:00.455389 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454045    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 23:12:00.455389 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454050    2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 23:12:00.455389 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454053    2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 23:12:00.455389 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454056    2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 23:12:00.455389 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454058    2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 23:12:00.455389 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454061    2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 23:12:00.455389 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454064    2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 23:12:00.455389 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454066    2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 23:12:00.455389 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454069    2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 23:12:00.455389 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454072    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 23:12:00.455389 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454074    2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 23:12:00.455389 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454077    2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 23:12:00.455389 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454086    2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 23:12:00.455389 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454089    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 23:12:00.455914 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454092    2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 23:12:00.455914 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454095    2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 23:12:00.455914 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454097    2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 23:12:00.455914 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454100    2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 23:12:00.455914 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454102    2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 23:12:00.455914 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454105    2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 23:12:00.455914 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454108    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 23:12:00.455914 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454111    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 23:12:00.455914 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454113    2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 23:12:00.455914 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454116    2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 23:12:00.455914 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454118    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 23:12:00.455914 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454121    2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 23:12:00.455914 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454123    2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 23:12:00.455914 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454126    2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 23:12:00.455914 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454128    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 23:12:00.455914 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454131    2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 23:12:00.455914 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454133    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 23:12:00.455914 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454152    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 23:12:00.455914 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454155    2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 23:12:00.455914 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454158    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 23:12:00.456418 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454160    2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 23:12:00.456418 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454163    2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 23:12:00.456418 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454166    2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 23:12:00.456418 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454169    2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 23:12:00.456418 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454172    2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 23:12:00.456418 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454174    2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 23:12:00.456418 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454176    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 23:12:00.456418 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454179    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 23:12:00.456418 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454182    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 23:12:00.456418 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454184    2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 23:12:00.456418 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454187    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 23:12:00.456418 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454196    2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 23:12:00.456418 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454198    2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 23:12:00.456418 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454201    2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 23:12:00.456418 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454203    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 23:12:00.456418 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454205    2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 23:12:00.456418 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454208    2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 23:12:00.456418 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454211    2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 23:12:00.456418 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454213    2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 23:12:00.456917 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454217    2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 23:12:00.456917 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454221    2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 23:12:00.456917 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454224    2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 23:12:00.456917 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454227    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 23:12:00.456917 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454230    2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 23:12:00.456917 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454232    2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 23:12:00.456917 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454235    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 23:12:00.456917 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454237    2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 23:12:00.456917 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454240    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 23:12:00.456917 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454245    2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 23:12:00.456917 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454248    2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 23:12:00.456917 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454251    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 23:12:00.456917 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454254    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 23:12:00.456917 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454256    2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 23:12:00.456917 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454259    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 23:12:00.456917 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454262    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 23:12:00.456917 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454265    2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 23:12:00.456917 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454267    2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 23:12:00.456917 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454270    2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 23:12:00.457398 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454272    2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 23:12:00.457398 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454275    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 23:12:00.457398 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454277    2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 23:12:00.457398 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454280    2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 23:12:00.457398 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454282    2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 23:12:00.457398 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454285    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 23:12:00.457398 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454292    2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 23:12:00.457398 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454295    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 23:12:00.457398 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454298    2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 23:12:00.457398 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454301    2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 23:12:00.457398 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454305    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 23:12:00.457398 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454308    2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 23:12:00.457398 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454310    2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 23:12:00.457398 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.454313    2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 23:12:00.457398 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454386    2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 23:12:00.457398 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454393    2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 23:12:00.457398 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454401    2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 23:12:00.457398 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454408    2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 23:12:00.457398 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454414    2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 23:12:00.457398 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454419    2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 23:12:00.457398 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454424    2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 23:12:00.457910 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454429    2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 23:12:00.457910 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454433    2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 23:12:00.457910 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454436    2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 23:12:00.457910 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454439    2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 23:12:00.457910 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454442    2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 23:12:00.457910 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454446    2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 23:12:00.457910 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454449    2576 flags.go:64] FLAG: --cgroup-root=""
Apr 20 23:12:00.457910 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454452    2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 23:12:00.457910 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454455    2576 flags.go:64] FLAG: --client-ca-file=""
Apr 20 23:12:00.457910 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454458    2576 flags.go:64] FLAG: --cloud-config=""
Apr 20 23:12:00.457910 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454461    2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 23:12:00.457910 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454464    2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 23:12:00.457910 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454469    2576 flags.go:64] FLAG: --cluster-domain=""
Apr 20 23:12:00.457910 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454472    2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 23:12:00.457910 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454475    2576 flags.go:64] FLAG: --config-dir=""
Apr 20 23:12:00.457910 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454478    2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 23:12:00.457910 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454482    2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 23:12:00.457910 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454485    2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 23:12:00.457910 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454490    2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 23:12:00.457910 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454493    2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 23:12:00.457910 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454497    2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 23:12:00.457910 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454500    2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 23:12:00.457910 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454503    2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 23:12:00.457910 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454505    2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 23:12:00.458487 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454508    2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 23:12:00.458487 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454511    2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 23:12:00.458487 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454515    2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 23:12:00.458487 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454518    2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 23:12:00.458487 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454521    2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 23:12:00.458487 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454524    2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 23:12:00.458487 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454527    2576 flags.go:64] FLAG: --enable-server="true"
Apr 20 23:12:00.458487 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454530    2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 23:12:00.458487 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454534    2576 flags.go:64] FLAG: --event-burst="100"
Apr 20 23:12:00.458487 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454538    2576 flags.go:64] FLAG: --event-qps="50"
Apr 20 23:12:00.458487 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454541    2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 23:12:00.458487 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454544    2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 23:12:00.458487 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454547    2576 flags.go:64] FLAG: --eviction-hard=""
Apr 20 23:12:00.458487 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454551    2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 23:12:00.458487 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454554    2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 23:12:00.458487 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454558    2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 23:12:00.458487 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454561    2576 flags.go:64] FLAG: --eviction-soft=""
Apr 20 23:12:00.458487 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454564    2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 23:12:00.458487 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454567    2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 23:12:00.458487 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454570    2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 23:12:00.458487 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454573    2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 23:12:00.458487 ip-10-0-137-139 kubenswrapper[2576]: I0420
23:12:00.454576 2576 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 23:12:00.458487 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454579 2576 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 23:12:00.458487 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454581 2576 flags.go:64] FLAG: --feature-gates="" Apr 20 23:12:00.458487 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454585 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 23:12:00.459085 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454588 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 23:12:00.459085 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454591 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 23:12:00.459085 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454595 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 23:12:00.459085 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454598 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 20 23:12:00.459085 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454602 2576 flags.go:64] FLAG: --help="false" Apr 20 23:12:00.459085 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454605 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-137-139.ec2.internal" Apr 20 23:12:00.459085 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454608 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 23:12:00.459085 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454611 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 23:12:00.459085 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454614 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 23:12:00.459085 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454617 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 23:12:00.459085 
ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454621 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 23:12:00.459085 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454624 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 23:12:00.459085 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454627 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 23:12:00.459085 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454629 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 23:12:00.459085 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454632 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 23:12:00.459085 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454635 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 23:12:00.459085 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454638 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 23:12:00.459085 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454641 2576 flags.go:64] FLAG: --kube-reserved="" Apr 20 23:12:00.459085 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454644 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 23:12:00.459085 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454647 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 23:12:00.459085 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454650 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 23:12:00.459085 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454666 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 23:12:00.459085 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454670 2576 flags.go:64] FLAG: --lock-file="" Apr 20 23:12:00.459085 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454673 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 23:12:00.459998 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454676 2576 flags.go:64] FLAG: 
--log-flush-frequency="5s" Apr 20 23:12:00.459998 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454679 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 23:12:00.459998 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454685 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 23:12:00.459998 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454688 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 23:12:00.459998 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454691 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 23:12:00.459998 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454694 2576 flags.go:64] FLAG: --logging-format="text" Apr 20 23:12:00.459998 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454696 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 23:12:00.459998 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454700 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 23:12:00.459998 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454703 2576 flags.go:64] FLAG: --manifest-url="" Apr 20 23:12:00.459998 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454706 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 20 23:12:00.459998 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454714 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 23:12:00.459998 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454718 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 23:12:00.459998 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454722 2576 flags.go:64] FLAG: --max-pods="110" Apr 20 23:12:00.459998 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454725 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 23:12:00.459998 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454728 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 23:12:00.459998 ip-10-0-137-139 
kubenswrapper[2576]: I0420 23:12:00.454731 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 23:12:00.459998 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454734 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 23:12:00.459998 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454737 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 23:12:00.459998 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454740 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 23:12:00.459998 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454743 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 23:12:00.459998 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454779 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 23:12:00.459998 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454784 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 23:12:00.459998 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454788 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 23:12:00.459998 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454791 2576 flags.go:64] FLAG: --pod-cidr="" Apr 20 23:12:00.460623 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454795 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 23:12:00.460623 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454801 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 23:12:00.460623 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454805 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 23:12:00.460623 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454808 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 20 23:12:00.460623 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454811 2576 flags.go:64] FLAG: --port="10250" Apr 20 
23:12:00.460623 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454815 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 23:12:00.460623 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454818 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0a16c246769e2331c" Apr 20 23:12:00.460623 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454822 2576 flags.go:64] FLAG: --qos-reserved="" Apr 20 23:12:00.460623 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454825 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 20 23:12:00.460623 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454828 2576 flags.go:64] FLAG: --register-node="true" Apr 20 23:12:00.460623 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454831 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 20 23:12:00.460623 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454834 2576 flags.go:64] FLAG: --register-with-taints="" Apr 20 23:12:00.460623 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454844 2576 flags.go:64] FLAG: --registry-burst="10" Apr 20 23:12:00.460623 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454847 2576 flags.go:64] FLAG: --registry-qps="5" Apr 20 23:12:00.460623 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454850 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 20 23:12:00.460623 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454853 2576 flags.go:64] FLAG: --reserved-memory="" Apr 20 23:12:00.460623 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454857 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 23:12:00.460623 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454860 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 23:12:00.460623 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454863 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 23:12:00.460623 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454866 2576 flags.go:64] FLAG: 
--rotate-server-certificates="false" Apr 20 23:12:00.460623 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454869 2576 flags.go:64] FLAG: --runonce="false" Apr 20 23:12:00.460623 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454871 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 23:12:00.460623 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454875 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 23:12:00.460623 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454878 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 20 23:12:00.460623 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454881 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 23:12:00.461239 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454884 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 23:12:00.461239 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454887 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 23:12:00.461239 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454890 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 23:12:00.461239 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454897 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 23:12:00.461239 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454901 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 23:12:00.461239 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454904 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 23:12:00.461239 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454907 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 23:12:00.461239 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454910 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 23:12:00.461239 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454914 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 
23:12:00.461239 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454917 2576 flags.go:64] FLAG: --system-cgroups="" Apr 20 23:12:00.461239 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454920 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 23:12:00.461239 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454927 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 23:12:00.461239 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454930 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 20 23:12:00.461239 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454933 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 23:12:00.461239 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454937 2576 flags.go:64] FLAG: --tls-min-version="" Apr 20 23:12:00.461239 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454940 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 23:12:00.461239 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454943 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 23:12:00.461239 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454946 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 23:12:00.461239 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454949 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 23:12:00.461239 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454952 2576 flags.go:64] FLAG: --v="2" Apr 20 23:12:00.461239 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454957 2576 flags.go:64] FLAG: --version="false" Apr 20 23:12:00.461239 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454961 2576 flags.go:64] FLAG: --vmodule="" Apr 20 23:12:00.461239 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454966 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 23:12:00.461239 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.454969 2576 flags.go:64] FLAG: 
--volume-stats-agg-period="1m0s" Apr 20 23:12:00.461239 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455068 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 23:12:00.461840 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455072 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 23:12:00.461840 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455075 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 23:12:00.461840 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455078 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 23:12:00.461840 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455081 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 23:12:00.461840 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455084 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 23:12:00.461840 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455087 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 23:12:00.461840 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455090 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 23:12:00.461840 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455092 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 23:12:00.461840 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455095 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 23:12:00.461840 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455098 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 23:12:00.461840 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455102 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 23:12:00.461840 ip-10-0-137-139 
kubenswrapper[2576]: W0420 23:12:00.455105 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 23:12:00.461840 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455108 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 23:12:00.461840 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455111 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 23:12:00.461840 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455113 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 23:12:00.461840 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455116 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 23:12:00.461840 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455119 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 23:12:00.461840 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455122 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 23:12:00.461840 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455124 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 23:12:00.462309 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455127 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 23:12:00.462309 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455129 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 23:12:00.462309 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455132 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 23:12:00.462309 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455152 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 23:12:00.462309 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455155 2576 feature_gate.go:328] 
unrecognized feature gate: ManagedBootImagesAzure Apr 20 23:12:00.462309 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455158 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 23:12:00.462309 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455162 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 23:12:00.462309 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455166 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 23:12:00.462309 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455169 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 23:12:00.462309 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455172 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 23:12:00.462309 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455175 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 23:12:00.462309 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455177 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 23:12:00.462309 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455180 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 23:12:00.462309 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455183 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 23:12:00.462309 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455185 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 23:12:00.462309 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455188 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 23:12:00.462309 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455191 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 23:12:00.462309 ip-10-0-137-139 
kubenswrapper[2576]: W0420 23:12:00.455193 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 23:12:00.462309 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455196 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 23:12:00.462309 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455198 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 23:12:00.462821 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455201 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 23:12:00.462821 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455204 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 23:12:00.462821 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455207 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 23:12:00.462821 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455211 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 23:12:00.462821 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455213 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 23:12:00.462821 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455216 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 23:12:00.462821 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455220 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 23:12:00.462821 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455224 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 23:12:00.462821 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455227 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 23:12:00.462821 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455231 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 23:12:00.462821 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455234 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 23:12:00.462821 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455237 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 23:12:00.462821 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455240 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 23:12:00.462821 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455242 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 23:12:00.462821 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455245 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 23:12:00.462821 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455247 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 23:12:00.462821 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455250 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 23:12:00.462821 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455253 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 23:12:00.462821 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455255 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 23:12:00.463318 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455257 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 23:12:00.463318 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455260 2576 
feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 23:12:00.463318 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455262 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 23:12:00.463318 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455265 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 23:12:00.463318 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455267 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 23:12:00.463318 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455270 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 23:12:00.463318 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455273 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 23:12:00.463318 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455275 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 23:12:00.463318 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455278 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 23:12:00.463318 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455280 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 23:12:00.463318 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455283 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 23:12:00.463318 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455285 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 23:12:00.463318 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455288 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 23:12:00.463318 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455290 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 23:12:00.463318 
ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455293 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 23:12:00.463318 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455296 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 23:12:00.463318 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455300 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 23:12:00.463318 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455303 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 23:12:00.463318 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455306 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 23:12:00.463318 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455308 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 23:12:00.463820 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455311 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 23:12:00.463820 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455314 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 23:12:00.463820 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455316 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 23:12:00.463820 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455318 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 23:12:00.463820 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455321 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 20 23:12:00.463820 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455324 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 23:12:00.463820 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:00.455326 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 
Apr 20 23:12:00.463820 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.455336 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 23:12:00.463820 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.462653 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 23:12:00.463820 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.462670 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
[... two further passes of the feature_gate.go:328 "unrecognized feature gate" warnings already logged at 23:12:00.453 above, the feature_gate.go:349/351 notices for deprecated KMSv1=true and GA ServiceAccountTokenNodeBinding=true, and two byte-identical repeats of the feature-gate map elided ...]
Apr 20 23:12:00.468656 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.464245 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 23:12:00.468656 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.466275 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 23:12:00.468656 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.467543 2576 server.go:1019] "Starting client certificate rotation"
Apr 20 23:12:00.468656 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.467640 2576 certificate_manager.go:422] "Certificate
rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 23:12:00.468656 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.467684 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 23:12:00.502859 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.502834 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 23:12:00.508284 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.508268 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 23:12:00.524696 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.524668 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 20 23:12:00.531485 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.531463 2576 log.go:25] "Validated CRI v1 image API"
Apr 20 23:12:00.531598 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.531579 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 23:12:00.532691 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.532669 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 23:12:00.535875 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.535857 2576 fs.go:135] Filesystem UUIDs: map[4805e315-9a7d-4097-8a94-91917de3144a:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 d566a7a2-0066-4f7b-9403-4db256bb5386:/dev/nvme0n1p4]
Apr 20 23:12:00.535931 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.535875 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 23:12:00.540857 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.540737 2576 manager.go:217] Machine: {Timestamp:2026-04-20 23:12:00.539469012 +0000 UTC m=+0.484178459 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3074844 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec247e8b3e6134883cf3a5ac29ae41a8 SystemUUID:ec247e8b-3e61-3488-3cf3-a5ac29ae41a8 BootID:7fadf92e-4015-4101-ae90-5763d65eb9cc Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:f9:c2:04:6a:11 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:f9:c2:04:6a:11 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:de:72:a7:52:82:58 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 23:12:00.540857 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.540852 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 23:12:00.540984 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.540972 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 23:12:00.544286 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.544262 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 23:12:00.544421 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.544289 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-139.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 23:12:00.544467 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.544429 2576 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 23:12:00.544467 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.544439 2576 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 23:12:00.544467 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.544452
2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 23:12:00.545481 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.545471 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 23:12:00.546636 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.546627 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 20 23:12:00.546742 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.546733 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 23:12:00.549722 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.549711 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 20 23:12:00.549759 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.549726 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 23:12:00.549759 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.549741 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 23:12:00.549759 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.549753 2576 kubelet.go:397] "Adding apiserver pod source" Apr 20 23:12:00.549838 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.549762 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 20 23:12:00.550946 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.550933 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 23:12:00.550993 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.550958 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 23:12:00.554442 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.554428 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 23:12:00.555974 ip-10-0-137-139 
kubenswrapper[2576]: I0420 23:12:00.555957 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 23:12:00.557743 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.557729 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 23:12:00.557786 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.557755 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 23:12:00.557786 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.557768 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 23:12:00.557786 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.557777 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 23:12:00.557866 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.557793 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 23:12:00.557866 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.557836 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 23:12:00.557866 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.557842 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 23:12:00.557866 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.557849 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 23:12:00.557866 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.557857 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 23:12:00.557866 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.557865 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 23:12:00.558013 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.557880 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 
23:12:00.558013 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.557889 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 23:12:00.558842 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.558829 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 23:12:00.558842 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.558840 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 23:12:00.561581 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:00.561544 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-139.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 23:12:00.561654 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:00.561570 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 23:12:00.561751 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.561739 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-139.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 23:12:00.563167 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.563155 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 23:12:00.563203 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.563191 2576 server.go:1295] "Started kubelet" Apr 20 23:12:00.563347 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.563312 2576 server.go:180] "Starting to listen" address="0.0.0.0" 
port=10250 Apr 20 23:12:00.563441 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.563288 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 23:12:00.563494 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.563449 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 23:12:00.563857 ip-10-0-137-139 systemd[1]: Started Kubernetes Kubelet. Apr 20 23:12:00.564913 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.564891 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 23:12:00.567495 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.567473 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 20 23:12:00.572462 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.572334 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qgtrp" Apr 20 23:12:00.573183 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.573162 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 23:12:00.573183 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.573176 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 23:12:00.573788 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.573773 2576 factory.go:55] Registering systemd factory Apr 20 23:12:00.573857 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.573791 2576 factory.go:223] Registration of the systemd container factory successfully Apr 20 23:12:00.573857 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.573796 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 23:12:00.573857 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.573812 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 23:12:00.573857 ip-10-0-137-139 kubenswrapper[2576]: I0420 
23:12:00.573851 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 23:12:00.574030 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.573933 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 20 23:12:00.574030 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.573941 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 20 23:12:00.574030 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.573991 2576 factory.go:153] Registering CRI-O factory Apr 20 23:12:00.574030 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.574001 2576 factory.go:223] Registration of the crio container factory successfully Apr 20 23:12:00.574205 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:00.574069 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-139.ec2.internal\" not found" Apr 20 23:12:00.574205 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.574087 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 23:12:00.574205 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.574128 2576 factory.go:103] Registering Raw factory Apr 20 23:12:00.574205 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.574161 2576 manager.go:1196] Started watching for new ooms in manager Apr 20 23:12:00.574529 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:00.574491 2576 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 23:12:00.574638 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.574625 2576 manager.go:319] Starting recovery of all containers Apr 20 23:12:00.579668 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.579647 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qgtrp" Apr 20 23:12:00.583737 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:00.583709 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-137-139.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 20 23:12:00.583861 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:00.583751 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 20 23:12:00.584900 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:00.583616 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-139.ec2.internal.18a83389f21cc375 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-139.ec2.internal,UID:ip-10-0-137-139.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-139.ec2.internal,},FirstTimestamp:2026-04-20 23:12:00.563168117 +0000 UTC 
m=+0.507877564,LastTimestamp:2026-04-20 23:12:00.563168117 +0000 UTC m=+0.507877564,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-139.ec2.internal,}" Apr 20 23:12:00.586601 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.586587 2576 manager.go:324] Recovery completed Apr 20 23:12:00.590765 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.590753 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 23:12:00.593169 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.593155 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-139.ec2.internal" event="NodeHasSufficientMemory" Apr 20 23:12:00.593242 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.593184 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-139.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 23:12:00.593242 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.593195 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-139.ec2.internal" event="NodeHasSufficientPID" Apr 20 23:12:00.593677 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.593661 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 23:12:00.593677 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.593670 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 23:12:00.593763 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.593686 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 20 23:12:00.595814 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.595803 2576 policy_none.go:49] "None policy: Start" Apr 20 23:12:00.595855 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.595818 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 23:12:00.595855 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.595828 
2576 state_mem.go:35] "Initializing new in-memory state store" Apr 20 23:12:00.636853 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.632735 2576 manager.go:341] "Starting Device Plugin manager" Apr 20 23:12:00.636853 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:00.632783 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 23:12:00.636853 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.632796 2576 server.go:85] "Starting device plugin registration server" Apr 20 23:12:00.636853 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.633078 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 23:12:00.636853 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.633091 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 23:12:00.636853 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.633180 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 23:12:00.636853 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.633270 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 23:12:00.636853 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.633279 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 23:12:00.636853 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:00.633860 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 20 23:12:00.636853 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:00.633892 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-139.ec2.internal\" not found" Apr 20 23:12:00.636853 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.634382 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 23:12:00.636853 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.635578 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 23:12:00.636853 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.635604 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 23:12:00.636853 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.635621 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 20 23:12:00.636853 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.635628 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 23:12:00.636853 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:00.635661 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 23:12:00.638779 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.638763 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 23:12:00.733920 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.733835 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 23:12:00.734858 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.734836 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-139.ec2.internal" event="NodeHasSufficientMemory" Apr 20 23:12:00.734984 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.734878 2576 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-139.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 23:12:00.734984 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.734893 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-139.ec2.internal" event="NodeHasSufficientPID" Apr 20 23:12:00.734984 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.734923 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-139.ec2.internal" Apr 20 23:12:00.735926 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.735908 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-139.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-139.ec2.internal"] Apr 20 23:12:00.735981 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.735973 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 23:12:00.736743 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.736728 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-139.ec2.internal" event="NodeHasSufficientMemory" Apr 20 23:12:00.736809 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.736757 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-139.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 23:12:00.736809 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.736769 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-139.ec2.internal" event="NodeHasSufficientPID" Apr 20 23:12:00.738073 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.738061 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 23:12:00.738218 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.738205 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-139.ec2.internal" Apr 20 23:12:00.738258 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.738234 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 23:12:00.738839 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.738822 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-139.ec2.internal" event="NodeHasSufficientMemory" Apr 20 23:12:00.738903 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.738850 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-139.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 23:12:00.738903 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.738862 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-139.ec2.internal" event="NodeHasSufficientPID" Apr 20 23:12:00.738966 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.738824 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-139.ec2.internal" event="NodeHasSufficientMemory" Apr 20 23:12:00.738966 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.738929 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-139.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 23:12:00.738966 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.738940 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-139.ec2.internal" event="NodeHasSufficientPID" Apr 20 23:12:00.740032 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.740017 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-139.ec2.internal" Apr 20 23:12:00.740095 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.740049 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 23:12:00.740659 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.740638 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-139.ec2.internal" event="NodeHasSufficientMemory" Apr 20 23:12:00.740659 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.740657 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-139.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 23:12:00.740831 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.740669 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-139.ec2.internal" event="NodeHasSufficientPID" Apr 20 23:12:00.744448 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.744433 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-139.ec2.internal" Apr 20 23:12:00.744509 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:00.744455 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-139.ec2.internal\": node \"ip-10-0-137-139.ec2.internal\" not found" Apr 20 23:12:00.761848 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:00.761831 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-139.ec2.internal\" not found" node="ip-10-0-137-139.ec2.internal" Apr 20 23:12:00.765900 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:00.765884 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-139.ec2.internal\" not found" Apr 20 23:12:00.766092 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:00.766080 2576 kubelet.go:3336] "No need to create a mirror pod, 
since failed to get node info from the cluster" err="node \"ip-10-0-137-139.ec2.internal\" not found" node="ip-10-0-137-139.ec2.internal" Apr 20 23:12:00.774778 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.774756 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ff565a32ff02ffe7c9262acb131f7f92-config\") pod \"kube-apiserver-proxy-ip-10-0-137-139.ec2.internal\" (UID: \"ff565a32ff02ffe7c9262acb131f7f92\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-139.ec2.internal" Apr 20 23:12:00.774861 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.774782 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a4dcc3e36b50576b6bf701bd98ded776-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-139.ec2.internal\" (UID: \"a4dcc3e36b50576b6bf701bd98ded776\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-139.ec2.internal" Apr 20 23:12:00.774861 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.774799 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a4dcc3e36b50576b6bf701bd98ded776-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-139.ec2.internal\" (UID: \"a4dcc3e36b50576b6bf701bd98ded776\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-139.ec2.internal" Apr 20 23:12:00.866372 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:00.866334 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-139.ec2.internal\" not found" Apr 20 23:12:00.875618 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.875594 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/a4dcc3e36b50576b6bf701bd98ded776-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-139.ec2.internal\" (UID: \"a4dcc3e36b50576b6bf701bd98ded776\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-139.ec2.internal" Apr 20 23:12:00.875703 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.875625 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ff565a32ff02ffe7c9262acb131f7f92-config\") pod \"kube-apiserver-proxy-ip-10-0-137-139.ec2.internal\" (UID: \"ff565a32ff02ffe7c9262acb131f7f92\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-139.ec2.internal" Apr 20 23:12:00.875703 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.875643 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a4dcc3e36b50576b6bf701bd98ded776-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-139.ec2.internal\" (UID: \"a4dcc3e36b50576b6bf701bd98ded776\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-139.ec2.internal" Apr 20 23:12:00.875703 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.875697 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a4dcc3e36b50576b6bf701bd98ded776-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-139.ec2.internal\" (UID: \"a4dcc3e36b50576b6bf701bd98ded776\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-139.ec2.internal" Apr 20 23:12:00.875827 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.875705 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a4dcc3e36b50576b6bf701bd98ded776-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-139.ec2.internal\" (UID: \"a4dcc3e36b50576b6bf701bd98ded776\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-139.ec2.internal"
Apr 20 23:12:00.875827 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:00.875724 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ff565a32ff02ffe7c9262acb131f7f92-config\") pod \"kube-apiserver-proxy-ip-10-0-137-139.ec2.internal\" (UID: \"ff565a32ff02ffe7c9262acb131f7f92\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-139.ec2.internal"
Apr 20 23:12:00.966989 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:00.966944 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-139.ec2.internal\" not found"
Apr 20 23:12:01.064847 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:01.064760 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-139.ec2.internal"
Apr 20 23:12:01.067360 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:01.067343 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-139.ec2.internal\" not found"
Apr 20 23:12:01.069507 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:01.069495 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-139.ec2.internal"
Apr 20 23:12:01.167499 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:01.167456 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-139.ec2.internal\" not found"
Apr 20 23:12:01.268066 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:01.268033 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-139.ec2.internal\" not found"
Apr 20 23:12:01.368605 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:01.368526 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-139.ec2.internal\" not found"
Apr 20 23:12:01.467187 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:01.467167 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 23:12:01.467725 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:01.467320 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 23:12:01.469264 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:01.469242 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-139.ec2.internal\" not found"
Apr 20 23:12:01.569736 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:01.569716 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-139.ec2.internal\" not found"
Apr 20 23:12:01.574064 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:01.574040 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 23:12:01.576266 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:01.576232 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff565a32ff02ffe7c9262acb131f7f92.slice/crio-d8bb4446f79f2bb962cb6c0b6f236da01118513968df69d9ab4ae96db98182a3 WatchSource:0}: Error finding container d8bb4446f79f2bb962cb6c0b6f236da01118513968df69d9ab4ae96db98182a3: Status 404 returned error can't find the container with id d8bb4446f79f2bb962cb6c0b6f236da01118513968df69d9ab4ae96db98182a3
Apr 20 23:12:01.581420 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:01.581405 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 23:12:01.583067 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:01.583041 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 23:07:00 +0000 UTC" deadline="2028-01-14 20:09:47.398954136 +0000 UTC"
Apr 20 23:12:01.583162 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:01.583067 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15212h57m45.815890782s"
Apr 20 23:12:01.584267 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:01.584249 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 23:12:01.615474 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:01.615442 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-fc4ll"
Apr 20 23:12:01.625361 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:01.625312 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-fc4ll"
Apr 20 23:12:01.637125 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:01.637090 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4dcc3e36b50576b6bf701bd98ded776.slice/crio-acebdab578cefae87e946059860bad12b39a7d5d77c0a67e64896902e37cbbc1 WatchSource:0}: Error finding container acebdab578cefae87e946059860bad12b39a7d5d77c0a67e64896902e37cbbc1: Status 404 returned error can't find the container with id acebdab578cefae87e946059860bad12b39a7d5d77c0a67e64896902e37cbbc1
Apr 20 23:12:01.638702 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:01.638634 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-139.ec2.internal" event={"ID":"ff565a32ff02ffe7c9262acb131f7f92","Type":"ContainerStarted","Data":"d8bb4446f79f2bb962cb6c0b6f236da01118513968df69d9ab4ae96db98182a3"}
Apr 20 23:12:01.670786 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:01.670757 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-139.ec2.internal\" not found"
Apr 20 23:12:01.771301 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:01.771264 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-139.ec2.internal\" not found"
Apr 20 23:12:01.848099 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:01.848073 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 23:12:01.850155 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:01.850112 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 23:12:01.873795 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:01.873777 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-139.ec2.internal"
Apr 20 23:12:01.886342 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:01.886296 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 23:12:01.888330 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:01.888317 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-139.ec2.internal"
Apr 20 23:12:01.897307 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:01.897291 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 23:12:01.972062 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:01.971306 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 23:12:02.551666 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.551630 2576 apiserver.go:52] "Watching apiserver"
Apr 20 23:12:02.560552 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.560518 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 20 23:12:02.560971 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.560942 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-jnjdb","openshift-network-diagnostics/network-check-target-plw6c","openshift-network-operator/iptables-alerter-mmw6l","openshift-ovn-kubernetes/ovnkube-node-6675l","kube-system/konnectivity-agent-9pvbx","openshift-cluster-node-tuning-operator/tuned-jmbhc","openshift-dns/node-resolver-jsb2w","openshift-image-registry/node-ca-s6hls","openshift-multus/network-metrics-daemon-rvb5h","kube-system/kube-apiserver-proxy-ip-10-0-137-139.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t59g","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-139.ec2.internal","openshift-multus/multus-additional-cni-plugins-qdvqj"]
Apr 20 23:12:02.563988 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.563965 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jsb2w"
Apr 20 23:12:02.566361 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.566342 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-4f2gz\""
Apr 20 23:12:02.566458 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.566362 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.566661 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.566641 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 20 23:12:02.566733 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.566680 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 20 23:12:02.568532 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.568514 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9pvbx"
Apr 20 23:12:02.569825 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.569707 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 20 23:12:02.570249 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.570226 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vbkp7\""
Apr 20 23:12:02.570339 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.570259 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 20 23:12:02.571073 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.570820 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.571073 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.570945 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 20 23:12:02.571232 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.571219 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 20 23:12:02.571232 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.571225 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 20 23:12:02.571444 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.571424 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 20 23:12:02.571709 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.571644 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-jht7f\""
Apr 20 23:12:02.571785 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.571752 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 20 23:12:02.571940 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.571869 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 20 23:12:02.572904 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.572734 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 20 23:12:02.574134 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.573257 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 20 23:12:02.574134 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.573711 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 20 23:12:02.574134 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.573722 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 20 23:12:02.574320 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.574241 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qhtjr\""
Apr 20 23:12:02.576316 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.575971 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-s6hls"
Apr 20 23:12:02.576316 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.576075 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t59g"
Apr 20 23:12:02.578470 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.578453 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 20 23:12:02.578649 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.578634 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 20 23:12:02.578884 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.578868 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 20 23:12:02.579049 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.579035 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 20 23:12:02.579370 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.579349 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 20 23:12:02.579454 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.579389 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-gxzb9\""
Apr 20 23:12:02.579577 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.579563 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 20 23:12:02.579686 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.579664 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-jhf8q\""
Apr 20 23:12:02.580672 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.580594 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qdvqj"
Apr 20 23:12:02.581208 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.581168 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-plw6c"
Apr 20 23:12:02.582325 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:02.581895 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-plw6c" podUID="4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98"
Apr 20 23:12:02.584742 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.584585 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 20 23:12:02.584742 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.584725 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 20 23:12:02.584939 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.584803 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vx8r6\""
Apr 20 23:12:02.585287 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585053 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9aff2cd3-cf36-4053-81c6-5808527407bf-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8t59g\" (UID: \"9aff2cd3-cf36-4053-81c6-5808527407bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t59g"
Apr 20 23:12:02.585287 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585087 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9aff2cd3-cf36-4053-81c6-5808527407bf-registration-dir\") pod \"aws-ebs-csi-driver-node-8t59g\" (UID: \"9aff2cd3-cf36-4053-81c6-5808527407bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t59g"
Apr 20 23:12:02.585287 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585114 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9aff2cd3-cf36-4053-81c6-5808527407bf-device-dir\") pod \"aws-ebs-csi-driver-node-8t59g\" (UID: \"9aff2cd3-cf36-4053-81c6-5808527407bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t59g"
Apr 20 23:12:02.585287 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585154 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-systemd-units\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.585287 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585178 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-run-systemd\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.585287 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585214 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.585287 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585237 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/65a2a89a-c0bf-4140-bd21-e8249221ca05-env-overrides\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.585287 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585260 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7577b13a-1450-42fb-aa2f-4374c2a72406-cni-binary-copy\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.585691 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585303 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5rfx\" (UniqueName: \"kubernetes.io/projected/9aff2cd3-cf36-4053-81c6-5808527407bf-kube-api-access-z5rfx\") pod \"aws-ebs-csi-driver-node-8t59g\" (UID: \"9aff2cd3-cf36-4053-81c6-5808527407bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t59g"
Apr 20 23:12:02.585691 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585344 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/392462d6-20a2-4842-bcbc-129ba91961ef-hosts-file\") pod \"node-resolver-jsb2w\" (UID: \"392462d6-20a2-4842-bcbc-129ba91961ef\") " pod="openshift-dns/node-resolver-jsb2w"
Apr 20 23:12:02.585691 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585371 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/392462d6-20a2-4842-bcbc-129ba91961ef-tmp-dir\") pod \"node-resolver-jsb2w\" (UID: \"392462d6-20a2-4842-bcbc-129ba91961ef\") " pod="openshift-dns/node-resolver-jsb2w"
Apr 20 23:12:02.585691 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585397 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-host-run-ovn-kubernetes\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.585691 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585435 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-system-cni-dir\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.585691 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585461 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-multus-conf-dir\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.585691 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585484 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-node-log\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.585691 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585507 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/65a2a89a-c0bf-4140-bd21-e8249221ca05-ovnkube-script-lib\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.585691 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585528 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjcsr\" (UniqueName: \"kubernetes.io/projected/65a2a89a-c0bf-4140-bd21-e8249221ca05-kube-api-access-wjcsr\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.585691 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585543 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c89ec650-7091-4fbe-a329-ba849fc5e589-agent-certs\") pod \"konnectivity-agent-9pvbx\" (UID: \"c89ec650-7091-4fbe-a329-ba849fc5e589\") " pod="kube-system/konnectivity-agent-9pvbx"
Apr 20 23:12:02.585691 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585586 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-os-release\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.585691 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585601 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-host-run-k8s-cni-cncf-io\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.585691 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585633 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-host-run-netns\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.585691 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585655 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-host-run-multus-certs\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.586266 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585696 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsckc\" (UniqueName: \"kubernetes.io/projected/392462d6-20a2-4842-bcbc-129ba91961ef-kube-api-access-tsckc\") pod \"node-resolver-jsb2w\" (UID: \"392462d6-20a2-4842-bcbc-129ba91961ef\") " pod="openshift-dns/node-resolver-jsb2w"
Apr 20 23:12:02.586266 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585734 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-host-kubelet\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.586266 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585780 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-etc-openvswitch\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.586266 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585823 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-cnibin\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.586266 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585850 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-host-var-lib-cni-bin\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.586266 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585873 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-hostroot\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.586266 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585897 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5t92\" (UniqueName: \"kubernetes.io/projected/1652a387-e617-4579-bb1d-4fab03dacaed-kube-api-access-b5t92\") pod \"node-ca-s6hls\" (UID: \"1652a387-e617-4579-bb1d-4fab03dacaed\") " pod="openshift-image-registry/node-ca-s6hls"
Apr 20 23:12:02.586266 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585940 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9aff2cd3-cf36-4053-81c6-5808527407bf-sys-fs\") pod \"aws-ebs-csi-driver-node-8t59g\" (UID: \"9aff2cd3-cf36-4053-81c6-5808527407bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t59g"
Apr 20 23:12:02.586266 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.585978 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-host-slash\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.586266 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.586014 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-host-run-netns\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.586266 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.586037 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-host-var-lib-kubelet\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.586266 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.586061 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-etc-kubernetes\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.586266 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.586093 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-host-cni-netd\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.586266 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.586120 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/65a2a89a-c0bf-4140-bd21-e8249221ca05-ovnkube-config\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.586266 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.586164 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c89ec650-7091-4fbe-a329-ba849fc5e589-konnectivity-ca\") pod \"konnectivity-agent-9pvbx\" (UID: \"c89ec650-7091-4fbe-a329-ba849fc5e589\") " pod="kube-system/konnectivity-agent-9pvbx"
Apr 20 23:12:02.586266 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.586204 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-multus-socket-dir-parent\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.586266 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.586218 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7577b13a-1450-42fb-aa2f-4374c2a72406-multus-daemon-config\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.586880 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.586248 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9aff2cd3-cf36-4053-81c6-5808527407bf-etc-selinux\") pod \"aws-ebs-csi-driver-node-8t59g\" (UID: \"9aff2cd3-cf36-4053-81c6-5808527407bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t59g"
Apr 20 23:12:02.586880 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.586277 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-var-lib-openvswitch\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.586880 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.586307 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-log-socket\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.586880 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.586349 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-host-cni-bin\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.586880 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.586375 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-multus-cni-dir\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.586880 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.586407 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-host-var-lib-cni-multus\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.586880 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.586452 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1652a387-e617-4579-bb1d-4fab03dacaed-host\") pod \"node-ca-s6hls\" (UID: \"1652a387-e617-4579-bb1d-4fab03dacaed\") " pod="openshift-image-registry/node-ca-s6hls"
Apr 20 23:12:02.586880 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.586473 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1652a387-e617-4579-bb1d-4fab03dacaed-serviceca\") pod \"node-ca-s6hls\" (UID: \"1652a387-e617-4579-bb1d-4fab03dacaed\") " pod="openshift-image-registry/node-ca-s6hls"
Apr 20 23:12:02.586880 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.586522 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9aff2cd3-cf36-4053-81c6-5808527407bf-socket-dir\") pod \"aws-ebs-csi-driver-node-8t59g\" (UID: \"9aff2cd3-cf36-4053-81c6-5808527407bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t59g"
Apr 20 23:12:02.586880 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.586557 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-run-openvswitch\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.586880 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.586596 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-run-ovn\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.586880 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.586619 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/65a2a89a-c0bf-4140-bd21-e8249221ca05-ovn-node-metrics-cert\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.586880 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.586643 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffrsc\" (UniqueName: \"kubernetes.io/projected/7577b13a-1450-42fb-aa2f-4374c2a72406-kube-api-access-ffrsc\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.587323 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.587031 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jmbhc"
Apr 20 23:12:02.589223 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.589203 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 20 23:12:02.589517 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.589497 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rvb5h"
Apr 20 23:12:02.589517 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.589513 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-operator/iptables-alerter-mmw6l" Apr 20 23:12:02.589648 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.589557 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 23:12:02.589648 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:02.589600 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rvb5h" podUID="e6e1d353-f530-4ad5-a0ae-b436e227eb58" Apr 20 23:12:02.589807 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.589786 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-cljzk\"" Apr 20 23:12:02.592354 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.592031 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 23:12:02.592354 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.592055 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 23:12:02.592354 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.592076 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 23:12:02.592354 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.592169 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-rsfgq\"" Apr 20 23:12:02.605901 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.605878 2576 
reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 23:12:02.625905 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.625877 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 23:07:01 +0000 UTC" deadline="2028-01-29 22:18:36.609540814 +0000 UTC" Apr 20 23:12:02.626000 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.625964 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15575h6m33.983581565s" Apr 20 23:12:02.640654 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.640616 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-139.ec2.internal" event={"ID":"a4dcc3e36b50576b6bf701bd98ded776","Type":"ContainerStarted","Data":"acebdab578cefae87e946059860bad12b39a7d5d77c0a67e64896902e37cbbc1"} Apr 20 23:12:02.674724 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.674698 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 23:12:02.687703 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.687670 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-host-cni-netd\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l" Apr 20 23:12:02.687860 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.687710 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c89ec650-7091-4fbe-a329-ba849fc5e589-konnectivity-ca\") pod \"konnectivity-agent-9pvbx\" (UID: \"c89ec650-7091-4fbe-a329-ba849fc5e589\") " pod="kube-system/konnectivity-agent-9pvbx" Apr 20 
23:12:02.687860 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.687739 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-multus-socket-dir-parent\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb" Apr 20 23:12:02.687860 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.687761 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9aff2cd3-cf36-4053-81c6-5808527407bf-etc-selinux\") pod \"aws-ebs-csi-driver-node-8t59g\" (UID: \"9aff2cd3-cf36-4053-81c6-5808527407bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t59g" Apr 20 23:12:02.687860 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.687782 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-host-var-lib-cni-multus\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb" Apr 20 23:12:02.687860 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.687785 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-host-cni-netd\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l" Apr 20 23:12:02.687860 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.687849 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9aff2cd3-cf36-4053-81c6-5808527407bf-etc-selinux\") pod \"aws-ebs-csi-driver-node-8t59g\" (UID: \"9aff2cd3-cf36-4053-81c6-5808527407bf\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t59g" Apr 20 23:12:02.687860 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.687851 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-multus-socket-dir-parent\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb" Apr 20 23:12:02.688233 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.687797 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9aff2cd3-cf36-4053-81c6-5808527407bf-socket-dir\") pod \"aws-ebs-csi-driver-node-8t59g\" (UID: \"9aff2cd3-cf36-4053-81c6-5808527407bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t59g" Apr 20 23:12:02.688233 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.687888 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-run-openvswitch\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l" Apr 20 23:12:02.688233 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.687903 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9aff2cd3-cf36-4053-81c6-5808527407bf-registration-dir\") pod \"aws-ebs-csi-driver-node-8t59g\" (UID: \"9aff2cd3-cf36-4053-81c6-5808527407bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t59g" Apr 20 23:12:02.688233 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.687921 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/97d5b486-1141-4ce1-b800-263ccf62a8cd-cnibin\") pod \"multus-additional-cni-plugins-qdvqj\" (UID: \"97d5b486-1141-4ce1-b800-263ccf62a8cd\") " pod="openshift-multus/multus-additional-cni-plugins-qdvqj" Apr 20 23:12:02.688233 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.687938 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9aff2cd3-cf36-4053-81c6-5808527407bf-socket-dir\") pod \"aws-ebs-csi-driver-node-8t59g\" (UID: \"9aff2cd3-cf36-4053-81c6-5808527407bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t59g" Apr 20 23:12:02.688233 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.687946 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-etc-modprobe-d\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc" Apr 20 23:12:02.688233 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.687972 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-etc-sysctl-conf\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc" Apr 20 23:12:02.688233 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.687994 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-etc-tuned\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc" Apr 20 23:12:02.688233 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688017 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-host-var-lib-cni-multus\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb" Apr 20 23:12:02.688233 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688019 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swxl6\" (UniqueName: \"kubernetes.io/projected/e6e1d353-f530-4ad5-a0ae-b436e227eb58-kube-api-access-swxl6\") pod \"network-metrics-daemon-rvb5h\" (UID: \"e6e1d353-f530-4ad5-a0ae-b436e227eb58\") " pod="openshift-multus/network-metrics-daemon-rvb5h" Apr 20 23:12:02.688233 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688041 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-run-openvswitch\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l" Apr 20 23:12:02.688233 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688043 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-run-systemd\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l" Apr 20 23:12:02.688233 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688066 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-run-systemd\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l" Apr 20 23:12:02.688233 ip-10-0-137-139 kubenswrapper[2576]: 
I0420 23:12:02.688076 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l" Apr 20 23:12:02.688233 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688098 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/65a2a89a-c0bf-4140-bd21-e8249221ca05-env-overrides\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l" Apr 20 23:12:02.688233 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.687996 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9aff2cd3-cf36-4053-81c6-5808527407bf-registration-dir\") pod \"aws-ebs-csi-driver-node-8t59g\" (UID: \"9aff2cd3-cf36-4053-81c6-5808527407bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t59g" Apr 20 23:12:02.688233 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688114 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7577b13a-1450-42fb-aa2f-4374c2a72406-cni-binary-copy\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb" Apr 20 23:12:02.688996 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688161 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/392462d6-20a2-4842-bcbc-129ba91961ef-hosts-file\") pod \"node-resolver-jsb2w\" (UID: \"392462d6-20a2-4842-bcbc-129ba91961ef\") " pod="openshift-dns/node-resolver-jsb2w" Apr 20 
23:12:02.688996 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688188 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-system-cni-dir\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb" Apr 20 23:12:02.688996 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688212 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-multus-conf-dir\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb" Apr 20 23:12:02.688996 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688228 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-host\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc" Apr 20 23:12:02.688996 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688262 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/65a2a89a-c0bf-4140-bd21-e8249221ca05-ovnkube-script-lib\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l" Apr 20 23:12:02.688996 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688282 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c89ec650-7091-4fbe-a329-ba849fc5e589-agent-certs\") pod \"konnectivity-agent-9pvbx\" (UID: \"c89ec650-7091-4fbe-a329-ba849fc5e589\") " pod="kube-system/konnectivity-agent-9pvbx" Apr 20 23:12:02.688996 
ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688297 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-host-run-netns\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb" Apr 20 23:12:02.688996 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688341 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b5t92\" (UniqueName: \"kubernetes.io/projected/1652a387-e617-4579-bb1d-4fab03dacaed-kube-api-access-b5t92\") pod \"node-ca-s6hls\" (UID: \"1652a387-e617-4579-bb1d-4fab03dacaed\") " pod="openshift-image-registry/node-ca-s6hls" Apr 20 23:12:02.688996 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688362 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-run\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc" Apr 20 23:12:02.688996 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688391 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-lib-modules\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc" Apr 20 23:12:02.688996 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688389 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c89ec650-7091-4fbe-a329-ba849fc5e589-konnectivity-ca\") pod \"konnectivity-agent-9pvbx\" (UID: \"c89ec650-7091-4fbe-a329-ba849fc5e589\") " pod="kube-system/konnectivity-agent-9pvbx" Apr 20 
23:12:02.688996 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688407 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldcrn\" (UniqueName: \"kubernetes.io/projected/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-kube-api-access-ldcrn\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc" Apr 20 23:12:02.688996 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688431 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-host-var-lib-cni-bin\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb" Apr 20 23:12:02.688996 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688452 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/97d5b486-1141-4ce1-b800-263ccf62a8cd-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qdvqj\" (UID: \"97d5b486-1141-4ce1-b800-263ccf62a8cd\") " pod="openshift-multus/multus-additional-cni-plugins-qdvqj" Apr 20 23:12:02.688996 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688479 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6e1d353-f530-4ad5-a0ae-b436e227eb58-metrics-certs\") pod \"network-metrics-daemon-rvb5h\" (UID: \"e6e1d353-f530-4ad5-a0ae-b436e227eb58\") " pod="openshift-multus/network-metrics-daemon-rvb5h" Apr 20 23:12:02.688996 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688482 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-host-run-netns\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb" Apr 20 23:12:02.688996 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688505 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-host-slash\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l" Apr 20 23:12:02.689765 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688555 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-host-var-lib-cni-bin\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb" Apr 20 23:12:02.689765 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688565 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7577b13a-1450-42fb-aa2f-4374c2a72406-cni-binary-copy\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb" Apr 20 23:12:02.689765 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688791 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l" Apr 20 23:12:02.689765 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688834 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 23:12:02.689765 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688849 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-multus-conf-dir\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb" Apr 20 23:12:02.689765 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688904 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/392462d6-20a2-4842-bcbc-129ba91961ef-hosts-file\") pod \"node-resolver-jsb2w\" (UID: \"392462d6-20a2-4842-bcbc-129ba91961ef\") " pod="openshift-dns/node-resolver-jsb2w" Apr 20 23:12:02.689765 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688945 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-host-run-netns\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l" Apr 20 23:12:02.689765 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688986 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/97d5b486-1141-4ce1-b800-263ccf62a8cd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qdvqj\" (UID: \"97d5b486-1141-4ce1-b800-263ccf62a8cd\") " pod="openshift-multus/multus-additional-cni-plugins-qdvqj" Apr 20 23:12:02.689765 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689018 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/65a2a89a-c0bf-4140-bd21-e8249221ca05-ovnkube-config\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l" Apr 20 23:12:02.689765 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689047 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7577b13a-1450-42fb-aa2f-4374c2a72406-multus-daemon-config\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb" Apr 20 23:12:02.689765 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689057 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/65a2a89a-c0bf-4140-bd21-e8249221ca05-ovnkube-script-lib\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l" Apr 20 23:12:02.689765 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689119 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-host-slash\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l" Apr 20 23:12:02.689765 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689155 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-host-run-netns\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l" Apr 20 23:12:02.689765 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689171 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/97d5b486-1141-4ce1-b800-263ccf62a8cd-system-cni-dir\") pod \"multus-additional-cni-plugins-qdvqj\" (UID: \"97d5b486-1141-4ce1-b800-263ccf62a8cd\") " pod="openshift-multus/multus-additional-cni-plugins-qdvqj"
Apr 20 23:12:02.689765 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689223 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/65a2a89a-c0bf-4140-bd21-e8249221ca05-env-overrides\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.689765 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.688838 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-system-cni-dir\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.689765 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689344 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx5qt\" (UniqueName: \"kubernetes.io/projected/4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98-kube-api-access-mx5qt\") pod \"network-check-target-plw6c\" (UID: \"4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98\") " pod="openshift-network-diagnostics/network-check-target-plw6c"
Apr 20 23:12:02.689765 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689384 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-etc-sysconfig\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc"
Apr 20 23:12:02.690537 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689424 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-var-lib-openvswitch\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.690537 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689450 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-log-socket\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.690537 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689484 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-host-cni-bin\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.690537 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689511 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-multus-cni-dir\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.690537 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689540 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7577b13a-1450-42fb-aa2f-4374c2a72406-multus-daemon-config\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.690537 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689549 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-log-socket\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.690537 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689540 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-host-cni-bin\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.690537 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689577 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/65a2a89a-c0bf-4140-bd21-e8249221ca05-ovnkube-config\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.690537 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689583 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1652a387-e617-4579-bb1d-4fab03dacaed-host\") pod \"node-ca-s6hls\" (UID: \"1652a387-e617-4579-bb1d-4fab03dacaed\") " pod="openshift-image-registry/node-ca-s6hls"
Apr 20 23:12:02.690537 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689586 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-var-lib-openvswitch\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.690537 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689607 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-multus-cni-dir\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.690537 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689623 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1652a387-e617-4579-bb1d-4fab03dacaed-host\") pod \"node-ca-s6hls\" (UID: \"1652a387-e617-4579-bb1d-4fab03dacaed\") " pod="openshift-image-registry/node-ca-s6hls"
Apr 20 23:12:02.690537 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689626 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1652a387-e617-4579-bb1d-4fab03dacaed-serviceca\") pod \"node-ca-s6hls\" (UID: \"1652a387-e617-4579-bb1d-4fab03dacaed\") " pod="openshift-image-registry/node-ca-s6hls"
Apr 20 23:12:02.690537 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689658 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9aff2cd3-cf36-4053-81c6-5808527407bf-device-dir\") pod \"aws-ebs-csi-driver-node-8t59g\" (UID: \"9aff2cd3-cf36-4053-81c6-5808527407bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t59g"
Apr 20 23:12:02.690537 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689733 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-etc-kubernetes\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc"
Apr 20 23:12:02.690537 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689792 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9aff2cd3-cf36-4053-81c6-5808527407bf-device-dir\") pod \"aws-ebs-csi-driver-node-8t59g\" (UID: \"9aff2cd3-cf36-4053-81c6-5808527407bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t59g"
Apr 20 23:12:02.690537 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689846 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-run-ovn\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.690537 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689876 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/65a2a89a-c0bf-4140-bd21-e8249221ca05-ovn-node-metrics-cert\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.691219 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689902 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ffrsc\" (UniqueName: \"kubernetes.io/projected/7577b13a-1450-42fb-aa2f-4374c2a72406-kube-api-access-ffrsc\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.691219 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689921 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9aff2cd3-cf36-4053-81c6-5808527407bf-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8t59g\" (UID: \"9aff2cd3-cf36-4053-81c6-5808527407bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t59g"
Apr 20 23:12:02.691219 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689933 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-run-ovn\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.691219 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689944 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/97d5b486-1141-4ce1-b800-263ccf62a8cd-cni-binary-copy\") pod \"multus-additional-cni-plugins-qdvqj\" (UID: \"97d5b486-1141-4ce1-b800-263ccf62a8cd\") " pod="openshift-multus/multus-additional-cni-plugins-qdvqj"
Apr 20 23:12:02.691219 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689990 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9aff2cd3-cf36-4053-81c6-5808527407bf-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8t59g\" (UID: \"9aff2cd3-cf36-4053-81c6-5808527407bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t59g"
Apr 20 23:12:02.691219 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.689995 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-etc-systemd\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc"
Apr 20 23:12:02.691219 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690030 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-systemd-units\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.691219 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690067 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-systemd-units\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.691219 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690080 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z5rfx\" (UniqueName: \"kubernetes.io/projected/9aff2cd3-cf36-4053-81c6-5808527407bf-kube-api-access-z5rfx\") pod \"aws-ebs-csi-driver-node-8t59g\" (UID: \"9aff2cd3-cf36-4053-81c6-5808527407bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t59g"
Apr 20 23:12:02.691219 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690110 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/97d5b486-1141-4ce1-b800-263ccf62a8cd-os-release\") pod \"multus-additional-cni-plugins-qdvqj\" (UID: \"97d5b486-1141-4ce1-b800-263ccf62a8cd\") " pod="openshift-multus/multus-additional-cni-plugins-qdvqj"
Apr 20 23:12:02.691219 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690150 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-tmp\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc"
Apr 20 23:12:02.691219 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690184 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21bb60a5-8c2e-4d57-bf71-b44519293c10-host-slash\") pod \"iptables-alerter-mmw6l\" (UID: \"21bb60a5-8c2e-4d57-bf71-b44519293c10\") " pod="openshift-network-operator/iptables-alerter-mmw6l"
Apr 20 23:12:02.691219 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690217 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/392462d6-20a2-4842-bcbc-129ba91961ef-tmp-dir\") pod \"node-resolver-jsb2w\" (UID: \"392462d6-20a2-4842-bcbc-129ba91961ef\") " pod="openshift-dns/node-resolver-jsb2w"
Apr 20 23:12:02.691219 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690252 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-host-run-ovn-kubernetes\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.691219 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690279 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/21bb60a5-8c2e-4d57-bf71-b44519293c10-iptables-alerter-script\") pod \"iptables-alerter-mmw6l\" (UID: \"21bb60a5-8c2e-4d57-bf71-b44519293c10\") " pod="openshift-network-operator/iptables-alerter-mmw6l"
Apr 20 23:12:02.691219 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690308 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-node-log\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.691219 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690328 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-host-run-ovn-kubernetes\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.691805 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690333 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wjcsr\" (UniqueName: \"kubernetes.io/projected/65a2a89a-c0bf-4140-bd21-e8249221ca05-kube-api-access-wjcsr\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.691805 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690364 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-os-release\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.691805 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690388 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-host-run-k8s-cni-cncf-io\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.691805 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690411 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-host-run-multus-certs\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.691805 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690439 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/97d5b486-1141-4ce1-b800-263ccf62a8cd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qdvqj\" (UID: \"97d5b486-1141-4ce1-b800-263ccf62a8cd\") " pod="openshift-multus/multus-additional-cni-plugins-qdvqj"
Apr 20 23:12:02.691805 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690466 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glclv\" (UniqueName: \"kubernetes.io/projected/97d5b486-1141-4ce1-b800-263ccf62a8cd-kube-api-access-glclv\") pod \"multus-additional-cni-plugins-qdvqj\" (UID: \"97d5b486-1141-4ce1-b800-263ccf62a8cd\") " pod="openshift-multus/multus-additional-cni-plugins-qdvqj"
Apr 20 23:12:02.691805 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690491 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-etc-sysctl-d\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc"
Apr 20 23:12:02.691805 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690517 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsckc\" (UniqueName: \"kubernetes.io/projected/392462d6-20a2-4842-bcbc-129ba91961ef-kube-api-access-tsckc\") pod \"node-resolver-jsb2w\" (UID: \"392462d6-20a2-4842-bcbc-129ba91961ef\") " pod="openshift-dns/node-resolver-jsb2w"
Apr 20 23:12:02.691805 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690543 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-host-kubelet\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.691805 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690554 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1652a387-e617-4579-bb1d-4fab03dacaed-serviceca\") pod \"node-ca-s6hls\" (UID: \"1652a387-e617-4579-bb1d-4fab03dacaed\") " pod="openshift-image-registry/node-ca-s6hls"
Apr 20 23:12:02.691805 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690568 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-etc-openvswitch\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.691805 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690592 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-cnibin\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.691805 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690616 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-os-release\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.691805 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690625 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-host-run-k8s-cni-cncf-io\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.691805 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690558 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-node-log\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.691805 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690661 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-host-kubelet\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.691805 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690617 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-hostroot\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.691805 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690687 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65a2a89a-c0bf-4140-bd21-e8249221ca05-etc-openvswitch\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.692452 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690703 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9aff2cd3-cf36-4053-81c6-5808527407bf-sys-fs\") pod \"aws-ebs-csi-driver-node-8t59g\" (UID: \"9aff2cd3-cf36-4053-81c6-5808527407bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t59g"
Apr 20 23:12:02.692452 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690654 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-hostroot\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.692452 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690730 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-sys\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc"
Apr 20 23:12:02.692452 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690755 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-var-lib-kubelet\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc"
Apr 20 23:12:02.692452 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690768 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9aff2cd3-cf36-4053-81c6-5808527407bf-sys-fs\") pod \"aws-ebs-csi-driver-node-8t59g\" (UID: \"9aff2cd3-cf36-4053-81c6-5808527407bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t59g"
Apr 20 23:12:02.692452 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690782 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-host-var-lib-kubelet\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.692452 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690806 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-etc-kubernetes\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.692452 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690813 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-cnibin\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.692452 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690840 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csjcj\" (UniqueName: \"kubernetes.io/projected/21bb60a5-8c2e-4d57-bf71-b44519293c10-kube-api-access-csjcj\") pod \"iptables-alerter-mmw6l\" (UID: \"21bb60a5-8c2e-4d57-bf71-b44519293c10\") " pod="openshift-network-operator/iptables-alerter-mmw6l"
Apr 20 23:12:02.692452 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690860 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-host-run-multus-certs\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.692452 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690861 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-host-var-lib-kubelet\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.692452 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.690894 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7577b13a-1450-42fb-aa2f-4374c2a72406-etc-kubernetes\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.692452 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.691169 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/392462d6-20a2-4842-bcbc-129ba91961ef-tmp-dir\") pod \"node-resolver-jsb2w\" (UID: \"392462d6-20a2-4842-bcbc-129ba91961ef\") " pod="openshift-dns/node-resolver-jsb2w"
Apr 20 23:12:02.692452 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.692308 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c89ec650-7091-4fbe-a329-ba849fc5e589-agent-certs\") pod \"konnectivity-agent-9pvbx\" (UID: \"c89ec650-7091-4fbe-a329-ba849fc5e589\") " pod="kube-system/konnectivity-agent-9pvbx"
Apr 20 23:12:02.692924 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.692480 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/65a2a89a-c0bf-4140-bd21-e8249221ca05-ovn-node-metrics-cert\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.699314 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.699242 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjcsr\" (UniqueName: \"kubernetes.io/projected/65a2a89a-c0bf-4140-bd21-e8249221ca05-kube-api-access-wjcsr\") pod \"ovnkube-node-6675l\" (UID: \"65a2a89a-c0bf-4140-bd21-e8249221ca05\") " pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:02.699587 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.699545 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsckc\" (UniqueName: \"kubernetes.io/projected/392462d6-20a2-4842-bcbc-129ba91961ef-kube-api-access-tsckc\") pod \"node-resolver-jsb2w\" (UID: \"392462d6-20a2-4842-bcbc-129ba91961ef\") " pod="openshift-dns/node-resolver-jsb2w"
Apr 20 23:12:02.699896 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.699867 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffrsc\" (UniqueName: \"kubernetes.io/projected/7577b13a-1450-42fb-aa2f-4374c2a72406-kube-api-access-ffrsc\") pod \"multus-jnjdb\" (UID: \"7577b13a-1450-42fb-aa2f-4374c2a72406\") " pod="openshift-multus/multus-jnjdb"
Apr 20 23:12:02.699970 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.699898 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5rfx\" (UniqueName: \"kubernetes.io/projected/9aff2cd3-cf36-4053-81c6-5808527407bf-kube-api-access-z5rfx\") pod \"aws-ebs-csi-driver-node-8t59g\" (UID: \"9aff2cd3-cf36-4053-81c6-5808527407bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t59g"
Apr 20 23:12:02.701014 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.700996 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5t92\" (UniqueName: \"kubernetes.io/projected/1652a387-e617-4579-bb1d-4fab03dacaed-kube-api-access-b5t92\") pod \"node-ca-s6hls\" (UID: \"1652a387-e617-4579-bb1d-4fab03dacaed\") " pod="openshift-image-registry/node-ca-s6hls"
Apr 20 23:12:02.791586 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.791556 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/97d5b486-1141-4ce1-b800-263ccf62a8cd-os-release\") pod \"multus-additional-cni-plugins-qdvqj\" (UID: \"97d5b486-1141-4ce1-b800-263ccf62a8cd\") " pod="openshift-multus/multus-additional-cni-plugins-qdvqj"
Apr 20 23:12:02.791746 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.791596 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-tmp\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc"
Apr 20 23:12:02.791746 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.791613 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21bb60a5-8c2e-4d57-bf71-b44519293c10-host-slash\") pod \"iptables-alerter-mmw6l\" (UID: \"21bb60a5-8c2e-4d57-bf71-b44519293c10\") " pod="openshift-network-operator/iptables-alerter-mmw6l"
Apr 20 23:12:02.791746 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.791639 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/21bb60a5-8c2e-4d57-bf71-b44519293c10-iptables-alerter-script\") pod \"iptables-alerter-mmw6l\" (UID: \"21bb60a5-8c2e-4d57-bf71-b44519293c10\") " pod="openshift-network-operator/iptables-alerter-mmw6l"
Apr 20 23:12:02.791746 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.791664 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/97d5b486-1141-4ce1-b800-263ccf62a8cd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qdvqj\" (UID: \"97d5b486-1141-4ce1-b800-263ccf62a8cd\") " pod="openshift-multus/multus-additional-cni-plugins-qdvqj"
Apr 20 23:12:02.791746 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.791697 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-glclv\" (UniqueName: \"kubernetes.io/projected/97d5b486-1141-4ce1-b800-263ccf62a8cd-kube-api-access-glclv\") pod \"multus-additional-cni-plugins-qdvqj\" (UID: \"97d5b486-1141-4ce1-b800-263ccf62a8cd\") " pod="openshift-multus/multus-additional-cni-plugins-qdvqj"
Apr 20 23:12:02.791746 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.791705 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/97d5b486-1141-4ce1-b800-263ccf62a8cd-os-release\") pod \"multus-additional-cni-plugins-qdvqj\" (UID: \"97d5b486-1141-4ce1-b800-263ccf62a8cd\") " pod="openshift-multus/multus-additional-cni-plugins-qdvqj"
Apr 20 23:12:02.791746 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.791727 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-etc-sysctl-d\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc"
Apr 20 23:12:02.792071 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.791743 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21bb60a5-8c2e-4d57-bf71-b44519293c10-host-slash\") pod \"iptables-alerter-mmw6l\" (UID: \"21bb60a5-8c2e-4d57-bf71-b44519293c10\") " pod="openshift-network-operator/iptables-alerter-mmw6l"
Apr 20 23:12:02.792071 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.791765 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-sys\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc"
Apr 20 23:12:02.792071 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.791791 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-var-lib-kubelet\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc"
Apr 20 23:12:02.792071 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.791820 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csjcj\" (UniqueName: \"kubernetes.io/projected/21bb60a5-8c2e-4d57-bf71-b44519293c10-kube-api-access-csjcj\") pod \"iptables-alerter-mmw6l\" (UID: \"21bb60a5-8c2e-4d57-bf71-b44519293c10\") " pod="openshift-network-operator/iptables-alerter-mmw6l"
Apr 20 23:12:02.792071 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.791851 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/97d5b486-1141-4ce1-b800-263ccf62a8cd-cnibin\") pod \"multus-additional-cni-plugins-qdvqj\" (UID: \"97d5b486-1141-4ce1-b800-263ccf62a8cd\") " pod="openshift-multus/multus-additional-cni-plugins-qdvqj"
Apr 20 23:12:02.792071 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.791860 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-etc-sysctl-d\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc"
Apr 20 23:12:02.792071 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.791875 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-etc-modprobe-d\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc"
Apr 20 23:12:02.792071 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.791901 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-etc-sysctl-conf\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc"
Apr 20 23:12:02.792071 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.791909 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/97d5b486-1141-4ce1-b800-263ccf62a8cd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qdvqj\" (UID: \"97d5b486-1141-4ce1-b800-263ccf62a8cd\") " pod="openshift-multus/multus-additional-cni-plugins-qdvqj"
Apr 20 23:12:02.792071 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.791918 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-etc-tuned\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc"
Apr 20 23:12:02.792071 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.791939 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swxl6\" (UniqueName: \"kubernetes.io/projected/e6e1d353-f530-4ad5-a0ae-b436e227eb58-kube-api-access-swxl6\") pod \"network-metrics-daemon-rvb5h\" (UID: \"e6e1d353-f530-4ad5-a0ae-b436e227eb58\") " pod="openshift-multus/network-metrics-daemon-rvb5h"
Apr 20 23:12:02.792071 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.791964 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-host\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc"
Apr 20 23:12:02.792071 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.792003 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-run\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc"
Apr 20 23:12:02.792071 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.792021 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-lib-modules\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc"
Apr 20 23:12:02.792071 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.792042 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldcrn\" (UniqueName: \"kubernetes.io/projected/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-kube-api-access-ldcrn\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc"
Apr 20 23:12:02.792071 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.792059 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/97d5b486-1141-4ce1-b800-263ccf62a8cd-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qdvqj\" (UID: \"97d5b486-1141-4ce1-b800-263ccf62a8cd\") " pod="openshift-multus/multus-additional-cni-plugins-qdvqj"
Apr 20 23:12:02.792071 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.792078 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6e1d353-f530-4ad5-a0ae-b436e227eb58-metrics-certs\") pod \"network-metrics-daemon-rvb5h\" (UID: \"e6e1d353-f530-4ad5-a0ae-b436e227eb58\") " pod="openshift-multus/network-metrics-daemon-rvb5h"
Apr 20 23:12:02.792816 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.792096 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName:
\"kubernetes.io/configmap/97d5b486-1141-4ce1-b800-263ccf62a8cd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qdvqj\" (UID: \"97d5b486-1141-4ce1-b800-263ccf62a8cd\") " pod="openshift-multus/multus-additional-cni-plugins-qdvqj" Apr 20 23:12:02.792816 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.792125 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97d5b486-1141-4ce1-b800-263ccf62a8cd-system-cni-dir\") pod \"multus-additional-cni-plugins-qdvqj\" (UID: \"97d5b486-1141-4ce1-b800-263ccf62a8cd\") " pod="openshift-multus/multus-additional-cni-plugins-qdvqj" Apr 20 23:12:02.792816 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.792193 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-sys\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc" Apr 20 23:12:02.792816 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.792239 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/97d5b486-1141-4ce1-b800-263ccf62a8cd-cnibin\") pod \"multus-additional-cni-plugins-qdvqj\" (UID: \"97d5b486-1141-4ce1-b800-263ccf62a8cd\") " pod="openshift-multus/multus-additional-cni-plugins-qdvqj" Apr 20 23:12:02.792816 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.792238 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-var-lib-kubelet\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc" Apr 20 23:12:02.792816 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.792286 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"run\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-run\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc" Apr 20 23:12:02.792816 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.792199 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mx5qt\" (UniqueName: \"kubernetes.io/projected/4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98-kube-api-access-mx5qt\") pod \"network-check-target-plw6c\" (UID: \"4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98\") " pod="openshift-network-diagnostics/network-check-target-plw6c" Apr 20 23:12:02.792816 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.792349 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-lib-modules\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc" Apr 20 23:12:02.792816 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.792365 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-etc-sysconfig\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc" Apr 20 23:12:02.792816 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.792384 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-etc-modprobe-d\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc" Apr 20 23:12:02.792816 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.792415 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-etc-kubernetes\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc" Apr 20 23:12:02.792816 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.792465 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/21bb60a5-8c2e-4d57-bf71-b44519293c10-iptables-alerter-script\") pod \"iptables-alerter-mmw6l\" (UID: \"21bb60a5-8c2e-4d57-bf71-b44519293c10\") " pod="openshift-network-operator/iptables-alerter-mmw6l" Apr 20 23:12:02.792816 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.792468 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-etc-sysconfig\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc" Apr 20 23:12:02.792816 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.792470 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-etc-kubernetes\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc" Apr 20 23:12:02.792816 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.792510 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-host\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc" Apr 20 23:12:02.792816 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.792582 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-etc-sysctl-conf\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc" Apr 20 23:12:02.792816 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:02.792589 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 23:12:02.792816 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.792638 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/97d5b486-1141-4ce1-b800-263ccf62a8cd-cni-binary-copy\") pod \"multus-additional-cni-plugins-qdvqj\" (UID: \"97d5b486-1141-4ce1-b800-263ccf62a8cd\") " pod="openshift-multus/multus-additional-cni-plugins-qdvqj" Apr 20 23:12:02.793727 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:02.792889 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6e1d353-f530-4ad5-a0ae-b436e227eb58-metrics-certs podName:e6e1d353-f530-4ad5-a0ae-b436e227eb58 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:03.29286775 +0000 UTC m=+3.237577187 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e6e1d353-f530-4ad5-a0ae-b436e227eb58-metrics-certs") pod "network-metrics-daemon-rvb5h" (UID: "e6e1d353-f530-4ad5-a0ae-b436e227eb58") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 23:12:02.793727 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.792914 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-etc-systemd\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc" Apr 20 23:12:02.793727 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.792914 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97d5b486-1141-4ce1-b800-263ccf62a8cd-system-cni-dir\") pod \"multus-additional-cni-plugins-qdvqj\" (UID: \"97d5b486-1141-4ce1-b800-263ccf62a8cd\") " pod="openshift-multus/multus-additional-cni-plugins-qdvqj" Apr 20 23:12:02.793727 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.793000 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-etc-systemd\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc" Apr 20 23:12:02.793727 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.793205 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/97d5b486-1141-4ce1-b800-263ccf62a8cd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qdvqj\" (UID: \"97d5b486-1141-4ce1-b800-263ccf62a8cd\") " pod="openshift-multus/multus-additional-cni-plugins-qdvqj" Apr 20 23:12:02.793727 ip-10-0-137-139 kubenswrapper[2576]: 
I0420 23:12:02.793280 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/97d5b486-1141-4ce1-b800-263ccf62a8cd-cni-binary-copy\") pod \"multus-additional-cni-plugins-qdvqj\" (UID: \"97d5b486-1141-4ce1-b800-263ccf62a8cd\") " pod="openshift-multus/multus-additional-cni-plugins-qdvqj" Apr 20 23:12:02.793727 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.793296 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/97d5b486-1141-4ce1-b800-263ccf62a8cd-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qdvqj\" (UID: \"97d5b486-1141-4ce1-b800-263ccf62a8cd\") " pod="openshift-multus/multus-additional-cni-plugins-qdvqj" Apr 20 23:12:02.794239 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.794213 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-tmp\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc" Apr 20 23:12:02.794607 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.794589 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-etc-tuned\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc" Apr 20 23:12:02.801717 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:02.801673 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 23:12:02.801717 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:02.801696 2576 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 23:12:02.801717 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:02.801708 2576 projected.go:194] Error preparing data for projected volume kube-api-access-mx5qt for pod openshift-network-diagnostics/network-check-target-plw6c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 23:12:02.801925 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:02.801774 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98-kube-api-access-mx5qt podName:4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:03.30175542 +0000 UTC m=+3.246464850 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-mx5qt" (UniqueName: "kubernetes.io/projected/4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98-kube-api-access-mx5qt") pod "network-check-target-plw6c" (UID: "4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 23:12:02.803826 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.803799 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-glclv\" (UniqueName: \"kubernetes.io/projected/97d5b486-1141-4ce1-b800-263ccf62a8cd-kube-api-access-glclv\") pod \"multus-additional-cni-plugins-qdvqj\" (UID: \"97d5b486-1141-4ce1-b800-263ccf62a8cd\") " pod="openshift-multus/multus-additional-cni-plugins-qdvqj" Apr 20 23:12:02.803967 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.803940 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldcrn\" (UniqueName: 
\"kubernetes.io/projected/99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0-kube-api-access-ldcrn\") pod \"tuned-jmbhc\" (UID: \"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0\") " pod="openshift-cluster-node-tuning-operator/tuned-jmbhc" Apr 20 23:12:02.804215 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.804199 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-csjcj\" (UniqueName: \"kubernetes.io/projected/21bb60a5-8c2e-4d57-bf71-b44519293c10-kube-api-access-csjcj\") pod \"iptables-alerter-mmw6l\" (UID: \"21bb60a5-8c2e-4d57-bf71-b44519293c10\") " pod="openshift-network-operator/iptables-alerter-mmw6l" Apr 20 23:12:02.804304 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.804279 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swxl6\" (UniqueName: \"kubernetes.io/projected/e6e1d353-f530-4ad5-a0ae-b436e227eb58-kube-api-access-swxl6\") pod \"network-metrics-daemon-rvb5h\" (UID: \"e6e1d353-f530-4ad5-a0ae-b436e227eb58\") " pod="openshift-multus/network-metrics-daemon-rvb5h" Apr 20 23:12:02.877841 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.877759 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jsb2w" Apr 20 23:12:02.888666 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.888642 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6675l" Apr 20 23:12:02.896679 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.896658 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9pvbx" Apr 20 23:12:02.904273 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.904253 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jnjdb" Apr 20 23:12:02.912892 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.912871 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-s6hls" Apr 20 23:12:02.919560 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.919541 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t59g" Apr 20 23:12:02.928216 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.928189 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qdvqj" Apr 20 23:12:02.936887 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.936848 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jmbhc" Apr 20 23:12:02.944637 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:02.944611 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-mmw6l" Apr 20 23:12:03.297291 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:03.296109 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6e1d353-f530-4ad5-a0ae-b436e227eb58-metrics-certs\") pod \"network-metrics-daemon-rvb5h\" (UID: \"e6e1d353-f530-4ad5-a0ae-b436e227eb58\") " pod="openshift-multus/network-metrics-daemon-rvb5h" Apr 20 23:12:03.297291 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:03.296269 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 23:12:03.297291 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:03.296367 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6e1d353-f530-4ad5-a0ae-b436e227eb58-metrics-certs podName:e6e1d353-f530-4ad5-a0ae-b436e227eb58 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:04.29634423 +0000 UTC m=+4.241053676 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e6e1d353-f530-4ad5-a0ae-b436e227eb58-metrics-certs") pod "network-metrics-daemon-rvb5h" (UID: "e6e1d353-f530-4ad5-a0ae-b436e227eb58") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 23:12:03.397178 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:03.397128 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mx5qt\" (UniqueName: \"kubernetes.io/projected/4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98-kube-api-access-mx5qt\") pod \"network-check-target-plw6c\" (UID: \"4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98\") " pod="openshift-network-diagnostics/network-check-target-plw6c" Apr 20 23:12:03.397324 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:03.397269 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 23:12:03.397324 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:03.397284 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 23:12:03.397324 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:03.397293 2576 projected.go:194] Error preparing data for projected volume kube-api-access-mx5qt for pod openshift-network-diagnostics/network-check-target-plw6c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 23:12:03.397476 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:03.397351 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98-kube-api-access-mx5qt podName:4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98 nodeName:}" failed. 
No retries permitted until 2026-04-20 23:12:04.397338906 +0000 UTC m=+4.342048335 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-mx5qt" (UniqueName: "kubernetes.io/projected/4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98-kube-api-access-mx5qt") pod "network-check-target-plw6c" (UID: "4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 23:12:03.487526 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:03.487308 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9aff2cd3_cf36_4053_81c6_5808527407bf.slice/crio-cbc7e86c35a11737d31831a97e30997184d9a322b8f7bdf2d02900c22f58caea WatchSource:0}: Error finding container cbc7e86c35a11737d31831a97e30997184d9a322b8f7bdf2d02900c22f58caea: Status 404 returned error can't find the container with id cbc7e86c35a11737d31831a97e30997184d9a322b8f7bdf2d02900c22f58caea Apr 20 23:12:03.494501 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:03.494460 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65a2a89a_c0bf_4140_bd21_e8249221ca05.slice/crio-e479866d4ae01de9753db2d459f08fd6769f18a7e55446fb887cce370f3f2e2e WatchSource:0}: Error finding container e479866d4ae01de9753db2d459f08fd6769f18a7e55446fb887cce370f3f2e2e: Status 404 returned error can't find the container with id e479866d4ae01de9753db2d459f08fd6769f18a7e55446fb887cce370f3f2e2e Apr 20 23:12:03.515006 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:03.514983 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7577b13a_1450_42fb_aa2f_4374c2a72406.slice/crio-0689e3832abd9c225236f79d1a32b56ccd67a2c91ca188d5a2e081c0e813d900 WatchSource:0}: Error finding container 
0689e3832abd9c225236f79d1a32b56ccd67a2c91ca188d5a2e081c0e813d900: Status 404 returned error can't find the container with id 0689e3832abd9c225236f79d1a32b56ccd67a2c91ca188d5a2e081c0e813d900 Apr 20 23:12:03.516536 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:03.516447 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99b18490_2395_4c7c_9ba0_f7c7b0ab7ee0.slice/crio-3844deef31c2b608579983f7137c575440ed615184a2342a1839a90d6e8d9157 WatchSource:0}: Error finding container 3844deef31c2b608579983f7137c575440ed615184a2342a1839a90d6e8d9157: Status 404 returned error can't find the container with id 3844deef31c2b608579983f7137c575440ed615184a2342a1839a90d6e8d9157 Apr 20 23:12:03.517134 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:03.517061 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc89ec650_7091_4fbe_a329_ba849fc5e589.slice/crio-3e155b72afc0cffd79632c0380f9e5b50a5ca25c964e39791bde363bbc652aa3 WatchSource:0}: Error finding container 3e155b72afc0cffd79632c0380f9e5b50a5ca25c964e39791bde363bbc652aa3: Status 404 returned error can't find the container with id 3e155b72afc0cffd79632c0380f9e5b50a5ca25c964e39791bde363bbc652aa3 Apr 20 23:12:03.518839 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:03.518742 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1652a387_e617_4579_bb1d_4fab03dacaed.slice/crio-34d15738e2bcac7c2f2973a69792d9741fc311b2f1e88ac84d1b2832ba72541d WatchSource:0}: Error finding container 34d15738e2bcac7c2f2973a69792d9741fc311b2f1e88ac84d1b2832ba72541d: Status 404 returned error can't find the container with id 34d15738e2bcac7c2f2973a69792d9741fc311b2f1e88ac84d1b2832ba72541d Apr 20 23:12:03.519421 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:03.519088 2576 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97d5b486_1141_4ce1_b800_263ccf62a8cd.slice/crio-312087531da985a1bdc9936b244470117d5294f61e296fb146cdc977b014b536 WatchSource:0}: Error finding container 312087531da985a1bdc9936b244470117d5294f61e296fb146cdc977b014b536: Status 404 returned error can't find the container with id 312087531da985a1bdc9936b244470117d5294f61e296fb146cdc977b014b536 Apr 20 23:12:03.519421 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:03.519336 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod392462d6_20a2_4842_bcbc_129ba91961ef.slice/crio-983033ac77204d52f801c1c199a19c2fd37b62640a897d7a76bb6a85314af215 WatchSource:0}: Error finding container 983033ac77204d52f801c1c199a19c2fd37b62640a897d7a76bb6a85314af215: Status 404 returned error can't find the container with id 983033ac77204d52f801c1c199a19c2fd37b62640a897d7a76bb6a85314af215 Apr 20 23:12:03.626245 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:03.626215 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 23:07:01 +0000 UTC" deadline="2027-10-26 15:56:29.575184673 +0000 UTC" Apr 20 23:12:03.626245 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:03.626242 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13288h44m25.948944763s" Apr 20 23:12:03.636599 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:03.636581 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rvb5h"
Apr 20 23:12:03.636694 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:03.636677 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rvb5h" podUID="e6e1d353-f530-4ad5-a0ae-b436e227eb58"
Apr 20 23:12:03.642894 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:03.642866 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jmbhc" event={"ID":"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0","Type":"ContainerStarted","Data":"3844deef31c2b608579983f7137c575440ed615184a2342a1839a90d6e8d9157"}
Apr 20 23:12:03.643795 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:03.643762 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jnjdb" event={"ID":"7577b13a-1450-42fb-aa2f-4374c2a72406","Type":"ContainerStarted","Data":"0689e3832abd9c225236f79d1a32b56ccd67a2c91ca188d5a2e081c0e813d900"}
Apr 20 23:12:03.644755 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:03.644734 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t59g" event={"ID":"9aff2cd3-cf36-4053-81c6-5808527407bf","Type":"ContainerStarted","Data":"cbc7e86c35a11737d31831a97e30997184d9a322b8f7bdf2d02900c22f58caea"}
Apr 20 23:12:03.646104 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:03.646084 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-139.ec2.internal" event={"ID":"ff565a32ff02ffe7c9262acb131f7f92","Type":"ContainerStarted","Data":"54075165a3e98d6981b50c74099133fd01d97722ed0fe65b57dd92a86899f633"}
Apr 20 23:12:03.646986 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:03.646968 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jsb2w" event={"ID":"392462d6-20a2-4842-bcbc-129ba91961ef","Type":"ContainerStarted","Data":"983033ac77204d52f801c1c199a19c2fd37b62640a897d7a76bb6a85314af215"}
Apr 20 23:12:03.647852 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:03.647833 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-s6hls" event={"ID":"1652a387-e617-4579-bb1d-4fab03dacaed","Type":"ContainerStarted","Data":"34d15738e2bcac7c2f2973a69792d9741fc311b2f1e88ac84d1b2832ba72541d"}
Apr 20 23:12:03.648688 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:03.648665 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdvqj" event={"ID":"97d5b486-1141-4ce1-b800-263ccf62a8cd","Type":"ContainerStarted","Data":"312087531da985a1bdc9936b244470117d5294f61e296fb146cdc977b014b536"}
Apr 20 23:12:03.649618 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:03.649594 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9pvbx" event={"ID":"c89ec650-7091-4fbe-a329-ba849fc5e589","Type":"ContainerStarted","Data":"3e155b72afc0cffd79632c0380f9e5b50a5ca25c964e39791bde363bbc652aa3"}
Apr 20 23:12:03.650453 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:03.650437 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6675l" event={"ID":"65a2a89a-c0bf-4140-bd21-e8249221ca05","Type":"ContainerStarted","Data":"e479866d4ae01de9753db2d459f08fd6769f18a7e55446fb887cce370f3f2e2e"}
Apr 20 23:12:03.651251 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:03.651234 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-mmw6l" event={"ID":"21bb60a5-8c2e-4d57-bf71-b44519293c10","Type":"ContainerStarted","Data":"a5db98e3bee7b89496f713b20ccf9e5aa4f565adc697c08f0650e79179d42e13"}
Apr 20 23:12:03.659634 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:03.659596 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-139.ec2.internal" podStartSLOduration=2.659585591 podStartE2EDuration="2.659585591s" podCreationTimestamp="2026-04-20 23:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 23:12:03.659569364 +0000 UTC m=+3.604278815" watchObservedRunningTime="2026-04-20 23:12:03.659585591 +0000 UTC m=+3.604295041"
Apr 20 23:12:04.304801 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:04.304255 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6e1d353-f530-4ad5-a0ae-b436e227eb58-metrics-certs\") pod \"network-metrics-daemon-rvb5h\" (UID: \"e6e1d353-f530-4ad5-a0ae-b436e227eb58\") " pod="openshift-multus/network-metrics-daemon-rvb5h"
Apr 20 23:12:04.304801 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:04.304418 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 23:12:04.304801 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:04.304478 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6e1d353-f530-4ad5-a0ae-b436e227eb58-metrics-certs podName:e6e1d353-f530-4ad5-a0ae-b436e227eb58 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:06.304458637 +0000 UTC m=+6.249168071 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e6e1d353-f530-4ad5-a0ae-b436e227eb58-metrics-certs") pod "network-metrics-daemon-rvb5h" (UID: "e6e1d353-f530-4ad5-a0ae-b436e227eb58") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 23:12:04.405274 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:04.405234 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mx5qt\" (UniqueName: \"kubernetes.io/projected/4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98-kube-api-access-mx5qt\") pod \"network-check-target-plw6c\" (UID: \"4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98\") " pod="openshift-network-diagnostics/network-check-target-plw6c"
Apr 20 23:12:04.405440 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:04.405430 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 23:12:04.405514 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:04.405450 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 23:12:04.405514 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:04.405464 2576 projected.go:194] Error preparing data for projected volume kube-api-access-mx5qt for pod openshift-network-diagnostics/network-check-target-plw6c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 23:12:04.405619 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:04.405531 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98-kube-api-access-mx5qt podName:4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:06.405511804 +0000 UTC m=+6.350221256 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-mx5qt" (UniqueName: "kubernetes.io/projected/4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98-kube-api-access-mx5qt") pod "network-check-target-plw6c" (UID: "4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 23:12:04.638362 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:04.638196 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-plw6c"
Apr 20 23:12:04.638362 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:04.638334 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-plw6c" podUID="4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98"
Apr 20 23:12:04.664860 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:04.664816 2576 generic.go:358] "Generic (PLEG): container finished" podID="a4dcc3e36b50576b6bf701bd98ded776" containerID="0f211ce49d1f6bb74f37277f9065ab2a3773e6609c1fd63559585ce1b0089013" exitCode=0
Apr 20 23:12:04.665760 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:04.665730 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-139.ec2.internal" event={"ID":"a4dcc3e36b50576b6bf701bd98ded776","Type":"ContainerDied","Data":"0f211ce49d1f6bb74f37277f9065ab2a3773e6609c1fd63559585ce1b0089013"}
Apr 20 23:12:05.636515 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:05.636472 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rvb5h"
Apr 20 23:12:05.636701 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:05.636612 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rvb5h" podUID="e6e1d353-f530-4ad5-a0ae-b436e227eb58"
Apr 20 23:12:05.686044 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:05.685991 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-139.ec2.internal" event={"ID":"a4dcc3e36b50576b6bf701bd98ded776","Type":"ContainerStarted","Data":"640a1e8e0d7cbb024aa241df96205104a796388e9ca59991110aa901b1c99422"}
Apr 20 23:12:05.700936 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:05.699966 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-139.ec2.internal" podStartSLOduration=4.699949318 podStartE2EDuration="4.699949318s" podCreationTimestamp="2026-04-20 23:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 23:12:05.699768864 +0000 UTC m=+5.644478319" watchObservedRunningTime="2026-04-20 23:12:05.699949318 +0000 UTC m=+5.644658769"
Apr 20 23:12:06.320454 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:06.320408 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6e1d353-f530-4ad5-a0ae-b436e227eb58-metrics-certs\") pod \"network-metrics-daemon-rvb5h\" (UID: \"e6e1d353-f530-4ad5-a0ae-b436e227eb58\") " pod="openshift-multus/network-metrics-daemon-rvb5h"
Apr 20 23:12:06.320655 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:06.320569 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 23:12:06.320655 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:06.320635 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6e1d353-f530-4ad5-a0ae-b436e227eb58-metrics-certs podName:e6e1d353-f530-4ad5-a0ae-b436e227eb58 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:10.320615457 +0000 UTC m=+10.265324888 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e6e1d353-f530-4ad5-a0ae-b436e227eb58-metrics-certs") pod "network-metrics-daemon-rvb5h" (UID: "e6e1d353-f530-4ad5-a0ae-b436e227eb58") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 23:12:06.421980 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:06.421382 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mx5qt\" (UniqueName: \"kubernetes.io/projected/4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98-kube-api-access-mx5qt\") pod \"network-check-target-plw6c\" (UID: \"4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98\") " pod="openshift-network-diagnostics/network-check-target-plw6c"
Apr 20 23:12:06.421980 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:06.421547 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 23:12:06.421980 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:06.421566 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 23:12:06.421980 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:06.421579 2576 projected.go:194] Error preparing data for projected volume kube-api-access-mx5qt for pod openshift-network-diagnostics/network-check-target-plw6c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 23:12:06.421980 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:06.421636 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98-kube-api-access-mx5qt podName:4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:10.421619195 +0000 UTC m=+10.366328639 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-mx5qt" (UniqueName: "kubernetes.io/projected/4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98-kube-api-access-mx5qt") pod "network-check-target-plw6c" (UID: "4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 23:12:06.636992 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:06.636919 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-plw6c"
Apr 20 23:12:06.637163 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:06.637033 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-plw6c" podUID="4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98"
Apr 20 23:12:07.636770 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:07.636737 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rvb5h"
Apr 20 23:12:07.637180 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:07.636882 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rvb5h" podUID="e6e1d353-f530-4ad5-a0ae-b436e227eb58"
Apr 20 23:12:08.636296 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:08.636255 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-plw6c"
Apr 20 23:12:08.636468 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:08.636397 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-plw6c" podUID="4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98"
Apr 20 23:12:09.636582 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:09.636540 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rvb5h"
Apr 20 23:12:09.637034 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:09.636683 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rvb5h" podUID="e6e1d353-f530-4ad5-a0ae-b436e227eb58"
Apr 20 23:12:10.360290 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:10.360248 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6e1d353-f530-4ad5-a0ae-b436e227eb58-metrics-certs\") pod \"network-metrics-daemon-rvb5h\" (UID: \"e6e1d353-f530-4ad5-a0ae-b436e227eb58\") " pod="openshift-multus/network-metrics-daemon-rvb5h"
Apr 20 23:12:10.360492 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:10.360471 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 23:12:10.360579 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:10.360548 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6e1d353-f530-4ad5-a0ae-b436e227eb58-metrics-certs podName:e6e1d353-f530-4ad5-a0ae-b436e227eb58 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:18.360525387 +0000 UTC m=+18.305234818 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e6e1d353-f530-4ad5-a0ae-b436e227eb58-metrics-certs") pod "network-metrics-daemon-rvb5h" (UID: "e6e1d353-f530-4ad5-a0ae-b436e227eb58") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 23:12:10.460986 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:10.460947 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mx5qt\" (UniqueName: \"kubernetes.io/projected/4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98-kube-api-access-mx5qt\") pod \"network-check-target-plw6c\" (UID: \"4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98\") " pod="openshift-network-diagnostics/network-check-target-plw6c"
Apr 20 23:12:10.461164 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:10.461116 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 23:12:10.461164 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:10.461147 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 23:12:10.461164 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:10.461161 2576 projected.go:194] Error preparing data for projected volume kube-api-access-mx5qt for pod openshift-network-diagnostics/network-check-target-plw6c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 23:12:10.461301 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:10.461226 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98-kube-api-access-mx5qt podName:4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:18.461207687 +0000 UTC m=+18.405917121 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-mx5qt" (UniqueName: "kubernetes.io/projected/4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98-kube-api-access-mx5qt") pod "network-check-target-plw6c" (UID: "4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 23:12:10.645307 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:10.645232 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-plw6c"
Apr 20 23:12:10.645728 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:10.645377 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-plw6c" podUID="4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98"
Apr 20 23:12:11.637496 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:11.637354 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rvb5h"
Apr 20 23:12:11.637496 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:11.637489 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rvb5h" podUID="e6e1d353-f530-4ad5-a0ae-b436e227eb58"
Apr 20 23:12:12.636087 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:12.636047 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-plw6c"
Apr 20 23:12:12.636543 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:12.636170 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-plw6c" podUID="4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98"
Apr 20 23:12:13.635966 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:13.635935 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rvb5h"
Apr 20 23:12:13.636183 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:13.636073 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rvb5h" podUID="e6e1d353-f530-4ad5-a0ae-b436e227eb58"
Apr 20 23:12:14.636644 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:14.636614 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-plw6c"
Apr 20 23:12:14.637072 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:14.636738 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-plw6c" podUID="4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98"
Apr 20 23:12:15.635825 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:15.635793 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rvb5h"
Apr 20 23:12:15.636000 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:15.635908 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rvb5h" podUID="e6e1d353-f530-4ad5-a0ae-b436e227eb58"
Apr 20 23:12:16.636711 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:16.636625 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-plw6c"
Apr 20 23:12:16.637096 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:16.636764 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-plw6c" podUID="4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98"
Apr 20 23:12:17.636669 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:17.636638 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rvb5h"
Apr 20 23:12:17.636872 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:17.636744 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rvb5h" podUID="e6e1d353-f530-4ad5-a0ae-b436e227eb58"
Apr 20 23:12:18.417481 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:18.417412 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6e1d353-f530-4ad5-a0ae-b436e227eb58-metrics-certs\") pod \"network-metrics-daemon-rvb5h\" (UID: \"e6e1d353-f530-4ad5-a0ae-b436e227eb58\") " pod="openshift-multus/network-metrics-daemon-rvb5h"
Apr 20 23:12:18.417704 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:18.417616 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 23:12:18.417768 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:18.417714 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6e1d353-f530-4ad5-a0ae-b436e227eb58-metrics-certs podName:e6e1d353-f530-4ad5-a0ae-b436e227eb58 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:34.417691603 +0000 UTC m=+34.362401044 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e6e1d353-f530-4ad5-a0ae-b436e227eb58-metrics-certs") pod "network-metrics-daemon-rvb5h" (UID: "e6e1d353-f530-4ad5-a0ae-b436e227eb58") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 23:12:18.518190 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:18.518153 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mx5qt\" (UniqueName: \"kubernetes.io/projected/4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98-kube-api-access-mx5qt\") pod \"network-check-target-plw6c\" (UID: \"4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98\") " pod="openshift-network-diagnostics/network-check-target-plw6c"
Apr 20 23:12:18.518370 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:18.518342 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 23:12:18.518370 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:18.518368 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 23:12:18.518559 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:18.518381 2576 projected.go:194] Error preparing data for projected volume kube-api-access-mx5qt for pod openshift-network-diagnostics/network-check-target-plw6c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 23:12:18.518559 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:18.518440 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98-kube-api-access-mx5qt podName:4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:34.518424994 +0000 UTC m=+34.463134425 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-mx5qt" (UniqueName: "kubernetes.io/projected/4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98-kube-api-access-mx5qt") pod "network-check-target-plw6c" (UID: "4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 23:12:18.636403 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:18.636361 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-plw6c"
Apr 20 23:12:18.636580 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:18.636495 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-plw6c" podUID="4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98"
Apr 20 23:12:19.635867 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:19.635833 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rvb5h"
Apr 20 23:12:19.636351 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:19.635973 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rvb5h" podUID="e6e1d353-f530-4ad5-a0ae-b436e227eb58"
Apr 20 23:12:20.636472 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:20.636446 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-plw6c"
Apr 20 23:12:20.636872 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:20.636530 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-plw6c" podUID="4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98"
Apr 20 23:12:21.636695 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:21.636369 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rvb5h"
Apr 20 23:12:21.637300 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:21.636795 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rvb5h" podUID="e6e1d353-f530-4ad5-a0ae-b436e227eb58"
Apr 20 23:12:21.716638 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:21.716560 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jsb2w" event={"ID":"392462d6-20a2-4842-bcbc-129ba91961ef","Type":"ContainerStarted","Data":"095b58975d5732ee77abddc854eba7939d74bbfbd65ff4bdfb862114f5dadc2c"}
Apr 20 23:12:21.718059 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:21.718028 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-s6hls" event={"ID":"1652a387-e617-4579-bb1d-4fab03dacaed","Type":"ContainerStarted","Data":"76f465a3263125023a344623baecfb2a5d0ca8e8f426e896d5e28bdc4692cc17"}
Apr 20 23:12:21.719468 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:21.719441 2576 generic.go:358] "Generic (PLEG): container finished" podID="97d5b486-1141-4ce1-b800-263ccf62a8cd" containerID="ddd5d11e3eac70e1a5f4889a5c50093ec0b5172bdf7f0dba13e96f10d5888140" exitCode=0
Apr 20 23:12:21.719585 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:21.719512 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdvqj" event={"ID":"97d5b486-1141-4ce1-b800-263ccf62a8cd","Type":"ContainerDied","Data":"ddd5d11e3eac70e1a5f4889a5c50093ec0b5172bdf7f0dba13e96f10d5888140"}
Apr 20 23:12:21.720969 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:21.720932 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9pvbx" event={"ID":"c89ec650-7091-4fbe-a329-ba849fc5e589","Type":"ContainerStarted","Data":"dcdc566f8764d030c7f9b1f09df52c8383d18b17aa39515cff87e5ea0c65dedc"}
Apr 20 23:12:21.723700 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:21.723682 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6675l_65a2a89a-c0bf-4140-bd21-e8249221ca05/ovn-acl-logging/0.log"
Apr 20 23:12:21.724012 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:21.723990 2576 generic.go:358] "Generic (PLEG): container finished" podID="65a2a89a-c0bf-4140-bd21-e8249221ca05" containerID="91142a8eaad0b9431da22d453bcf5fc400b680c7b8e4097344e0f6c313232b59" exitCode=1
Apr 20 23:12:21.724096 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:21.724057 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6675l" event={"ID":"65a2a89a-c0bf-4140-bd21-e8249221ca05","Type":"ContainerStarted","Data":"b9ae33e98ddaa2a8b0747794ce5a1d7680bf0fdc65d7cc7604dce74ec5d68e60"}
Apr 20 23:12:21.724096 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:21.724085 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6675l" event={"ID":"65a2a89a-c0bf-4140-bd21-e8249221ca05","Type":"ContainerStarted","Data":"861b7b52bc502ed291e594bad46371479239fc0310d1a2d03accb7c58e6dc263"}
Apr 20 23:12:21.724211 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:21.724098 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6675l" event={"ID":"65a2a89a-c0bf-4140-bd21-e8249221ca05","Type":"ContainerStarted","Data":"82618b2968851abc4cd88e335945e318c4e0d7dd45f03f80a253246b69692af1"}
Apr 20 23:12:21.724211 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:21.724109 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6675l" event={"ID":"65a2a89a-c0bf-4140-bd21-e8249221ca05","Type":"ContainerStarted","Data":"5dd2c7e05a93c20c2d76351c5a4f74b905fb3493bf89bbff604502586331c554"}
Apr 20 23:12:21.724211 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:21.724122 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6675l" event={"ID":"65a2a89a-c0bf-4140-bd21-e8249221ca05","Type":"ContainerDied","Data":"91142a8eaad0b9431da22d453bcf5fc400b680c7b8e4097344e0f6c313232b59"}
Apr 20 23:12:21.724211 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:21.724154 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6675l" event={"ID":"65a2a89a-c0bf-4140-bd21-e8249221ca05","Type":"ContainerStarted","Data":"667e44b57123d81a9d4da68260b97efd0d4f603edb92efdbd973673966ced1f9"}
Apr 20 23:12:21.725371 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:21.725350 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jmbhc" event={"ID":"99b18490-2395-4c7c-9ba0-f7c7b0ab7ee0","Type":"ContainerStarted","Data":"c89087194803bbb0549cedfbde8cfc300d0f3ba281247d0cd09dbccb690ff48d"}
Apr 20 23:12:21.726795 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:21.726774 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jnjdb" event={"ID":"7577b13a-1450-42fb-aa2f-4374c2a72406","Type":"ContainerStarted","Data":"bf71277eaf0f7ca10c2b0ddf0333a80b763811627791b4d538e68d63abd6eb6f"}
Apr 20 23:12:21.728168 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:21.728124 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t59g" event={"ID":"9aff2cd3-cf36-4053-81c6-5808527407bf","Type":"ContainerStarted","Data":"adb14d30841333c1ad44f518c379f785f5f89c181e53caf7c918bf9350cb34a3"}
Apr 20 23:12:21.728698 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:21.728664 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-jsb2w" podStartSLOduration=4.395146286 podStartE2EDuration="21.728653395s" podCreationTimestamp="2026-04-20 23:12:00 +0000 UTC" firstStartedPulling="2026-04-20 23:12:03.522870115 +0000 UTC m=+3.467579563" lastFinishedPulling="2026-04-20 23:12:20.856377235 +0000 UTC m=+20.801086672" observedRunningTime="2026-04-20 23:12:21.728208421 +0000 UTC m=+21.672917873" watchObservedRunningTime="2026-04-20 23:12:21.728653395 +0000 UTC m=+21.673362846"
Apr 20 23:12:21.741414 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:21.741378 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-jmbhc" podStartSLOduration=3.401186476 podStartE2EDuration="20.741368525s" podCreationTimestamp="2026-04-20 23:12:01 +0000 UTC" firstStartedPulling="2026-04-20 23:12:03.518971549 +0000 UTC m=+3.463680992" lastFinishedPulling="2026-04-20 23:12:20.859153602 +0000 UTC m=+20.803863041" observedRunningTime="2026-04-20 23:12:21.740914774 +0000 UTC m=+21.685624224" watchObservedRunningTime="2026-04-20 23:12:21.741368525 +0000 UTC m=+21.686077991"
Apr 20 23:12:21.757424 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:21.757382 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-jnjdb" podStartSLOduration=4.357734503 podStartE2EDuration="21.757372564s" podCreationTimestamp="2026-04-20 23:12:00 +0000 UTC" firstStartedPulling="2026-04-20 23:12:03.516702975 +0000 UTC m=+3.461412409" lastFinishedPulling="2026-04-20 23:12:20.916341035 +0000 UTC m=+20.861050470" observedRunningTime="2026-04-20 23:12:21.757208791 +0000 UTC m=+21.701918253" watchObservedRunningTime="2026-04-20 23:12:21.757372564 +0000 UTC m=+21.702082015"
Apr 20 23:12:21.782682 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:21.782645 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-s6hls" podStartSLOduration=12.363780624 podStartE2EDuration="21.782632931s" podCreationTimestamp="2026-04-20 23:12:00 +0000 UTC" firstStartedPulling="2026-04-20 23:12:03.52259257 +0000 UTC m=+3.467302014" lastFinishedPulling="2026-04-20 23:12:12.94144488 +0000 UTC m=+12.886154321" observedRunningTime="2026-04-20 23:12:21.782342016 +0000 UTC m=+21.727051467" watchObservedRunningTime="2026-04-20 23:12:21.782632931 +0000 UTC m=+21.727342382"
Apr 20 23:12:21.844941 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:21.844890 2576 pod_startup_latency_tracker.go:104]
"Observed pod startup duration" pod="kube-system/konnectivity-agent-9pvbx" podStartSLOduration=4.674620799 podStartE2EDuration="21.844871671s" podCreationTimestamp="2026-04-20 23:12:00 +0000 UTC" firstStartedPulling="2026-04-20 23:12:03.522469291 +0000 UTC m=+3.467178735" lastFinishedPulling="2026-04-20 23:12:20.692720173 +0000 UTC m=+20.637429607" observedRunningTime="2026-04-20 23:12:21.844786063 +0000 UTC m=+21.789495633" watchObservedRunningTime="2026-04-20 23:12:21.844871671 +0000 UTC m=+21.789581123" Apr 20 23:12:21.961336 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:21.961313 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 23:12:22.636061 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:22.635984 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-plw6c" Apr 20 23:12:22.636238 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:22.636109 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-plw6c" podUID="4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98"
Apr 20 23:12:22.648710 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:22.648605 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T23:12:21.96132949Z","UUID":"82d39799-c819-49e3-9ec2-32f377dfd7ea","Handler":null,"Name":"","Endpoint":""}
Apr 20 23:12:22.652191 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:22.651444 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 20 23:12:22.652191 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:22.652164 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 20 23:12:22.731038 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:22.731001 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-mmw6l" event={"ID":"21bb60a5-8c2e-4d57-bf71-b44519293c10","Type":"ContainerStarted","Data":"7ba87a0c975c2a101bbadf17f9b7caf552af856c2ee3aaa65120179451fc18e7"}
Apr 20 23:12:22.732811 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:22.732783 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t59g" event={"ID":"9aff2cd3-cf36-4053-81c6-5808527407bf","Type":"ContainerStarted","Data":"7b9d0de335d476b08b4410e1a16dbce7dc12a61da148894fc441d47ed70d2b9e"}
Apr 20 23:12:22.753209 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:22.753168 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-mmw6l" podStartSLOduration=4.390187029 podStartE2EDuration="21.753153367s" podCreationTimestamp="2026-04-20 23:12:01 +0000 UTC" firstStartedPulling="2026-04-20 23:12:03.493430308 +0000 UTC m=+3.438139745" lastFinishedPulling="2026-04-20 23:12:20.856396641 +0000 UTC m=+20.801106083" observedRunningTime="2026-04-20 23:12:22.752931374 +0000 UTC m=+22.697640825" watchObservedRunningTime="2026-04-20 23:12:22.753153367 +0000 UTC m=+22.697862809"
Apr 20 23:12:23.238406 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:23.238180 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-9pvbx"
Apr 20 23:12:23.238760 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:23.238737 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-9pvbx"
Apr 20 23:12:23.636405 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:23.636333 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rvb5h"
Apr 20 23:12:23.636612 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:23.636447 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rvb5h" podUID="e6e1d353-f530-4ad5-a0ae-b436e227eb58"
Apr 20 23:12:23.737969 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:23.737941 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6675l_65a2a89a-c0bf-4140-bd21-e8249221ca05/ovn-acl-logging/0.log"
Apr 20 23:12:23.738432 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:23.738386 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6675l" event={"ID":"65a2a89a-c0bf-4140-bd21-e8249221ca05","Type":"ContainerStarted","Data":"f7a9abf5a91455e43b88e0191415b1df11439f2f285ea01341d22a51f36db598"}
Apr 20 23:12:23.740223 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:23.740188 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t59g" event={"ID":"9aff2cd3-cf36-4053-81c6-5808527407bf","Type":"ContainerStarted","Data":"119ea984d344bc9cbb8341e935fcc902a5434d9e8eadbc91604c184cd94a6551"}
Apr 20 23:12:23.740668 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:23.740631 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-9pvbx"
Apr 20 23:12:23.741052 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:23.741033 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-9pvbx"
Apr 20 23:12:23.757227 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:23.757189 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8t59g" podStartSLOduration=4.499948482 podStartE2EDuration="23.757176004s" podCreationTimestamp="2026-04-20 23:12:00 +0000 UTC" firstStartedPulling="2026-04-20 23:12:03.49262947 +0000 UTC m=+3.437338899" lastFinishedPulling="2026-04-20 23:12:22.749856989 +0000 UTC m=+22.694566421" observedRunningTime="2026-04-20 23:12:23.75712486 +0000 UTC m=+23.701834311" watchObservedRunningTime="2026-04-20 23:12:23.757176004 +0000 UTC m=+23.701885454"
Apr 20 23:12:24.636677 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:24.636640 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-plw6c"
Apr 20 23:12:24.636880 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:24.636776 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-plw6c" podUID="4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98"
Apr 20 23:12:25.636278 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:25.636241 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rvb5h"
Apr 20 23:12:25.636904 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:25.636381 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rvb5h" podUID="e6e1d353-f530-4ad5-a0ae-b436e227eb58"
Apr 20 23:12:26.636214 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:26.635998 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-plw6c"
Apr 20 23:12:26.636364 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:26.636292 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-plw6c" podUID="4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98"
Apr 20 23:12:26.747394 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:26.747363 2576 generic.go:358] "Generic (PLEG): container finished" podID="97d5b486-1141-4ce1-b800-263ccf62a8cd" containerID="c7b1c953950e4ddc629c1f6a374160e3895f7fc2958e638258f8c23de8cc9f3b" exitCode=0
Apr 20 23:12:26.747573 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:26.747434 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdvqj" event={"ID":"97d5b486-1141-4ce1-b800-263ccf62a8cd","Type":"ContainerDied","Data":"c7b1c953950e4ddc629c1f6a374160e3895f7fc2958e638258f8c23de8cc9f3b"}
Apr 20 23:12:26.750471 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:26.750431 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6675l_65a2a89a-c0bf-4140-bd21-e8249221ca05/ovn-acl-logging/0.log"
Apr 20 23:12:26.750809 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:26.750788 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6675l" event={"ID":"65a2a89a-c0bf-4140-bd21-e8249221ca05","Type":"ContainerStarted","Data":"7516f7acf2d17aa544bae5c844c945d7d9f46dc06e6ceb1235abcfc2d18c7d53"}
Apr 20 23:12:26.751156 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:26.751116 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:26.751257 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:26.751159 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:26.751257 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:26.751175 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:26.751328 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:26.751311 2576 scope.go:117] "RemoveContainer" containerID="91142a8eaad0b9431da22d453bcf5fc400b680c7b8e4097344e0f6c313232b59"
Apr 20 23:12:26.766808 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:26.766774 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:26.767306 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:26.767287 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:12:27.636837 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:27.636811 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rvb5h"
Apr 20 23:12:27.637271 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:27.636918 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rvb5h" podUID="e6e1d353-f530-4ad5-a0ae-b436e227eb58"
Apr 20 23:12:27.737011 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:27.736982 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rvb5h"]
Apr 20 23:12:27.748619 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:27.748593 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-plw6c"]
Apr 20 23:12:27.748752 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:27.748739 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-plw6c"
Apr 20 23:12:27.748881 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:27.748857 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-plw6c" podUID="4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98"
Apr 20 23:12:27.757041 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:27.757015 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6675l_65a2a89a-c0bf-4140-bd21-e8249221ca05/ovn-acl-logging/0.log"
Apr 20 23:12:27.757464 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:27.757441 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rvb5h"
Apr 20 23:12:27.757564 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:27.757449 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6675l" event={"ID":"65a2a89a-c0bf-4140-bd21-e8249221ca05","Type":"ContainerStarted","Data":"a2eae8a08ed36654a70525457cd04447369c409fd339320dc3ddce435390939c"}
Apr 20 23:12:27.757630 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:27.757565 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rvb5h" podUID="e6e1d353-f530-4ad5-a0ae-b436e227eb58"
Apr 20 23:12:27.788065 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:27.788016 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6675l" podStartSLOduration=10.379270828 podStartE2EDuration="27.787998097s" podCreationTimestamp="2026-04-20 23:12:00 +0000 UTC" firstStartedPulling="2026-04-20 23:12:03.514014781 +0000 UTC m=+3.458724225" lastFinishedPulling="2026-04-20 23:12:20.922742065 +0000 UTC m=+20.867451494" observedRunningTime="2026-04-20 23:12:27.786702027 +0000 UTC m=+27.731411478" watchObservedRunningTime="2026-04-20 23:12:27.787998097 +0000 UTC m=+27.732707551"
Apr 20 23:12:28.761259 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:28.761224 2576 generic.go:358] "Generic (PLEG): container finished" podID="97d5b486-1141-4ce1-b800-263ccf62a8cd" containerID="c594d6dd11ff0d048f02337e00ae3d4b68e9aeda60ad7a7d717a419d1daecec9" exitCode=0
Apr 20 23:12:28.761713 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:28.761305 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdvqj" event={"ID":"97d5b486-1141-4ce1-b800-263ccf62a8cd","Type":"ContainerDied","Data":"c594d6dd11ff0d048f02337e00ae3d4b68e9aeda60ad7a7d717a419d1daecec9"}
Apr 20 23:12:29.636510 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:29.636485 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-plw6c"
Apr 20 23:12:29.636645 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:29.636487 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rvb5h"
Apr 20 23:12:29.636645 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:29.636588 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-plw6c" podUID="4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98"
Apr 20 23:12:29.639519 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:29.636699 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rvb5h" podUID="e6e1d353-f530-4ad5-a0ae-b436e227eb58"
Apr 20 23:12:29.766426 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:29.766239 2576 generic.go:358] "Generic (PLEG): container finished" podID="97d5b486-1141-4ce1-b800-263ccf62a8cd" containerID="3da100360897f742355baa500bc0e7fa0b3114bfe39e72ac106c9f31f0df90c6" exitCode=0
Apr 20 23:12:29.766426 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:29.766299 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdvqj" event={"ID":"97d5b486-1141-4ce1-b800-263ccf62a8cd","Type":"ContainerDied","Data":"3da100360897f742355baa500bc0e7fa0b3114bfe39e72ac106c9f31f0df90c6"}
Apr 20 23:12:31.636693 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:31.636661 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-plw6c"
Apr 20 23:12:31.637271 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:31.636661 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rvb5h"
Apr 20 23:12:31.637271 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:31.636790 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-plw6c" podUID="4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98"
Apr 20 23:12:31.637271 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:31.636896 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rvb5h" podUID="e6e1d353-f530-4ad5-a0ae-b436e227eb58"
Apr 20 23:12:32.859695 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.859600 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-139.ec2.internal" event="NodeReady"
Apr 20 23:12:32.860310 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.859755 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 20 23:12:32.895715 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.895675 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9hxw9"]
Apr 20 23:12:32.936177 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.936105 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6c46bdb5c9-xflgd"]
Apr 20 23:12:32.936493 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.936466 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9hxw9"
Apr 20 23:12:32.939907 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.939461 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 20 23:12:32.939907 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.939495 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-d44v4\""
Apr 20 23:12:32.939907 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.939499 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 20 23:12:32.939907 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.939518 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 20 23:12:32.939907 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.939467 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 20 23:12:32.961604 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.961576 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xg99l"]
Apr 20 23:12:32.961769 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.961754 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd"
Apr 20 23:12:32.964330 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.964294 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 20 23:12:32.964330 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.964306 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 20 23:12:32.964488 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.964431 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-xgs9r\""
Apr 20 23:12:32.964569 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.964526 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 20 23:12:32.969225 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.969205 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 20 23:12:32.974196 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.974177 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9hxw9"]
Apr 20 23:12:32.974299 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.974203 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-kj9pq"]
Apr 20 23:12:32.974343 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.974319 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xg99l"
Apr 20 23:12:32.976726 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.976690 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 20 23:12:32.976836 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.976751 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 20 23:12:32.976836 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.976766 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 20 23:12:32.977007 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.976981 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 20 23:12:32.977103 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.977034 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-vtsmm\""
Apr 20 23:12:32.994724 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.994704 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-h242g"]
Apr 20 23:12:32.994899 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.994872 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kj9pq"
Apr 20 23:12:32.997207 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.997185 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 23:12:32.997411 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.997393 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-4x6jk\""
Apr 20 23:12:32.997522 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:32.997499 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 23:12:33.013928 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.013838 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6c46bdb5c9-xflgd"]
Apr 20 23:12:33.014065 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.013944 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-kj9pq"]
Apr 20 23:12:33.014065 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.013962 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-h242g"]
Apr 20 23:12:33.014065 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.013964 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-h242g"
Apr 20 23:12:33.014065 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.013975 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xg99l"]
Apr 20 23:12:33.016538 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.016430 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 20 23:12:33.016538 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.016498 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-8ntsk\""
Apr 20 23:12:33.016538 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.016497 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 20 23:12:33.027809 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.027788 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5f05570-1fb6-4cf8-b42b-05a9a4e6ac9e-config\") pod \"service-ca-operator-d6fc45fc5-9hxw9\" (UID: \"d5f05570-1fb6-4cf8-b42b-05a9a4e6ac9e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9hxw9"
Apr 20 23:12:33.027916 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.027849 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5f05570-1fb6-4cf8-b42b-05a9a4e6ac9e-serving-cert\") pod \"service-ca-operator-d6fc45fc5-9hxw9\" (UID: \"d5f05570-1fb6-4cf8-b42b-05a9a4e6ac9e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9hxw9"
Apr 20 23:12:33.027916 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.027874 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdp49\" (UniqueName: \"kubernetes.io/projected/d5f05570-1fb6-4cf8-b42b-05a9a4e6ac9e-kube-api-access-zdp49\") pod \"service-ca-operator-d6fc45fc5-9hxw9\" (UID: \"d5f05570-1fb6-4cf8-b42b-05a9a4e6ac9e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9hxw9"
Apr 20 23:12:33.130004 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.128505 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-h242g\" (UID: \"b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h242g"
Apr 20 23:12:33.130004 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.128767 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl8bb\" (UniqueName: \"kubernetes.io/projected/b305831b-81a2-4f94-9080-df8443be7ee7-kube-api-access-vl8bb\") pod \"network-check-source-8894fc9bd-kj9pq\" (UID: \"b305831b-81a2-4f94-9080-df8443be7ee7\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kj9pq"
Apr 20 23:12:33.130004 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.128813 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e672b75e-249e-4dd5-928c-641987109f81-image-registry-private-configuration\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd"
Apr 20 23:12:33.130004 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.128901 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlclr\" (UniqueName: \"kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-kube-api-access-rlclr\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd"
Apr 20 23:12:33.130004 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.128981 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e672b75e-249e-4dd5-928c-641987109f81-registry-certificates\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd"
Apr 20 23:12:33.130004 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.129026 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e672b75e-249e-4dd5-928c-641987109f81-trusted-ca\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd"
Apr 20 23:12:33.130004 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.129094 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5f05570-1fb6-4cf8-b42b-05a9a4e6ac9e-serving-cert\") pod \"service-ca-operator-d6fc45fc5-9hxw9\" (UID: \"d5f05570-1fb6-4cf8-b42b-05a9a4e6ac9e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9hxw9"
Apr 20 23:12:33.130004 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.129127 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdp49\" (UniqueName: \"kubernetes.io/projected/d5f05570-1fb6-4cf8-b42b-05a9a4e6ac9e-kube-api-access-zdp49\") pod \"service-ca-operator-d6fc45fc5-9hxw9\" (UID:
\"d5f05570-1fb6-4cf8-b42b-05a9a4e6ac9e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9hxw9" Apr 20 23:12:33.130004 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.129183 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa5675ad-e501-4f6b-a732-4d1db6454f9a-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-xg99l\" (UID: \"fa5675ad-e501-4f6b-a732-4d1db6454f9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xg99l" Apr 20 23:12:33.130004 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.129238 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5f05570-1fb6-4cf8-b42b-05a9a4e6ac9e-config\") pod \"service-ca-operator-d6fc45fc5-9hxw9\" (UID: \"d5f05570-1fb6-4cf8-b42b-05a9a4e6ac9e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9hxw9" Apr 20 23:12:33.130004 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.129270 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-bound-sa-token\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" Apr 20 23:12:33.130004 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.129306 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e672b75e-249e-4dd5-928c-641987109f81-ca-trust-extracted\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" Apr 20 23:12:33.130004 ip-10-0-137-139 
kubenswrapper[2576]: I0420 23:12:33.129333 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e672b75e-249e-4dd5-928c-641987109f81-installation-pull-secrets\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" Apr 20 23:12:33.130004 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.129401 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa5675ad-e501-4f6b-a732-4d1db6454f9a-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-xg99l\" (UID: \"fa5675ad-e501-4f6b-a732-4d1db6454f9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xg99l" Apr 20 23:12:33.130767 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.129430 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp2f7\" (UniqueName: \"kubernetes.io/projected/fa5675ad-e501-4f6b-a732-4d1db6454f9a-kube-api-access-hp2f7\") pod \"kube-storage-version-migrator-operator-6769c5d45-xg99l\" (UID: \"fa5675ad-e501-4f6b-a732-4d1db6454f9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xg99l" Apr 20 23:12:33.130767 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.129462 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-h242g\" (UID: \"b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h242g" Apr 20 23:12:33.130767 ip-10-0-137-139 
kubenswrapper[2576]: I0420 23:12:33.129492 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-registry-tls\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" Apr 20 23:12:33.132711 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.132687 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5f05570-1fb6-4cf8-b42b-05a9a4e6ac9e-config\") pod \"service-ca-operator-d6fc45fc5-9hxw9\" (UID: \"d5f05570-1fb6-4cf8-b42b-05a9a4e6ac9e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9hxw9" Apr 20 23:12:33.134916 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.134893 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5f05570-1fb6-4cf8-b42b-05a9a4e6ac9e-serving-cert\") pod \"service-ca-operator-d6fc45fc5-9hxw9\" (UID: \"d5f05570-1fb6-4cf8-b42b-05a9a4e6ac9e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9hxw9" Apr 20 23:12:33.152485 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.152456 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdp49\" (UniqueName: \"kubernetes.io/projected/d5f05570-1fb6-4cf8-b42b-05a9a4e6ac9e-kube-api-access-zdp49\") pod \"service-ca-operator-d6fc45fc5-9hxw9\" (UID: \"d5f05570-1fb6-4cf8-b42b-05a9a4e6ac9e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9hxw9" Apr 20 23:12:33.229989 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.229949 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vl8bb\" (UniqueName: \"kubernetes.io/projected/b305831b-81a2-4f94-9080-df8443be7ee7-kube-api-access-vl8bb\") pod 
\"network-check-source-8894fc9bd-kj9pq\" (UID: \"b305831b-81a2-4f94-9080-df8443be7ee7\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kj9pq" Apr 20 23:12:33.230184 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.229996 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e672b75e-249e-4dd5-928c-641987109f81-image-registry-private-configuration\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" Apr 20 23:12:33.230184 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.230020 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rlclr\" (UniqueName: \"kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-kube-api-access-rlclr\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" Apr 20 23:12:33.230184 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.230042 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e672b75e-249e-4dd5-928c-641987109f81-registry-certificates\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" Apr 20 23:12:33.230184 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.230070 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e672b75e-249e-4dd5-928c-641987109f81-trusted-ca\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" Apr 20 23:12:33.230404 ip-10-0-137-139 
kubenswrapper[2576]: I0420 23:12:33.230267 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa5675ad-e501-4f6b-a732-4d1db6454f9a-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-xg99l\" (UID: \"fa5675ad-e501-4f6b-a732-4d1db6454f9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xg99l" Apr 20 23:12:33.230404 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.230330 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-bound-sa-token\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" Apr 20 23:12:33.230404 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.230352 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e672b75e-249e-4dd5-928c-641987109f81-ca-trust-extracted\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" Apr 20 23:12:33.230404 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.230373 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e672b75e-249e-4dd5-928c-641987109f81-installation-pull-secrets\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" Apr 20 23:12:33.230588 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.230429 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fa5675ad-e501-4f6b-a732-4d1db6454f9a-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-xg99l\" (UID: \"fa5675ad-e501-4f6b-a732-4d1db6454f9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xg99l" Apr 20 23:12:33.230588 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.230456 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hp2f7\" (UniqueName: \"kubernetes.io/projected/fa5675ad-e501-4f6b-a732-4d1db6454f9a-kube-api-access-hp2f7\") pod \"kube-storage-version-migrator-operator-6769c5d45-xg99l\" (UID: \"fa5675ad-e501-4f6b-a732-4d1db6454f9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xg99l" Apr 20 23:12:33.230588 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.230488 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-h242g\" (UID: \"b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h242g" Apr 20 23:12:33.230588 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.230512 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-registry-tls\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" Apr 20 23:12:33.230588 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.230557 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726-nginx-conf\") pod 
\"networking-console-plugin-cb95c66f6-h242g\" (UID: \"b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h242g" Apr 20 23:12:33.230838 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.230813 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e672b75e-249e-4dd5-928c-641987109f81-ca-trust-extracted\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" Apr 20 23:12:33.230936 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:33.230918 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 23:12:33.230994 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.230951 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e672b75e-249e-4dd5-928c-641987109f81-registry-certificates\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" Apr 20 23:12:33.231054 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:33.230999 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726-networking-console-plugin-cert podName:b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:33.730979048 +0000 UTC m=+33.675688496 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-h242g" (UID: "b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726") : secret "networking-console-plugin-cert" not found Apr 20 23:12:33.231111 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:33.231053 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 23:12:33.231111 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:33.231066 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c46bdb5c9-xflgd: secret "image-registry-tls" not found Apr 20 23:12:33.231230 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:33.231123 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-registry-tls podName:e672b75e-249e-4dd5-928c-641987109f81 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:33.731105419 +0000 UTC m=+33.675814854 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-registry-tls") pod "image-registry-6c46bdb5c9-xflgd" (UID: "e672b75e-249e-4dd5-928c-641987109f81") : secret "image-registry-tls" not found Apr 20 23:12:33.231291 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.231251 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa5675ad-e501-4f6b-a732-4d1db6454f9a-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-xg99l\" (UID: \"fa5675ad-e501-4f6b-a732-4d1db6454f9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xg99l" Apr 20 23:12:33.231350 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.231310 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-h242g\" (UID: \"b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h242g" Apr 20 23:12:33.233155 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.233114 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e672b75e-249e-4dd5-928c-641987109f81-installation-pull-secrets\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" Apr 20 23:12:33.233448 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.233422 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa5675ad-e501-4f6b-a732-4d1db6454f9a-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-xg99l\" (UID: \"fa5675ad-e501-4f6b-a732-4d1db6454f9a\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xg99l" Apr 20 23:12:33.233448 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.233433 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e672b75e-249e-4dd5-928c-641987109f81-image-registry-private-configuration\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" Apr 20 23:12:33.242064 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.242040 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlclr\" (UniqueName: \"kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-kube-api-access-rlclr\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" Apr 20 23:12:33.243443 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.243419 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-bound-sa-token\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" Apr 20 23:12:33.243610 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.243588 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl8bb\" (UniqueName: \"kubernetes.io/projected/b305831b-81a2-4f94-9080-df8443be7ee7-kube-api-access-vl8bb\") pod \"network-check-source-8894fc9bd-kj9pq\" (UID: \"b305831b-81a2-4f94-9080-df8443be7ee7\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kj9pq" Apr 20 23:12:33.245353 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.245330 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e672b75e-249e-4dd5-928c-641987109f81-trusted-ca\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" Apr 20 23:12:33.246399 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.246375 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp2f7\" (UniqueName: \"kubernetes.io/projected/fa5675ad-e501-4f6b-a732-4d1db6454f9a-kube-api-access-hp2f7\") pod \"kube-storage-version-migrator-operator-6769c5d45-xg99l\" (UID: \"fa5675ad-e501-4f6b-a732-4d1db6454f9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xg99l" Apr 20 23:12:33.248337 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.248298 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9hxw9" Apr 20 23:12:33.277238 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.277206 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-c6jf8"] Apr 20 23:12:33.284415 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.284387 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xg99l" Apr 20 23:12:33.300501 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.300473 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-dzpv8"] Apr 20 23:12:33.300662 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.300641 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-c6jf8" Apr 20 23:12:33.303379 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.303358 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 23:12:33.309774 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.309751 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-c6jf8"] Apr 20 23:12:33.309774 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.309778 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dzpv8"] Apr 20 23:12:33.309933 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.309897 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dzpv8" Apr 20 23:12:33.312260 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.312238 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 23:12:33.312375 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.312247 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 23:12:33.312375 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.312263 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2fmt6\"" Apr 20 23:12:33.314331 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.314314 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kj9pq" Apr 20 23:12:33.344912 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.344885 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2mlnp"] Apr 20 23:12:33.360226 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.360189 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2mlnp"] Apr 20 23:12:33.360406 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.360313 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2mlnp" Apr 20 23:12:33.362715 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.362694 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 23:12:33.363039 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.363020 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4dh2j\"" Apr 20 23:12:33.363165 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.363043 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 23:12:33.368225 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.368202 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 23:12:33.432431 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.432348 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9dc3bbec-6557-4eef-8534-77d4570a546d-dbus\") pod \"global-pull-secret-syncer-c6jf8\" (UID: \"9dc3bbec-6557-4eef-8534-77d4570a546d\") " pod="kube-system/global-pull-secret-syncer-c6jf8" Apr 20 23:12:33.432431 
ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.432417 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9dc3bbec-6557-4eef-8534-77d4570a546d-original-pull-secret\") pod \"global-pull-secret-syncer-c6jf8\" (UID: \"9dc3bbec-6557-4eef-8534-77d4570a546d\") " pod="kube-system/global-pull-secret-syncer-c6jf8" Apr 20 23:12:33.432617 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.432482 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9dc3bbec-6557-4eef-8534-77d4570a546d-kubelet-config\") pod \"global-pull-secret-syncer-c6jf8\" (UID: \"9dc3bbec-6557-4eef-8534-77d4570a546d\") " pod="kube-system/global-pull-secret-syncer-c6jf8" Apr 20 23:12:33.432617 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.432536 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/40424f06-d2df-4a03-bc2c-0af9d8b4e184-tmp-dir\") pod \"dns-default-dzpv8\" (UID: \"40424f06-d2df-4a03-bc2c-0af9d8b4e184\") " pod="openshift-dns/dns-default-dzpv8" Apr 20 23:12:33.432617 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.432580 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn7r9\" (UniqueName: \"kubernetes.io/projected/40424f06-d2df-4a03-bc2c-0af9d8b4e184-kube-api-access-vn7r9\") pod \"dns-default-dzpv8\" (UID: \"40424f06-d2df-4a03-bc2c-0af9d8b4e184\") " pod="openshift-dns/dns-default-dzpv8" Apr 20 23:12:33.432750 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.432623 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40424f06-d2df-4a03-bc2c-0af9d8b4e184-config-volume\") pod \"dns-default-dzpv8\" (UID: 
\"40424f06-d2df-4a03-bc2c-0af9d8b4e184\") " pod="openshift-dns/dns-default-dzpv8" Apr 20 23:12:33.432750 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.432648 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40424f06-d2df-4a03-bc2c-0af9d8b4e184-metrics-tls\") pod \"dns-default-dzpv8\" (UID: \"40424f06-d2df-4a03-bc2c-0af9d8b4e184\") " pod="openshift-dns/dns-default-dzpv8" Apr 20 23:12:33.533960 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.533926 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9dc3bbec-6557-4eef-8534-77d4570a546d-dbus\") pod \"global-pull-secret-syncer-c6jf8\" (UID: \"9dc3bbec-6557-4eef-8534-77d4570a546d\") " pod="kube-system/global-pull-secret-syncer-c6jf8" Apr 20 23:12:33.534125 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.533980 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/204aec3c-9787-4646-abd7-68cf7063e0c5-cert\") pod \"ingress-canary-2mlnp\" (UID: \"204aec3c-9787-4646-abd7-68cf7063e0c5\") " pod="openshift-ingress-canary/ingress-canary-2mlnp" Apr 20 23:12:33.534125 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.534017 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9dc3bbec-6557-4eef-8534-77d4570a546d-original-pull-secret\") pod \"global-pull-secret-syncer-c6jf8\" (UID: \"9dc3bbec-6557-4eef-8534-77d4570a546d\") " pod="kube-system/global-pull-secret-syncer-c6jf8" Apr 20 23:12:33.534125 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.534050 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9dc3bbec-6557-4eef-8534-77d4570a546d-kubelet-config\") pod 
\"global-pull-secret-syncer-c6jf8\" (UID: \"9dc3bbec-6557-4eef-8534-77d4570a546d\") " pod="kube-system/global-pull-secret-syncer-c6jf8" Apr 20 23:12:33.534125 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.534076 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/40424f06-d2df-4a03-bc2c-0af9d8b4e184-tmp-dir\") pod \"dns-default-dzpv8\" (UID: \"40424f06-d2df-4a03-bc2c-0af9d8b4e184\") " pod="openshift-dns/dns-default-dzpv8" Apr 20 23:12:33.534125 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.534100 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spjfw\" (UniqueName: \"kubernetes.io/projected/204aec3c-9787-4646-abd7-68cf7063e0c5-kube-api-access-spjfw\") pod \"ingress-canary-2mlnp\" (UID: \"204aec3c-9787-4646-abd7-68cf7063e0c5\") " pod="openshift-ingress-canary/ingress-canary-2mlnp" Apr 20 23:12:33.534357 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.534128 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vn7r9\" (UniqueName: \"kubernetes.io/projected/40424f06-d2df-4a03-bc2c-0af9d8b4e184-kube-api-access-vn7r9\") pod \"dns-default-dzpv8\" (UID: \"40424f06-d2df-4a03-bc2c-0af9d8b4e184\") " pod="openshift-dns/dns-default-dzpv8" Apr 20 23:12:33.534357 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.534174 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9dc3bbec-6557-4eef-8534-77d4570a546d-kubelet-config\") pod \"global-pull-secret-syncer-c6jf8\" (UID: \"9dc3bbec-6557-4eef-8534-77d4570a546d\") " pod="kube-system/global-pull-secret-syncer-c6jf8" Apr 20 23:12:33.534357 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.534182 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/40424f06-d2df-4a03-bc2c-0af9d8b4e184-config-volume\") pod \"dns-default-dzpv8\" (UID: \"40424f06-d2df-4a03-bc2c-0af9d8b4e184\") " pod="openshift-dns/dns-default-dzpv8" Apr 20 23:12:33.534357 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.534210 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9dc3bbec-6557-4eef-8534-77d4570a546d-dbus\") pod \"global-pull-secret-syncer-c6jf8\" (UID: \"9dc3bbec-6557-4eef-8534-77d4570a546d\") " pod="kube-system/global-pull-secret-syncer-c6jf8" Apr 20 23:12:33.534357 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.534233 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40424f06-d2df-4a03-bc2c-0af9d8b4e184-metrics-tls\") pod \"dns-default-dzpv8\" (UID: \"40424f06-d2df-4a03-bc2c-0af9d8b4e184\") " pod="openshift-dns/dns-default-dzpv8" Apr 20 23:12:33.534357 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:33.534342 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 23:12:33.534580 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:33.534399 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40424f06-d2df-4a03-bc2c-0af9d8b4e184-metrics-tls podName:40424f06-d2df-4a03-bc2c-0af9d8b4e184 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:34.034379546 +0000 UTC m=+33.979088978 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/40424f06-d2df-4a03-bc2c-0af9d8b4e184-metrics-tls") pod "dns-default-dzpv8" (UID: "40424f06-d2df-4a03-bc2c-0af9d8b4e184") : secret "dns-default-metrics-tls" not found Apr 20 23:12:33.534652 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.534576 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/40424f06-d2df-4a03-bc2c-0af9d8b4e184-tmp-dir\") pod \"dns-default-dzpv8\" (UID: \"40424f06-d2df-4a03-bc2c-0af9d8b4e184\") " pod="openshift-dns/dns-default-dzpv8" Apr 20 23:12:33.534759 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.534737 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40424f06-d2df-4a03-bc2c-0af9d8b4e184-config-volume\") pod \"dns-default-dzpv8\" (UID: \"40424f06-d2df-4a03-bc2c-0af9d8b4e184\") " pod="openshift-dns/dns-default-dzpv8" Apr 20 23:12:33.536561 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.536543 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9dc3bbec-6557-4eef-8534-77d4570a546d-original-pull-secret\") pod \"global-pull-secret-syncer-c6jf8\" (UID: \"9dc3bbec-6557-4eef-8534-77d4570a546d\") " pod="kube-system/global-pull-secret-syncer-c6jf8" Apr 20 23:12:33.543816 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.543797 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn7r9\" (UniqueName: \"kubernetes.io/projected/40424f06-d2df-4a03-bc2c-0af9d8b4e184-kube-api-access-vn7r9\") pod \"dns-default-dzpv8\" (UID: \"40424f06-d2df-4a03-bc2c-0af9d8b4e184\") " pod="openshift-dns/dns-default-dzpv8" Apr 20 23:12:33.611175 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.611125 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-c6jf8" Apr 20 23:12:33.635462 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.635435 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/204aec3c-9787-4646-abd7-68cf7063e0c5-cert\") pod \"ingress-canary-2mlnp\" (UID: \"204aec3c-9787-4646-abd7-68cf7063e0c5\") " pod="openshift-ingress-canary/ingress-canary-2mlnp" Apr 20 23:12:33.635598 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.635492 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-spjfw\" (UniqueName: \"kubernetes.io/projected/204aec3c-9787-4646-abd7-68cf7063e0c5-kube-api-access-spjfw\") pod \"ingress-canary-2mlnp\" (UID: \"204aec3c-9787-4646-abd7-68cf7063e0c5\") " pod="openshift-ingress-canary/ingress-canary-2mlnp" Apr 20 23:12:33.635657 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:33.635590 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 23:12:33.635717 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:33.635676 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/204aec3c-9787-4646-abd7-68cf7063e0c5-cert podName:204aec3c-9787-4646-abd7-68cf7063e0c5 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:34.135651575 +0000 UTC m=+34.080361018 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/204aec3c-9787-4646-abd7-68cf7063e0c5-cert") pod "ingress-canary-2mlnp" (UID: "204aec3c-9787-4646-abd7-68cf7063e0c5") : secret "canary-serving-cert" not found Apr 20 23:12:33.635915 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.635892 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-plw6c" Apr 20 23:12:33.636019 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.635893 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rvb5h" Apr 20 23:12:33.638571 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.638551 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-g6bnb\"" Apr 20 23:12:33.638676 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.638581 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 23:12:33.638676 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.638621 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-sk2w6\"" Apr 20 23:12:33.645458 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.645440 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-spjfw\" (UniqueName: \"kubernetes.io/projected/204aec3c-9787-4646-abd7-68cf7063e0c5-kube-api-access-spjfw\") pod \"ingress-canary-2mlnp\" (UID: \"204aec3c-9787-4646-abd7-68cf7063e0c5\") " pod="openshift-ingress-canary/ingress-canary-2mlnp" Apr 20 23:12:33.736949 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.736858 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-h242g\" (UID: \"b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h242g" Apr 20 23:12:33.736949 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:33.736905 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-registry-tls\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" Apr 20 23:12:33.737194 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:33.736996 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 23:12:33.737194 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:33.737057 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 23:12:33.737194 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:33.737072 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c46bdb5c9-xflgd: secret "image-registry-tls" not found Apr 20 23:12:33.737194 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:33.737079 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726-networking-console-plugin-cert podName:b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:34.737054579 +0000 UTC m=+34.681764027 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-h242g" (UID: "b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726") : secret "networking-console-plugin-cert" not found Apr 20 23:12:33.737194 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:33.737116 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-registry-tls podName:e672b75e-249e-4dd5-928c-641987109f81 nodeName:}" failed. 
No retries permitted until 2026-04-20 23:12:34.73710196 +0000 UTC m=+34.681811407 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-registry-tls") pod "image-registry-6c46bdb5c9-xflgd" (UID: "e672b75e-249e-4dd5-928c-641987109f81") : secret "image-registry-tls" not found Apr 20 23:12:34.039932 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:34.039834 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40424f06-d2df-4a03-bc2c-0af9d8b4e184-metrics-tls\") pod \"dns-default-dzpv8\" (UID: \"40424f06-d2df-4a03-bc2c-0af9d8b4e184\") " pod="openshift-dns/dns-default-dzpv8" Apr 20 23:12:34.040628 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:34.040039 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 23:12:34.040628 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:34.040098 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40424f06-d2df-4a03-bc2c-0af9d8b4e184-metrics-tls podName:40424f06-d2df-4a03-bc2c-0af9d8b4e184 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:35.04008535 +0000 UTC m=+34.984794778 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/40424f06-d2df-4a03-bc2c-0af9d8b4e184-metrics-tls") pod "dns-default-dzpv8" (UID: "40424f06-d2df-4a03-bc2c-0af9d8b4e184") : secret "dns-default-metrics-tls" not found Apr 20 23:12:34.141057 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:34.140987 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/204aec3c-9787-4646-abd7-68cf7063e0c5-cert\") pod \"ingress-canary-2mlnp\" (UID: \"204aec3c-9787-4646-abd7-68cf7063e0c5\") " pod="openshift-ingress-canary/ingress-canary-2mlnp" Apr 20 23:12:34.141237 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:34.141176 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 23:12:34.141303 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:34.141277 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/204aec3c-9787-4646-abd7-68cf7063e0c5-cert podName:204aec3c-9787-4646-abd7-68cf7063e0c5 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:35.141252702 +0000 UTC m=+35.085962161 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/204aec3c-9787-4646-abd7-68cf7063e0c5-cert") pod "ingress-canary-2mlnp" (UID: "204aec3c-9787-4646-abd7-68cf7063e0c5") : secret "canary-serving-cert" not found Apr 20 23:12:34.444371 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:34.444286 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6e1d353-f530-4ad5-a0ae-b436e227eb58-metrics-certs\") pod \"network-metrics-daemon-rvb5h\" (UID: \"e6e1d353-f530-4ad5-a0ae-b436e227eb58\") " pod="openshift-multus/network-metrics-daemon-rvb5h" Apr 20 23:12:34.444525 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:34.444480 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 23:12:34.444569 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:34.444561 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6e1d353-f530-4ad5-a0ae-b436e227eb58-metrics-certs podName:e6e1d353-f530-4ad5-a0ae-b436e227eb58 nodeName:}" failed. No retries permitted until 2026-04-20 23:13:06.444540692 +0000 UTC m=+66.389250125 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e6e1d353-f530-4ad5-a0ae-b436e227eb58-metrics-certs") pod "network-metrics-daemon-rvb5h" (UID: "e6e1d353-f530-4ad5-a0ae-b436e227eb58") : secret "metrics-daemon-secret" not found Apr 20 23:12:34.545304 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:34.545265 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mx5qt\" (UniqueName: \"kubernetes.io/projected/4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98-kube-api-access-mx5qt\") pod \"network-check-target-plw6c\" (UID: \"4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98\") " pod="openshift-network-diagnostics/network-check-target-plw6c" Apr 20 23:12:34.548106 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:34.548078 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx5qt\" (UniqueName: \"kubernetes.io/projected/4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98-kube-api-access-mx5qt\") pod \"network-check-target-plw6c\" (UID: \"4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98\") " pod="openshift-network-diagnostics/network-check-target-plw6c" Apr 20 23:12:34.747765 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:34.747682 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-h242g\" (UID: \"b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h242g" Apr 20 23:12:34.747765 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:34.747722 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-registry-tls\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " 
pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" Apr 20 23:12:34.747963 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:34.747842 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 23:12:34.747963 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:34.747880 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 23:12:34.747963 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:34.747894 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c46bdb5c9-xflgd: secret "image-registry-tls" not found Apr 20 23:12:34.747963 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:34.747920 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726-networking-console-plugin-cert podName:b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:36.747898917 +0000 UTC m=+36.692608349 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-h242g" (UID: "b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726") : secret "networking-console-plugin-cert" not found Apr 20 23:12:34.747963 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:34.747944 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-registry-tls podName:e672b75e-249e-4dd5-928c-641987109f81 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:36.747934265 +0000 UTC m=+36.692643698 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-registry-tls") pod "image-registry-6c46bdb5c9-xflgd" (UID: "e672b75e-249e-4dd5-928c-641987109f81") : secret "image-registry-tls" not found Apr 20 23:12:34.846973 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:34.846937 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-plw6c" Apr 20 23:12:35.050031 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:35.049953 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40424f06-d2df-4a03-bc2c-0af9d8b4e184-metrics-tls\") pod \"dns-default-dzpv8\" (UID: \"40424f06-d2df-4a03-bc2c-0af9d8b4e184\") " pod="openshift-dns/dns-default-dzpv8" Apr 20 23:12:35.050445 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:35.050109 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 23:12:35.050445 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:35.050202 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40424f06-d2df-4a03-bc2c-0af9d8b4e184-metrics-tls podName:40424f06-d2df-4a03-bc2c-0af9d8b4e184 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:37.050181866 +0000 UTC m=+36.994891295 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/40424f06-d2df-4a03-bc2c-0af9d8b4e184-metrics-tls") pod "dns-default-dzpv8" (UID: "40424f06-d2df-4a03-bc2c-0af9d8b4e184") : secret "dns-default-metrics-tls" not found Apr 20 23:12:35.150468 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:35.150426 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/204aec3c-9787-4646-abd7-68cf7063e0c5-cert\") pod \"ingress-canary-2mlnp\" (UID: \"204aec3c-9787-4646-abd7-68cf7063e0c5\") " pod="openshift-ingress-canary/ingress-canary-2mlnp" Apr 20 23:12:35.150647 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:35.150596 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 23:12:35.150696 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:35.150671 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/204aec3c-9787-4646-abd7-68cf7063e0c5-cert podName:204aec3c-9787-4646-abd7-68cf7063e0c5 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:37.150652929 +0000 UTC m=+37.095362372 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/204aec3c-9787-4646-abd7-68cf7063e0c5-cert") pod "ingress-canary-2mlnp" (UID: "204aec3c-9787-4646-abd7-68cf7063e0c5") : secret "canary-serving-cert" not found Apr 20 23:12:35.508563 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:35.508324 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xg99l"] Apr 20 23:12:35.512778 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:35.512738 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-c6jf8"] Apr 20 23:12:35.513540 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:35.513516 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-kj9pq"] Apr 20 23:12:35.521278 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:35.521251 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9hxw9"] Apr 20 23:12:35.523095 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:35.523058 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa5675ad_e501_4f6b_a732_4d1db6454f9a.slice/crio-e1af3bb8102e519b646029fd7aafee291dac2ea96b19123a0d3830968f617de4 WatchSource:0}: Error finding container e1af3bb8102e519b646029fd7aafee291dac2ea96b19123a0d3830968f617de4: Status 404 returned error can't find the container with id e1af3bb8102e519b646029fd7aafee291dac2ea96b19123a0d3830968f617de4 Apr 20 23:12:35.524271 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:35.524247 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-plw6c"] Apr 20 23:12:35.526022 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:35.525992 2576 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9dc3bbec_6557_4eef_8534_77d4570a546d.slice/crio-fd1113ea466c1a9d3e6eb43005f189496b45d978882beac40d96f1afa5d268e2 WatchSource:0}: Error finding container fd1113ea466c1a9d3e6eb43005f189496b45d978882beac40d96f1afa5d268e2: Status 404 returned error can't find the container with id fd1113ea466c1a9d3e6eb43005f189496b45d978882beac40d96f1afa5d268e2 Apr 20 23:12:35.526469 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:35.526383 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5f05570_1fb6_4cf8_b42b_05a9a4e6ac9e.slice/crio-a5394e9d13fb2848a66fb5238951d79b5ed19618c667b8733f63564b91a98714 WatchSource:0}: Error finding container a5394e9d13fb2848a66fb5238951d79b5ed19618c667b8733f63564b91a98714: Status 404 returned error can't find the container with id a5394e9d13fb2848a66fb5238951d79b5ed19618c667b8733f63564b91a98714 Apr 20 23:12:35.527486 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:35.527463 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a6bf5ff_5cb3_4488_b6f3_a6d5e5680a98.slice/crio-6cdf4347b515415e5f6f19ef23851cd8b451928f124bfccbad68c81af8a9556a WatchSource:0}: Error finding container 6cdf4347b515415e5f6f19ef23851cd8b451928f124bfccbad68c81af8a9556a: Status 404 returned error can't find the container with id 6cdf4347b515415e5f6f19ef23851cd8b451928f124bfccbad68c81af8a9556a Apr 20 23:12:35.779159 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:35.779108 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kj9pq" event={"ID":"b305831b-81a2-4f94-9080-df8443be7ee7","Type":"ContainerStarted","Data":"fe1572772518e3568204a735939e0cca78be052817627639830d971e0e93d4d5"} Apr 20 23:12:35.780264 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:35.780239 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-plw6c" event={"ID":"4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98","Type":"ContainerStarted","Data":"6cdf4347b515415e5f6f19ef23851cd8b451928f124bfccbad68c81af8a9556a"} Apr 20 23:12:35.782719 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:35.782688 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdvqj" event={"ID":"97d5b486-1141-4ce1-b800-263ccf62a8cd","Type":"ContainerStarted","Data":"8f41276b3a9e62037fc6baaa1d1957de5c281a0063512a6dcb376fbca4b05f62"} Apr 20 23:12:35.783778 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:35.783745 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xg99l" event={"ID":"fa5675ad-e501-4f6b-a732-4d1db6454f9a","Type":"ContainerStarted","Data":"e1af3bb8102e519b646029fd7aafee291dac2ea96b19123a0d3830968f617de4"} Apr 20 23:12:35.784815 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:35.784785 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-c6jf8" event={"ID":"9dc3bbec-6557-4eef-8534-77d4570a546d","Type":"ContainerStarted","Data":"fd1113ea466c1a9d3e6eb43005f189496b45d978882beac40d96f1afa5d268e2"} Apr 20 23:12:35.785756 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:35.785732 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9hxw9" event={"ID":"d5f05570-1fb6-4cf8-b42b-05a9a4e6ac9e","Type":"ContainerStarted","Data":"a5394e9d13fb2848a66fb5238951d79b5ed19618c667b8733f63564b91a98714"} Apr 20 23:12:36.766101 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:36.765482 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-h242g\" (UID: \"b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h242g" Apr 20 23:12:36.766101 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:36.765545 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-registry-tls\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" Apr 20 23:12:36.766101 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:36.765707 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 23:12:36.766101 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:36.765725 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c46bdb5c9-xflgd: secret "image-registry-tls" not found Apr 20 23:12:36.766101 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:36.765784 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-registry-tls podName:e672b75e-249e-4dd5-928c-641987109f81 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:40.765765549 +0000 UTC m=+40.710474982 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-registry-tls") pod "image-registry-6c46bdb5c9-xflgd" (UID: "e672b75e-249e-4dd5-928c-641987109f81") : secret "image-registry-tls" not found
Apr 20 23:12:36.766825 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:36.766209 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 23:12:36.766825 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:36.766288 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726-networking-console-plugin-cert podName:b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:40.766259566 +0000 UTC m=+40.710969008 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-h242g" (UID: "b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726") : secret "networking-console-plugin-cert" not found
Apr 20 23:12:36.792726 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:36.792441 2576 generic.go:358] "Generic (PLEG): container finished" podID="97d5b486-1141-4ce1-b800-263ccf62a8cd" containerID="8f41276b3a9e62037fc6baaa1d1957de5c281a0063512a6dcb376fbca4b05f62" exitCode=0
Apr 20 23:12:36.792726 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:36.792517 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdvqj" event={"ID":"97d5b486-1141-4ce1-b800-263ccf62a8cd","Type":"ContainerDied","Data":"8f41276b3a9e62037fc6baaa1d1957de5c281a0063512a6dcb376fbca4b05f62"}
Apr 20 23:12:37.068779 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:37.068744 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40424f06-d2df-4a03-bc2c-0af9d8b4e184-metrics-tls\") pod \"dns-default-dzpv8\" (UID: \"40424f06-d2df-4a03-bc2c-0af9d8b4e184\") " pod="openshift-dns/dns-default-dzpv8"
Apr 20 23:12:37.068942 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:37.068901 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 23:12:37.069011 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:37.068961 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40424f06-d2df-4a03-bc2c-0af9d8b4e184-metrics-tls podName:40424f06-d2df-4a03-bc2c-0af9d8b4e184 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:41.068941995 +0000 UTC m=+41.013651430 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/40424f06-d2df-4a03-bc2c-0af9d8b4e184-metrics-tls") pod "dns-default-dzpv8" (UID: "40424f06-d2df-4a03-bc2c-0af9d8b4e184") : secret "dns-default-metrics-tls" not found
Apr 20 23:12:37.170215 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:37.170177 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/204aec3c-9787-4646-abd7-68cf7063e0c5-cert\") pod \"ingress-canary-2mlnp\" (UID: \"204aec3c-9787-4646-abd7-68cf7063e0c5\") " pod="openshift-ingress-canary/ingress-canary-2mlnp"
Apr 20 23:12:37.170394 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:37.170348 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 23:12:37.170452 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:37.170413 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/204aec3c-9787-4646-abd7-68cf7063e0c5-cert podName:204aec3c-9787-4646-abd7-68cf7063e0c5 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:41.170394179 +0000 UTC m=+41.115103629 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/204aec3c-9787-4646-abd7-68cf7063e0c5-cert") pod "ingress-canary-2mlnp" (UID: "204aec3c-9787-4646-abd7-68cf7063e0c5") : secret "canary-serving-cert" not found
Apr 20 23:12:37.800867 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:37.799847 2576 generic.go:358] "Generic (PLEG): container finished" podID="97d5b486-1141-4ce1-b800-263ccf62a8cd" containerID="b672c9e4b41e51dae33b17863ad108898aa81ad3612b1f720ba021b872deaaa0" exitCode=0
Apr 20 23:12:37.800867 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:37.799923 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdvqj" event={"ID":"97d5b486-1141-4ce1-b800-263ccf62a8cd","Type":"ContainerDied","Data":"b672c9e4b41e51dae33b17863ad108898aa81ad3612b1f720ba021b872deaaa0"}
Apr 20 23:12:40.801902 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:40.801864 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-h242g\" (UID: \"b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h242g"
Apr 20 23:12:40.802337 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:40.801910 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-registry-tls\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd"
Apr 20 23:12:40.802337 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:40.802022 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 23:12:40.802337 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:40.802049 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 23:12:40.802337 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:40.802061 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c46bdb5c9-xflgd: secret "image-registry-tls" not found
Apr 20 23:12:40.802337 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:40.802094 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726-networking-console-plugin-cert podName:b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:48.802075138 +0000 UTC m=+48.746784585 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-h242g" (UID: "b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726") : secret "networking-console-plugin-cert" not found
Apr 20 23:12:40.802337 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:40.802112 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-registry-tls podName:e672b75e-249e-4dd5-928c-641987109f81 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:48.802104301 +0000 UTC m=+48.746813733 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-registry-tls") pod "image-registry-6c46bdb5c9-xflgd" (UID: "e672b75e-249e-4dd5-928c-641987109f81") : secret "image-registry-tls" not found
Apr 20 23:12:41.104290 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:41.104195 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40424f06-d2df-4a03-bc2c-0af9d8b4e184-metrics-tls\") pod \"dns-default-dzpv8\" (UID: \"40424f06-d2df-4a03-bc2c-0af9d8b4e184\") " pod="openshift-dns/dns-default-dzpv8"
Apr 20 23:12:41.104450 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:41.104379 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 23:12:41.104493 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:41.104466 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40424f06-d2df-4a03-bc2c-0af9d8b4e184-metrics-tls podName:40424f06-d2df-4a03-bc2c-0af9d8b4e184 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:49.10444198 +0000 UTC m=+49.049151413 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/40424f06-d2df-4a03-bc2c-0af9d8b4e184-metrics-tls") pod "dns-default-dzpv8" (UID: "40424f06-d2df-4a03-bc2c-0af9d8b4e184") : secret "dns-default-metrics-tls" not found
Apr 20 23:12:41.205062 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:41.205026 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/204aec3c-9787-4646-abd7-68cf7063e0c5-cert\") pod \"ingress-canary-2mlnp\" (UID: \"204aec3c-9787-4646-abd7-68cf7063e0c5\") " pod="openshift-ingress-canary/ingress-canary-2mlnp"
Apr 20 23:12:41.205235 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:41.205201 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 23:12:41.205297 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:41.205274 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/204aec3c-9787-4646-abd7-68cf7063e0c5-cert podName:204aec3c-9787-4646-abd7-68cf7063e0c5 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:49.205253701 +0000 UTC m=+49.149963132 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/204aec3c-9787-4646-abd7-68cf7063e0c5-cert") pod "ingress-canary-2mlnp" (UID: "204aec3c-9787-4646-abd7-68cf7063e0c5") : secret "canary-serving-cert" not found
Apr 20 23:12:42.813021 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:42.812925 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kj9pq" event={"ID":"b305831b-81a2-4f94-9080-df8443be7ee7","Type":"ContainerStarted","Data":"ea4c6b535ab4bad9f3ec69de291456d37daa8047d6860db52171d101c20d8123"}
Apr 20 23:12:42.814733 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:42.814700 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-plw6c" event={"ID":"4a6bf5ff-5cb3-4488-b6f3-a6d5e5680a98","Type":"ContainerStarted","Data":"805053c64235afeeecfcf54456846d5e581acf33571b29a364e82f0ddca5af24"}
Apr 20 23:12:42.814926 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:42.814850 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-plw6c"
Apr 20 23:12:42.818571 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:42.818536 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdvqj" event={"ID":"97d5b486-1141-4ce1-b800-263ccf62a8cd","Type":"ContainerStarted","Data":"03c64f69ea25e63f0f419f14e734b9d33f4a9701150f061025c1ce4d9afb5cac"}
Apr 20 23:12:42.819974 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:42.819952 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xg99l" event={"ID":"fa5675ad-e501-4f6b-a732-4d1db6454f9a","Type":"ContainerStarted","Data":"bb464a9df8660d01f83ac634b4aa521d7904b48602369ad18edc42bec05a0640"}
Apr 20 23:12:42.821475 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:42.821453 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-c6jf8" event={"ID":"9dc3bbec-6557-4eef-8534-77d4570a546d","Type":"ContainerStarted","Data":"031aa510c36f053c2d34f7fbf3528ea6afecc807bd6ce20ada5fdf60b3e81f84"}
Apr 20 23:12:42.823456 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:42.823430 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9hxw9" event={"ID":"d5f05570-1fb6-4cf8-b42b-05a9a4e6ac9e","Type":"ContainerStarted","Data":"cafa95223d7d4e7f82190382bd4eb35d0d7ccfa9e35ca3d1c12da9e11e987797"}
Apr 20 23:12:42.828571 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:42.828525 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kj9pq" podStartSLOduration=31.155505754 podStartE2EDuration="37.828512698s" podCreationTimestamp="2026-04-20 23:12:05 +0000 UTC" firstStartedPulling="2026-04-20 23:12:35.535994271 +0000 UTC m=+35.480703700" lastFinishedPulling="2026-04-20 23:12:42.209001197 +0000 UTC m=+42.153710644" observedRunningTime="2026-04-20 23:12:42.827977382 +0000 UTC m=+42.772686835" watchObservedRunningTime="2026-04-20 23:12:42.828512698 +0000 UTC m=+42.773222150"
Apr 20 23:12:42.842081 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:42.842042 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-plw6c" podStartSLOduration=35.168776252 podStartE2EDuration="41.842029464s" podCreationTimestamp="2026-04-20 23:12:01 +0000 UTC" firstStartedPulling="2026-04-20 23:12:35.53597536 +0000 UTC m=+35.480684789" lastFinishedPulling="2026-04-20 23:12:42.209228562 +0000 UTC m=+42.153938001" observedRunningTime="2026-04-20 23:12:42.841683416 +0000 UTC m=+42.786392868" watchObservedRunningTime="2026-04-20 23:12:42.842029464 +0000 UTC m=+42.786738932"
Apr 20 23:12:42.859253 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:42.859213 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xg99l" podStartSLOduration=33.187405052 podStartE2EDuration="39.859199958s" podCreationTimestamp="2026-04-20 23:12:03 +0000 UTC" firstStartedPulling="2026-04-20 23:12:35.535884409 +0000 UTC m=+35.480593848" lastFinishedPulling="2026-04-20 23:12:42.207679314 +0000 UTC m=+42.152388754" observedRunningTime="2026-04-20 23:12:42.858104241 +0000 UTC m=+42.802813698" watchObservedRunningTime="2026-04-20 23:12:42.859199958 +0000 UTC m=+42.803909410"
Apr 20 23:12:42.879254 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:42.879199 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-qdvqj" podStartSLOduration=10.842179612 podStartE2EDuration="42.87918325s" podCreationTimestamp="2026-04-20 23:12:00 +0000 UTC" firstStartedPulling="2026-04-20 23:12:03.522804649 +0000 UTC m=+3.467514081" lastFinishedPulling="2026-04-20 23:12:35.559808287 +0000 UTC m=+35.504517719" observedRunningTime="2026-04-20 23:12:42.876857658 +0000 UTC m=+42.821567112" watchObservedRunningTime="2026-04-20 23:12:42.87918325 +0000 UTC m=+42.823892701"
Apr 20 23:12:42.889774 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:42.889724 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-c6jf8" podStartSLOduration=2.90157484 podStartE2EDuration="9.88970564s" podCreationTimestamp="2026-04-20 23:12:33 +0000 UTC" firstStartedPulling="2026-04-20 23:12:35.53585483 +0000 UTC m=+35.480564266" lastFinishedPulling="2026-04-20 23:12:42.523985625 +0000 UTC m=+42.468695066" observedRunningTime="2026-04-20 23:12:42.889310638 +0000 UTC m=+42.834020092" watchObservedRunningTime="2026-04-20 23:12:42.88970564 +0000 UTC m=+42.834415095"
Apr 20 23:12:42.906570 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:42.906519 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9hxw9" podStartSLOduration=31.233325029 podStartE2EDuration="37.90650127s" podCreationTimestamp="2026-04-20 23:12:05 +0000 UTC" firstStartedPulling="2026-04-20 23:12:35.535896216 +0000 UTC m=+35.480605660" lastFinishedPulling="2026-04-20 23:12:42.209072453 +0000 UTC m=+42.153781901" observedRunningTime="2026-04-20 23:12:42.905238947 +0000 UTC m=+42.849948399" watchObservedRunningTime="2026-04-20 23:12:42.90650127 +0000 UTC m=+42.851210722"
Apr 20 23:12:44.087217 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:44.087171 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-pr6j7"]
Apr 20 23:12:44.119420 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:44.119385 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-pr6j7"]
Apr 20 23:12:44.119593 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:44.119533 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-pr6j7"
Apr 20 23:12:44.122250 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:44.122222 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 20 23:12:44.122353 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:44.122227 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 20 23:12:44.123225 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:44.123203 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-8x55c\""
Apr 20 23:12:44.228774 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:44.228716 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nkpb\" (UniqueName: \"kubernetes.io/projected/849d7e34-d847-4179-b6ac-2c2766dce9e0-kube-api-access-9nkpb\") pod \"migrator-74bb7799d9-pr6j7\" (UID: \"849d7e34-d847-4179-b6ac-2c2766dce9e0\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-pr6j7"
Apr 20 23:12:44.329264 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:44.329226 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9nkpb\" (UniqueName: \"kubernetes.io/projected/849d7e34-d847-4179-b6ac-2c2766dce9e0-kube-api-access-9nkpb\") pod \"migrator-74bb7799d9-pr6j7\" (UID: \"849d7e34-d847-4179-b6ac-2c2766dce9e0\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-pr6j7"
Apr 20 23:12:44.339187 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:44.339109 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nkpb\" (UniqueName: \"kubernetes.io/projected/849d7e34-d847-4179-b6ac-2c2766dce9e0-kube-api-access-9nkpb\") pod \"migrator-74bb7799d9-pr6j7\" (UID: \"849d7e34-d847-4179-b6ac-2c2766dce9e0\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-pr6j7"
Apr 20 23:12:44.432236 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:44.432201 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-pr6j7"
Apr 20 23:12:44.541766 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:44.541734 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-pr6j7"]
Apr 20 23:12:44.544476 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:44.544447 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod849d7e34_d847_4179_b6ac_2c2766dce9e0.slice/crio-0017aac16bacb18a856cb15906d81b8d0278beb813a13c52816ab23c3bed314d WatchSource:0}: Error finding container 0017aac16bacb18a856cb15906d81b8d0278beb813a13c52816ab23c3bed314d: Status 404 returned error can't find the container with id 0017aac16bacb18a856cb15906d81b8d0278beb813a13c52816ab23c3bed314d
Apr 20 23:12:44.832130 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:44.832103 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-pr6j7" event={"ID":"849d7e34-d847-4179-b6ac-2c2766dce9e0","Type":"ContainerStarted","Data":"0017aac16bacb18a856cb15906d81b8d0278beb813a13c52816ab23c3bed314d"}
Apr 20 23:12:46.237004 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:46.236970 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-ldvkq"]
Apr 20 23:12:46.253465 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:46.253436 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-ldvkq"]
Apr 20 23:12:46.253600 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:46.253574 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-ldvkq"
Apr 20 23:12:46.256080 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:46.256039 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 20 23:12:46.256080 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:46.256057 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 20 23:12:46.256080 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:46.256057 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 20 23:12:46.256352 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:46.256082 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 20 23:12:46.257060 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:46.257046 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-74nzr\""
Apr 20 23:12:46.346008 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:46.345988 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e60de2df-2291-4e3e-8076-2cf6621e05ff-signing-key\") pod \"service-ca-865cb79987-ldvkq\" (UID: \"e60de2df-2291-4e3e-8076-2cf6621e05ff\") " pod="openshift-service-ca/service-ca-865cb79987-ldvkq"
Apr 20 23:12:46.346068 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:46.346019 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e60de2df-2291-4e3e-8076-2cf6621e05ff-signing-cabundle\") pod \"service-ca-865cb79987-ldvkq\" (UID: \"e60de2df-2291-4e3e-8076-2cf6621e05ff\") " pod="openshift-service-ca/service-ca-865cb79987-ldvkq"
Apr 20 23:12:46.346068 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:46.346047 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxktf\" (UniqueName: \"kubernetes.io/projected/e60de2df-2291-4e3e-8076-2cf6621e05ff-kube-api-access-fxktf\") pod \"service-ca-865cb79987-ldvkq\" (UID: \"e60de2df-2291-4e3e-8076-2cf6621e05ff\") " pod="openshift-service-ca/service-ca-865cb79987-ldvkq"
Apr 20 23:12:46.447268 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:46.447226 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e60de2df-2291-4e3e-8076-2cf6621e05ff-signing-key\") pod \"service-ca-865cb79987-ldvkq\" (UID: \"e60de2df-2291-4e3e-8076-2cf6621e05ff\") " pod="openshift-service-ca/service-ca-865cb79987-ldvkq"
Apr 20 23:12:46.447268 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:46.447270 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e60de2df-2291-4e3e-8076-2cf6621e05ff-signing-cabundle\") pod \"service-ca-865cb79987-ldvkq\" (UID: \"e60de2df-2291-4e3e-8076-2cf6621e05ff\") " pod="openshift-service-ca/service-ca-865cb79987-ldvkq"
Apr 20 23:12:46.447536 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:46.447305 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxktf\" (UniqueName: \"kubernetes.io/projected/e60de2df-2291-4e3e-8076-2cf6621e05ff-kube-api-access-fxktf\") pod \"service-ca-865cb79987-ldvkq\" (UID: \"e60de2df-2291-4e3e-8076-2cf6621e05ff\") " pod="openshift-service-ca/service-ca-865cb79987-ldvkq"
Apr 20 23:12:46.447995 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:46.447970 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e60de2df-2291-4e3e-8076-2cf6621e05ff-signing-cabundle\") pod \"service-ca-865cb79987-ldvkq\" (UID: \"e60de2df-2291-4e3e-8076-2cf6621e05ff\") " pod="openshift-service-ca/service-ca-865cb79987-ldvkq"
Apr 20 23:12:46.449559 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:46.449543 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e60de2df-2291-4e3e-8076-2cf6621e05ff-signing-key\") pod \"service-ca-865cb79987-ldvkq\" (UID: \"e60de2df-2291-4e3e-8076-2cf6621e05ff\") " pod="openshift-service-ca/service-ca-865cb79987-ldvkq"
Apr 20 23:12:46.455465 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:46.455445 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxktf\" (UniqueName: \"kubernetes.io/projected/e60de2df-2291-4e3e-8076-2cf6621e05ff-kube-api-access-fxktf\") pod \"service-ca-865cb79987-ldvkq\" (UID: \"e60de2df-2291-4e3e-8076-2cf6621e05ff\") " pod="openshift-service-ca/service-ca-865cb79987-ldvkq"
Apr 20 23:12:46.562828 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:46.562787 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-ldvkq"
Apr 20 23:12:46.676651 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:46.676613 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-ldvkq"]
Apr 20 23:12:46.687981 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:12:46.687951 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode60de2df_2291_4e3e_8076_2cf6621e05ff.slice/crio-4a70b2eb4dc1237b3e23ab2a24db2edaa3b8e0164d1db998a3c5daab792fdbce WatchSource:0}: Error finding container 4a70b2eb4dc1237b3e23ab2a24db2edaa3b8e0164d1db998a3c5daab792fdbce: Status 404 returned error can't find the container with id 4a70b2eb4dc1237b3e23ab2a24db2edaa3b8e0164d1db998a3c5daab792fdbce
Apr 20 23:12:46.838819 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:46.838777 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-ldvkq" event={"ID":"e60de2df-2291-4e3e-8076-2cf6621e05ff","Type":"ContainerStarted","Data":"156ea48902a1fd3b1519fbdd6bcd69ebe71682bf8ce1abe6c6ea730790f829ad"}
Apr 20 23:12:46.838819 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:46.838821 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-ldvkq" event={"ID":"e60de2df-2291-4e3e-8076-2cf6621e05ff","Type":"ContainerStarted","Data":"4a70b2eb4dc1237b3e23ab2a24db2edaa3b8e0164d1db998a3c5daab792fdbce"}
Apr 20 23:12:46.840410 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:46.840381 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-pr6j7" event={"ID":"849d7e34-d847-4179-b6ac-2c2766dce9e0","Type":"ContainerStarted","Data":"d636de31261b143c0234694e2c2e44381ab7ac3f8ea571b6c5e32fae9ed6d240"}
Apr 20 23:12:46.840502 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:46.840416 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-pr6j7" event={"ID":"849d7e34-d847-4179-b6ac-2c2766dce9e0","Type":"ContainerStarted","Data":"4418fd3834846ae6b7de515f927248185133c20e74dda8aeb501fdbbcff96874"}
Apr 20 23:12:46.856760 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:46.856715 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-ldvkq" podStartSLOduration=0.856702155 podStartE2EDuration="856.702155ms" podCreationTimestamp="2026-04-20 23:12:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 23:12:46.855816596 +0000 UTC m=+46.800526072" watchObservedRunningTime="2026-04-20 23:12:46.856702155 +0000 UTC m=+46.801411606"
Apr 20 23:12:46.878294 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:46.878252 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-pr6j7" podStartSLOduration=1.07840461 podStartE2EDuration="2.878238472s" podCreationTimestamp="2026-04-20 23:12:44 +0000 UTC" firstStartedPulling="2026-04-20 23:12:44.546415739 +0000 UTC m=+44.491125171" lastFinishedPulling="2026-04-20 23:12:46.346249591 +0000 UTC m=+46.290959033" observedRunningTime="2026-04-20 23:12:46.878198097 +0000 UTC m=+46.822907549" watchObservedRunningTime="2026-04-20 23:12:46.878238472 +0000 UTC m=+46.822947922"
Apr 20 23:12:48.867407 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:48.867365 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-h242g\" (UID: \"b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h242g"
Apr 20 23:12:48.867853 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:48.867418 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-registry-tls\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd"
Apr 20 23:12:48.867853 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:48.867562 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 23:12:48.867853 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:48.867583 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c46bdb5c9-xflgd: secret "image-registry-tls" not found
Apr 20 23:12:48.867853 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:48.867642 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-registry-tls podName:e672b75e-249e-4dd5-928c-641987109f81 nodeName:}" failed. No retries permitted until 2026-04-20 23:13:04.867622771 +0000 UTC m=+64.812332209 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-registry-tls") pod "image-registry-6c46bdb5c9-xflgd" (UID: "e672b75e-249e-4dd5-928c-641987109f81") : secret "image-registry-tls" not found
Apr 20 23:12:48.868098 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:48.868071 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 23:12:48.868197 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:48.868162 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726-networking-console-plugin-cert podName:b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726 nodeName:}" failed. No retries permitted until 2026-04-20 23:13:04.868123377 +0000 UTC m=+64.812832812 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-h242g" (UID: "b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726") : secret "networking-console-plugin-cert" not found
Apr 20 23:12:49.170696 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:49.170594 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40424f06-d2df-4a03-bc2c-0af9d8b4e184-metrics-tls\") pod \"dns-default-dzpv8\" (UID: \"40424f06-d2df-4a03-bc2c-0af9d8b4e184\") " pod="openshift-dns/dns-default-dzpv8"
Apr 20 23:12:49.170879 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:49.170756 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 23:12:49.170879 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:49.170825 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40424f06-d2df-4a03-bc2c-0af9d8b4e184-metrics-tls podName:40424f06-d2df-4a03-bc2c-0af9d8b4e184 nodeName:}" failed. No retries permitted until 2026-04-20 23:13:05.170809008 +0000 UTC m=+65.115518438 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/40424f06-d2df-4a03-bc2c-0af9d8b4e184-metrics-tls") pod "dns-default-dzpv8" (UID: "40424f06-d2df-4a03-bc2c-0af9d8b4e184") : secret "dns-default-metrics-tls" not found
Apr 20 23:12:49.271262 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:49.271223 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/204aec3c-9787-4646-abd7-68cf7063e0c5-cert\") pod \"ingress-canary-2mlnp\" (UID: \"204aec3c-9787-4646-abd7-68cf7063e0c5\") " pod="openshift-ingress-canary/ingress-canary-2mlnp"
Apr 20 23:12:49.271450 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:49.271387 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 23:12:49.271508 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:12:49.271467 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/204aec3c-9787-4646-abd7-68cf7063e0c5-cert podName:204aec3c-9787-4646-abd7-68cf7063e0c5 nodeName:}" failed. No retries permitted until 2026-04-20 23:13:05.271446537 +0000 UTC m=+65.216155980 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/204aec3c-9787-4646-abd7-68cf7063e0c5-cert") pod "ingress-canary-2mlnp" (UID: "204aec3c-9787-4646-abd7-68cf7063e0c5") : secret "canary-serving-cert" not found
Apr 20 23:12:58.772015 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:12:58.771987 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6675l"
Apr 20 23:13:04.895161 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:04.895109 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-h242g\" (UID: \"b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h242g"
Apr 20 23:13:04.895553 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:04.895166 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-registry-tls\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd"
Apr 20 23:13:04.897576 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:04.897545 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-h242g\" (UID: \"b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h242g"
Apr 20 23:13:04.897687 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:04.897583 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-registry-tls\") pod \"image-registry-6c46bdb5c9-xflgd\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") " pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd"
Apr 20 23:13:05.074869 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:05.074839 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-xgs9r\""
Apr 20 23:13:05.082800 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:05.082781 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd"
Apr 20 23:13:05.126196 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:05.126167 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-8ntsk\""
Apr 20 23:13:05.134251 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:05.134219 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-h242g"
Apr 20 23:13:05.197800 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:05.197771 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40424f06-d2df-4a03-bc2c-0af9d8b4e184-metrics-tls\") pod \"dns-default-dzpv8\" (UID: \"40424f06-d2df-4a03-bc2c-0af9d8b4e184\") " pod="openshift-dns/dns-default-dzpv8"
Apr 20 23:13:05.200357 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:05.200331 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40424f06-d2df-4a03-bc2c-0af9d8b4e184-metrics-tls\") pod \"dns-default-dzpv8\" (UID: \"40424f06-d2df-4a03-bc2c-0af9d8b4e184\") " pod="openshift-dns/dns-default-dzpv8"
Apr 20 23:13:05.206264 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:05.206239 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6c46bdb5c9-xflgd"]
Apr 20 23:13:05.209436 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:13:05.209408 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode672b75e_249e_4dd5_928c_641987109f81.slice/crio-365a50740812d3dc6462a61245d29cde32a65e6a7965e7777034454d161c4e07 WatchSource:0}: Error finding container 365a50740812d3dc6462a61245d29cde32a65e6a7965e7777034454d161c4e07: Status 404 returned error can't find the container with id 365a50740812d3dc6462a61245d29cde32a65e6a7965e7777034454d161c4e07
Apr 20 23:13:05.252876 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:05.252849 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-h242g"]
Apr 20 23:13:05.257361 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:13:05.257322 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4b471b6_c5bc_4c1c_b7da_3a4ce9aaa726.slice/crio-dcfc3b0168ac02d155a3f90c9d9ca8d89206eaa40de1974aaaef9333d2a25e1f WatchSource:0}: Error finding container dcfc3b0168ac02d155a3f90c9d9ca8d89206eaa40de1974aaaef9333d2a25e1f: Status 404 returned error can't find the container with id dcfc3b0168ac02d155a3f90c9d9ca8d89206eaa40de1974aaaef9333d2a25e1f Apr 20 23:13:05.298354 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:05.298326 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/204aec3c-9787-4646-abd7-68cf7063e0c5-cert\") pod \"ingress-canary-2mlnp\" (UID: \"204aec3c-9787-4646-abd7-68cf7063e0c5\") " pod="openshift-ingress-canary/ingress-canary-2mlnp" Apr 20 23:13:05.300539 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:05.300518 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/204aec3c-9787-4646-abd7-68cf7063e0c5-cert\") pod \"ingress-canary-2mlnp\" (UID: \"204aec3c-9787-4646-abd7-68cf7063e0c5\") " pod="openshift-ingress-canary/ingress-canary-2mlnp" Apr 20 23:13:05.421542 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:05.421461 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2fmt6\"" Apr 20 23:13:05.429521 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:05.429498 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dzpv8" Apr 20 23:13:05.473862 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:05.473796 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4dh2j\"" Apr 20 23:13:05.481803 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:05.481775 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2mlnp" Apr 20 23:13:05.547465 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:05.547435 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dzpv8"] Apr 20 23:13:05.550340 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:13:05.550307 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40424f06_d2df_4a03_bc2c_0af9d8b4e184.slice/crio-c3a1c1d4f53738c646b55f10a5e7801c9fec6890913625ea6403a01c42768ed9 WatchSource:0}: Error finding container c3a1c1d4f53738c646b55f10a5e7801c9fec6890913625ea6403a01c42768ed9: Status 404 returned error can't find the container with id c3a1c1d4f53738c646b55f10a5e7801c9fec6890913625ea6403a01c42768ed9 Apr 20 23:13:05.599274 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:05.599240 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2mlnp"] Apr 20 23:13:05.603442 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:13:05.603418 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod204aec3c_9787_4646_abd7_68cf7063e0c5.slice/crio-ca259efaee215969f67ceb3fac88628fcf5e43da12e1fef851ce609ff63d5400 WatchSource:0}: Error finding container ca259efaee215969f67ceb3fac88628fcf5e43da12e1fef851ce609ff63d5400: Status 404 returned error can't find the container with id ca259efaee215969f67ceb3fac88628fcf5e43da12e1fef851ce609ff63d5400 Apr 20 23:13:05.890708 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:05.890670 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dzpv8" event={"ID":"40424f06-d2df-4a03-bc2c-0af9d8b4e184","Type":"ContainerStarted","Data":"c3a1c1d4f53738c646b55f10a5e7801c9fec6890913625ea6403a01c42768ed9"} Apr 20 23:13:05.891703 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:05.891671 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-h242g" event={"ID":"b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726","Type":"ContainerStarted","Data":"dcfc3b0168ac02d155a3f90c9d9ca8d89206eaa40de1974aaaef9333d2a25e1f"} Apr 20 23:13:05.892668 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:05.892638 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2mlnp" event={"ID":"204aec3c-9787-4646-abd7-68cf7063e0c5","Type":"ContainerStarted","Data":"ca259efaee215969f67ceb3fac88628fcf5e43da12e1fef851ce609ff63d5400"} Apr 20 23:13:05.895972 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:05.895946 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" event={"ID":"e672b75e-249e-4dd5-928c-641987109f81","Type":"ContainerStarted","Data":"ae8872eb233a41f6602c97117dc815506dea70445b5da8fabc28b459ab54c217"} Apr 20 23:13:05.896329 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:05.895976 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" event={"ID":"e672b75e-249e-4dd5-928c-641987109f81","Type":"ContainerStarted","Data":"365a50740812d3dc6462a61245d29cde32a65e6a7965e7777034454d161c4e07"} Apr 20 23:13:05.896329 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:05.896100 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" Apr 20 23:13:05.918526 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:05.916386 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" podStartSLOduration=64.916368812 podStartE2EDuration="1m4.916368812s" podCreationTimestamp="2026-04-20 23:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 
23:13:05.916089328 +0000 UTC m=+65.860798782" watchObservedRunningTime="2026-04-20 23:13:05.916368812 +0000 UTC m=+65.861078264" Apr 20 23:13:06.509257 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:06.509222 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6e1d353-f530-4ad5-a0ae-b436e227eb58-metrics-certs\") pod \"network-metrics-daemon-rvb5h\" (UID: \"e6e1d353-f530-4ad5-a0ae-b436e227eb58\") " pod="openshift-multus/network-metrics-daemon-rvb5h" Apr 20 23:13:06.512131 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:06.512071 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6e1d353-f530-4ad5-a0ae-b436e227eb58-metrics-certs\") pod \"network-metrics-daemon-rvb5h\" (UID: \"e6e1d353-f530-4ad5-a0ae-b436e227eb58\") " pod="openshift-multus/network-metrics-daemon-rvb5h" Apr 20 23:13:06.655454 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:06.655424 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-sk2w6\"" Apr 20 23:13:06.663736 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:06.663706 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rvb5h" Apr 20 23:13:06.900411 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:06.900365 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-h242g" event={"ID":"b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726","Type":"ContainerStarted","Data":"879edc760639b0bb14084492524d4be5a28a4886fa7533bb9e67795c31309c50"} Apr 20 23:13:06.915898 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:06.915852 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-h242g" podStartSLOduration=33.700037814 podStartE2EDuration="34.915835938s" podCreationTimestamp="2026-04-20 23:12:32 +0000 UTC" firstStartedPulling="2026-04-20 23:13:05.259315266 +0000 UTC m=+65.204024696" lastFinishedPulling="2026-04-20 23:13:06.475113386 +0000 UTC m=+66.419822820" observedRunningTime="2026-04-20 23:13:06.915167268 +0000 UTC m=+66.859876720" watchObservedRunningTime="2026-04-20 23:13:06.915835938 +0000 UTC m=+66.860545388" Apr 20 23:13:07.623130 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:07.623088 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rvb5h"] Apr 20 23:13:07.626961 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:13:07.626933 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6e1d353_f530_4ad5_a0ae_b436e227eb58.slice/crio-f03fd35fcda952f276707f2bb41e57d117c83e64d23fe3def87770f0e35dd061 WatchSource:0}: Error finding container f03fd35fcda952f276707f2bb41e57d117c83e64d23fe3def87770f0e35dd061: Status 404 returned error can't find the container with id f03fd35fcda952f276707f2bb41e57d117c83e64d23fe3def87770f0e35dd061 Apr 20 23:13:07.904459 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:07.904414 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-canary/ingress-canary-2mlnp" event={"ID":"204aec3c-9787-4646-abd7-68cf7063e0c5","Type":"ContainerStarted","Data":"8ed21a1013821317151f09dedbac79e0a44f724b443f5d846d54288415fc4caf"} Apr 20 23:13:07.905900 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:07.905871 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dzpv8" event={"ID":"40424f06-d2df-4a03-bc2c-0af9d8b4e184","Type":"ContainerStarted","Data":"2fd5230b70e39304f0134c09301d3f016c20735b0082a90827bc0a8c9a6f6f2a"} Apr 20 23:13:07.906082 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:07.905907 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dzpv8" event={"ID":"40424f06-d2df-4a03-bc2c-0af9d8b4e184","Type":"ContainerStarted","Data":"85493b6255bad82916cfcc27f026d7946a9d008dad79b984b05dad9c50ba3f4a"} Apr 20 23:13:07.906082 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:07.906006 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-dzpv8" Apr 20 23:13:07.908989 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:07.908960 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rvb5h" event={"ID":"e6e1d353-f530-4ad5-a0ae-b436e227eb58","Type":"ContainerStarted","Data":"f03fd35fcda952f276707f2bb41e57d117c83e64d23fe3def87770f0e35dd061"} Apr 20 23:13:07.920879 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:07.920835 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2mlnp" podStartSLOduration=33.026917728 podStartE2EDuration="34.920823073s" podCreationTimestamp="2026-04-20 23:12:33 +0000 UTC" firstStartedPulling="2026-04-20 23:13:05.606024312 +0000 UTC m=+65.550733745" lastFinishedPulling="2026-04-20 23:13:07.499929647 +0000 UTC m=+67.444639090" observedRunningTime="2026-04-20 23:13:07.920004181 +0000 UTC m=+67.864713631" watchObservedRunningTime="2026-04-20 
23:13:07.920823073 +0000 UTC m=+67.865532524" Apr 20 23:13:07.936711 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:07.936667 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-dzpv8" podStartSLOduration=32.993529619 podStartE2EDuration="34.936654562s" podCreationTimestamp="2026-04-20 23:12:33 +0000 UTC" firstStartedPulling="2026-04-20 23:13:05.552086178 +0000 UTC m=+65.496795608" lastFinishedPulling="2026-04-20 23:13:07.495211105 +0000 UTC m=+67.439920551" observedRunningTime="2026-04-20 23:13:07.936324386 +0000 UTC m=+67.881033839" watchObservedRunningTime="2026-04-20 23:13:07.936654562 +0000 UTC m=+67.881364014" Apr 20 23:13:08.841845 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.841807 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6zx46"] Apr 20 23:13:08.856201 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.856172 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5ddcfbbf7-vxn24"] Apr 20 23:13:08.856367 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.856241 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6zx46" Apr 20 23:13:08.858862 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.858835 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-cxdtb\"" Apr 20 23:13:08.859000 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.858966 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 20 23:13:08.859503 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.859481 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 20 23:13:08.859693 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.859646 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 20 23:13:08.872216 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.872189 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xwfnk"] Apr 20 23:13:08.872363 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.872341 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5ddcfbbf7-vxn24" Apr 20 23:13:08.874925 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.874903 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 20 23:13:08.875272 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.875250 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 20 23:13:08.875369 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.875356 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 20 23:13:08.875440 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.875359 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 20 23:13:08.890356 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.890331 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c6d98c679-rmhvv"] Apr 20 23:13:08.890485 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.890465 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xwfnk" Apr 20 23:13:08.892711 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.892695 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-58w7r\"" Apr 20 23:13:08.892811 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.892716 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 20 23:13:08.892811 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.892718 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 20 23:13:08.909751 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.909732 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq"] Apr 20 23:13:08.910049 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.909844 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c6d98c679-rmhvv" Apr 20 23:13:08.912278 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.912262 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-kjzvd\"" Apr 20 23:13:08.912371 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.912349 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 20 23:13:08.927341 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.927319 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hjcx\" (UniqueName: \"kubernetes.io/projected/d3f6f889-19ce-4920-b698-e386acd26110-kube-api-access-2hjcx\") pod \"klusterlet-addon-workmgr-5ddcfbbf7-vxn24\" (UID: \"d3f6f889-19ce-4920-b698-e386acd26110\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5ddcfbbf7-vxn24" Apr 20 23:13:08.927422 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.927358 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx7ls\" (UniqueName: \"kubernetes.io/projected/2d983d22-a24e-455c-a295-79be2cc1bbf6-kube-api-access-hx7ls\") pod \"cluster-samples-operator-6dc5bdb6b4-6zx46\" (UID: \"2d983d22-a24e-455c-a295-79be2cc1bbf6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6zx46" Apr 20 23:13:08.927422 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.927392 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d3f6f889-19ce-4920-b698-e386acd26110-klusterlet-config\") pod \"klusterlet-addon-workmgr-5ddcfbbf7-vxn24\" (UID: \"d3f6f889-19ce-4920-b698-e386acd26110\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5ddcfbbf7-vxn24" Apr 20 23:13:08.927498 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.927436 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2d983d22-a24e-455c-a295-79be2cc1bbf6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-6zx46\" (UID: \"2d983d22-a24e-455c-a295-79be2cc1bbf6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6zx46" Apr 20 23:13:08.927498 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.927480 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d3f6f889-19ce-4920-b698-e386acd26110-tmp\") pod \"klusterlet-addon-workmgr-5ddcfbbf7-vxn24\" (UID: \"d3f6f889-19ce-4920-b698-e386acd26110\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5ddcfbbf7-vxn24" Apr 20 23:13:08.931202 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.931182 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6zx46"] Apr 20 23:13:08.931315 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.931209 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5ddcfbbf7-vxn24"] Apr 20 23:13:08.931315 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.931221 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xwfnk"] Apr 20 23:13:08.931315 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.931232 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c6d98c679-rmhvv"] Apr 20 23:13:08.931315 ip-10-0-137-139 kubenswrapper[2576]: I0420 
23:13:08.931243 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq"] Apr 20 23:13:08.931481 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.931453 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq" Apr 20 23:13:08.933750 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.933734 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 20 23:13:08.933974 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.933960 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 20 23:13:08.934321 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.934309 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 20 23:13:08.935376 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.935363 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 20 23:13:08.989311 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:08.989278 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-d764h"] Apr 20 23:13:09.000377 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.000349 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-fm6cg"] Apr 20 23:13:09.000520 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.000506 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-d764h" Apr 20 23:13:09.004591 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.004571 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 20 23:13:09.004814 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.004799 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 20 23:13:09.004877 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.004862 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 23:13:09.004925 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.004804 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-bkvhb\"" Apr 20 23:13:09.005397 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.005383 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 23:13:09.011167 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.011124 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 20 23:13:09.012914 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.012899 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-8w8m8"] Apr 20 23:13:09.013041 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.013028 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fm6cg" Apr 20 23:13:09.022418 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.022396 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 23:13:09.022641 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.022444 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 23:13:09.022727 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.022637 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 20 23:13:09.022787 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.022452 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 20 23:13:09.023422 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.023088 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-799dc56db8-g6d5c"] Apr 20 23:13:09.023706 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.023664 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-mxhdd\"" Apr 20 23:13:09.023935 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.023914 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-8w8m8"
Apr 20 23:13:09.027222 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.027201 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 20 23:13:09.027332 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.027267 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 20 23:13:09.027393 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.027352 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 20 23:13:09.027393 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.027367 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-tqw9h\""
Apr 20 23:13:09.027512 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.027481 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 20 23:13:09.027857 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.027833 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/62634ee5-058f-4d46-b82d-dd9f66d64c4a-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6c6d98c679-rmhvv\" (UID: \"62634ee5-058f-4d46-b82d-dd9f66d64c4a\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c6d98c679-rmhvv"
Apr 20 23:13:09.027953 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.027908 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hx7ls\" (UniqueName: \"kubernetes.io/projected/2d983d22-a24e-455c-a295-79be2cc1bbf6-kube-api-access-hx7ls\") pod \"cluster-samples-operator-6dc5bdb6b4-6zx46\" (UID: \"2d983d22-a24e-455c-a295-79be2cc1bbf6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6zx46"
Apr 20 23:13:09.028193 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.028121 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d3f6f889-19ce-4920-b698-e386acd26110-klusterlet-config\") pod \"klusterlet-addon-workmgr-5ddcfbbf7-vxn24\" (UID: \"d3f6f889-19ce-4920-b698-e386acd26110\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5ddcfbbf7-vxn24"
Apr 20 23:13:09.028193 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.028186 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/89cf5f9d-c0a2-489e-996c-ec985537fa6d-ca\") pod \"cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq\" (UID: \"89cf5f9d-c0a2-489e-996c-ec985537fa6d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq"
Apr 20 23:13:09.028354 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.028213 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/89cf5f9d-c0a2-489e-996c-ec985537fa6d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq\" (UID: \"89cf5f9d-c0a2-489e-996c-ec985537fa6d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq"
Apr 20 23:13:09.028354 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.028250 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2d983d22-a24e-455c-a295-79be2cc1bbf6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-6zx46\" (UID: \"2d983d22-a24e-455c-a295-79be2cc1bbf6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6zx46"
Apr 20 23:13:09.028454 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.028428 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d3f6f889-19ce-4920-b698-e386acd26110-tmp\") pod \"klusterlet-addon-workmgr-5ddcfbbf7-vxn24\" (UID: \"d3f6f889-19ce-4920-b698-e386acd26110\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5ddcfbbf7-vxn24"
Apr 20 23:13:09.028617 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.028592 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/89cf5f9d-c0a2-489e-996c-ec985537fa6d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq\" (UID: \"89cf5f9d-c0a2-489e-996c-ec985537fa6d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq"
Apr 20 23:13:09.028703 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.028637 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/89cf5f9d-c0a2-489e-996c-ec985537fa6d-hub\") pod \"cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq\" (UID: \"89cf5f9d-c0a2-489e-996c-ec985537fa6d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq"
Apr 20 23:13:09.028703 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.028667 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/89cf5f9d-c0a2-489e-996c-ec985537fa6d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq\" (UID: \"89cf5f9d-c0a2-489e-996c-ec985537fa6d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq"
Apr 20 23:13:09.028703 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.028700 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwc55\" (UniqueName: \"kubernetes.io/projected/c293a8dc-a84c-49ef-a91c-fc1f64604bbe-kube-api-access-kwc55\") pod \"volume-data-source-validator-7c6cbb6c87-xwfnk\" (UID: \"c293a8dc-a84c-49ef-a91c-fc1f64604bbe\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xwfnk"
Apr 20 23:13:09.028854 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.028729 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzvbs\" (UniqueName: \"kubernetes.io/projected/62634ee5-058f-4d46-b82d-dd9f66d64c4a-kube-api-access-gzvbs\") pod \"managed-serviceaccount-addon-agent-6c6d98c679-rmhvv\" (UID: \"62634ee5-058f-4d46-b82d-dd9f66d64c4a\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c6d98c679-rmhvv"
Apr 20 23:13:09.028854 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.028753 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27wkn\" (UniqueName: \"kubernetes.io/projected/89cf5f9d-c0a2-489e-996c-ec985537fa6d-kube-api-access-27wkn\") pod \"cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq\" (UID: \"89cf5f9d-c0a2-489e-996c-ec985537fa6d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq"
Apr 20 23:13:09.029270 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.028847 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hjcx\" (UniqueName: \"kubernetes.io/projected/d3f6f889-19ce-4920-b698-e386acd26110-kube-api-access-2hjcx\") pod \"klusterlet-addon-workmgr-5ddcfbbf7-vxn24\" (UID: \"d3f6f889-19ce-4920-b698-e386acd26110\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5ddcfbbf7-vxn24"
Apr 20 23:13:09.029270 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.029115 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d3f6f889-19ce-4920-b698-e386acd26110-tmp\") pod \"klusterlet-addon-workmgr-5ddcfbbf7-vxn24\" (UID: \"d3f6f889-19ce-4920-b698-e386acd26110\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5ddcfbbf7-vxn24"
Apr 20 23:13:09.031113 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.031080 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2d983d22-a24e-455c-a295-79be2cc1bbf6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-6zx46\" (UID: \"2d983d22-a24e-455c-a295-79be2cc1bbf6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6zx46"
Apr 20 23:13:09.031616 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.031570 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d3f6f889-19ce-4920-b698-e386acd26110-klusterlet-config\") pod \"klusterlet-addon-workmgr-5ddcfbbf7-vxn24\" (UID: \"d3f6f889-19ce-4920-b698-e386acd26110\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5ddcfbbf7-vxn24"
Apr 20 23:13:09.037525 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.037503 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 20 23:13:09.037809 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.037792 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-d764h"]
Apr 20 23:13:09.037871 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.037817 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-8w8m8"]
Apr 20 23:13:09.037871 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.037829 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-fm6cg"]
Apr 20 23:13:09.037937 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.037916 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-799dc56db8-g6d5c"
Apr 20 23:13:09.040899 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.040861 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-tdk2x\""
Apr 20 23:13:09.040899 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.040877 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 20 23:13:09.042449 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.042430 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 20 23:13:09.042551 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.042461 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 20 23:13:09.042646 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.042462 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 20 23:13:09.042846 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.042825 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 20 23:13:09.043051 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.043035 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 20 23:13:09.044822 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.044804 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-799dc56db8-g6d5c"]
Apr 20 23:13:09.052287 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.051329 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx7ls\" (UniqueName: \"kubernetes.io/projected/2d983d22-a24e-455c-a295-79be2cc1bbf6-kube-api-access-hx7ls\") pod \"cluster-samples-operator-6dc5bdb6b4-6zx46\" (UID: \"2d983d22-a24e-455c-a295-79be2cc1bbf6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6zx46"
Apr 20 23:13:09.054825 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.054804 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hjcx\" (UniqueName: \"kubernetes.io/projected/d3f6f889-19ce-4920-b698-e386acd26110-kube-api-access-2hjcx\") pod \"klusterlet-addon-workmgr-5ddcfbbf7-vxn24\" (UID: \"d3f6f889-19ce-4920-b698-e386acd26110\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5ddcfbbf7-vxn24"
Apr 20 23:13:09.131283 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.131251 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/89cf5f9d-c0a2-489e-996c-ec985537fa6d-ca\") pod \"cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq\" (UID: \"89cf5f9d-c0a2-489e-996c-ec985537fa6d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq"
Apr 20 23:13:09.131392 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.131284 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/89cf5f9d-c0a2-489e-996c-ec985537fa6d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq\" (UID: \"89cf5f9d-c0a2-489e-996c-ec985537fa6d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq"
Apr 20 23:13:09.131392 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.131319 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czz8s\" (UniqueName: \"kubernetes.io/projected/ee9b8fd0-f305-4d93-a360-f2430b0b36fb-kube-api-access-czz8s\") pod \"console-operator-9d4b6777b-8w8m8\" (UID: \"ee9b8fd0-f305-4d93-a360-f2430b0b36fb\") " pod="openshift-console-operator/console-operator-9d4b6777b-8w8m8"
Apr 20 23:13:09.131392 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.131357 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cmln\" (UniqueName: \"kubernetes.io/projected/dc35f408-9d03-4b9f-b1fd-a285d5b9d26b-kube-api-access-6cmln\") pod \"cluster-monitoring-operator-75587bd455-fm6cg\" (UID: \"dc35f408-9d03-4b9f-b1fd-a285d5b9d26b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fm6cg"
Apr 20 23:13:09.131392 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.131385 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/0337f295-0371-4489-b8f0-7d5728373a37-snapshots\") pod \"insights-operator-585dfdc468-d764h\" (UID: \"0337f295-0371-4489-b8f0-7d5728373a37\") " pod="openshift-insights/insights-operator-585dfdc468-d764h"
Apr 20 23:13:09.131574 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.131410 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0337f295-0371-4489-b8f0-7d5728373a37-tmp\") pod \"insights-operator-585dfdc468-d764h\" (UID: \"0337f295-0371-4489-b8f0-7d5728373a37\") " pod="openshift-insights/insights-operator-585dfdc468-d764h"
Apr 20 23:13:09.131574 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.131540 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc35f408-9d03-4b9f-b1fd-a285d5b9d26b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fm6cg\" (UID: \"dc35f408-9d03-4b9f-b1fd-a285d5b9d26b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fm6cg"
Apr 20 23:13:09.131672 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.131600 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0337f295-0371-4489-b8f0-7d5728373a37-service-ca-bundle\") pod \"insights-operator-585dfdc468-d764h\" (UID: \"0337f295-0371-4489-b8f0-7d5728373a37\") " pod="openshift-insights/insights-operator-585dfdc468-d764h"
Apr 20 23:13:09.131672 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.131640 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/89cf5f9d-c0a2-489e-996c-ec985537fa6d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq\" (UID: \"89cf5f9d-c0a2-489e-996c-ec985537fa6d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq"
Apr 20 23:13:09.131672 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.131666 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0337f295-0371-4489-b8f0-7d5728373a37-serving-cert\") pod \"insights-operator-585dfdc468-d764h\" (UID: \"0337f295-0371-4489-b8f0-7d5728373a37\") " pod="openshift-insights/insights-operator-585dfdc468-d764h"
Apr 20 23:13:09.131816 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.131702 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/89cf5f9d-c0a2-489e-996c-ec985537fa6d-hub\") pod \"cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq\" (UID: \"89cf5f9d-c0a2-489e-996c-ec985537fa6d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq"
Apr 20 23:13:09.131969 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.131945 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/89cf5f9d-c0a2-489e-996c-ec985537fa6d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq\" (UID: \"89cf5f9d-c0a2-489e-996c-ec985537fa6d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq"
Apr 20 23:13:09.132271 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.132239 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kwc55\" (UniqueName: \"kubernetes.io/projected/c293a8dc-a84c-49ef-a91c-fc1f64604bbe-kube-api-access-kwc55\") pod \"volume-data-source-validator-7c6cbb6c87-xwfnk\" (UID: \"c293a8dc-a84c-49ef-a91c-fc1f64604bbe\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xwfnk"
Apr 20 23:13:09.132368 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.132295 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/dc35f408-9d03-4b9f-b1fd-a285d5b9d26b-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-fm6cg\" (UID: \"dc35f408-9d03-4b9f-b1fd-a285d5b9d26b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fm6cg"
Apr 20 23:13:09.132368 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.132333 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/923ad550-9c0c-4c01-bd51-00fb990a6a8d-stats-auth\") pod \"router-default-799dc56db8-g6d5c\" (UID: \"923ad550-9c0c-4c01-bd51-00fb990a6a8d\") " pod="openshift-ingress/router-default-799dc56db8-g6d5c"
Apr 20 23:13:09.132472 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.132368 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gzvbs\" (UniqueName: \"kubernetes.io/projected/62634ee5-058f-4d46-b82d-dd9f66d64c4a-kube-api-access-gzvbs\") pod \"managed-serviceaccount-addon-agent-6c6d98c679-rmhvv\" (UID: \"62634ee5-058f-4d46-b82d-dd9f66d64c4a\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c6d98c679-rmhvv"
Apr 20 23:13:09.132472 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.132396 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/923ad550-9c0c-4c01-bd51-00fb990a6a8d-default-certificate\") pod \"router-default-799dc56db8-g6d5c\" (UID: \"923ad550-9c0c-4c01-bd51-00fb990a6a8d\") " pod="openshift-ingress/router-default-799dc56db8-g6d5c"
Apr 20 23:13:09.132472 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.132420 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/923ad550-9c0c-4c01-bd51-00fb990a6a8d-metrics-certs\") pod \"router-default-799dc56db8-g6d5c\" (UID: \"923ad550-9c0c-4c01-bd51-00fb990a6a8d\") " pod="openshift-ingress/router-default-799dc56db8-g6d5c"
Apr 20 23:13:09.132472 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.132449 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee9b8fd0-f305-4d93-a360-f2430b0b36fb-serving-cert\") pod \"console-operator-9d4b6777b-8w8m8\" (UID: \"ee9b8fd0-f305-4d93-a360-f2430b0b36fb\") " pod="openshift-console-operator/console-operator-9d4b6777b-8w8m8"
Apr 20 23:13:09.132674 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.132484 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27wkn\" (UniqueName: \"kubernetes.io/projected/89cf5f9d-c0a2-489e-996c-ec985537fa6d-kube-api-access-27wkn\") pod \"cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq\" (UID: \"89cf5f9d-c0a2-489e-996c-ec985537fa6d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq"
Apr 20 23:13:09.132674 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.132510 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x728\" (UniqueName: \"kubernetes.io/projected/0337f295-0371-4489-b8f0-7d5728373a37-kube-api-access-5x728\") pod \"insights-operator-585dfdc468-d764h\" (UID: \"0337f295-0371-4489-b8f0-7d5728373a37\") " pod="openshift-insights/insights-operator-585dfdc468-d764h"
Apr 20 23:13:09.132674 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.132540 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee9b8fd0-f305-4d93-a360-f2430b0b36fb-trusted-ca\") pod \"console-operator-9d4b6777b-8w8m8\" (UID: \"ee9b8fd0-f305-4d93-a360-f2430b0b36fb\") " pod="openshift-console-operator/console-operator-9d4b6777b-8w8m8"
Apr 20 23:13:09.132674 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.132567 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0337f295-0371-4489-b8f0-7d5728373a37-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-d764h\" (UID: \"0337f295-0371-4489-b8f0-7d5728373a37\") " pod="openshift-insights/insights-operator-585dfdc468-d764h"
Apr 20 23:13:09.132868 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.132783 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee9b8fd0-f305-4d93-a360-f2430b0b36fb-config\") pod \"console-operator-9d4b6777b-8w8m8\" (UID: \"ee9b8fd0-f305-4d93-a360-f2430b0b36fb\") " pod="openshift-console-operator/console-operator-9d4b6777b-8w8m8"
Apr 20 23:13:09.132868 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.132823 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46tss\" (UniqueName: \"kubernetes.io/projected/923ad550-9c0c-4c01-bd51-00fb990a6a8d-kube-api-access-46tss\") pod \"router-default-799dc56db8-g6d5c\" (UID: \"923ad550-9c0c-4c01-bd51-00fb990a6a8d\") " pod="openshift-ingress/router-default-799dc56db8-g6d5c"
Apr 20 23:13:09.132967 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.132885 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/62634ee5-058f-4d46-b82d-dd9f66d64c4a-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6c6d98c679-rmhvv\" (UID: \"62634ee5-058f-4d46-b82d-dd9f66d64c4a\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c6d98c679-rmhvv"
Apr 20 23:13:09.132967 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.132948 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/923ad550-9c0c-4c01-bd51-00fb990a6a8d-service-ca-bundle\") pod \"router-default-799dc56db8-g6d5c\" (UID: \"923ad550-9c0c-4c01-bd51-00fb990a6a8d\") " pod="openshift-ingress/router-default-799dc56db8-g6d5c"
Apr 20 23:13:09.133633 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.133598 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/89cf5f9d-c0a2-489e-996c-ec985537fa6d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq\" (UID: \"89cf5f9d-c0a2-489e-996c-ec985537fa6d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq"
Apr 20 23:13:09.136223 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.134897 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/89cf5f9d-c0a2-489e-996c-ec985537fa6d-ca\") pod \"cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq\" (UID: \"89cf5f9d-c0a2-489e-996c-ec985537fa6d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq"
Apr 20 23:13:09.136223 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.134967 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/89cf5f9d-c0a2-489e-996c-ec985537fa6d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq\" (UID: \"89cf5f9d-c0a2-489e-996c-ec985537fa6d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq"
Apr 20 23:13:09.136223 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.135453 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/89cf5f9d-c0a2-489e-996c-ec985537fa6d-hub\") pod \"cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq\" (UID: \"89cf5f9d-c0a2-489e-996c-ec985537fa6d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq"
Apr 20 23:13:09.136223 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.135888 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/89cf5f9d-c0a2-489e-996c-ec985537fa6d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq\" (UID: \"89cf5f9d-c0a2-489e-996c-ec985537fa6d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq"
Apr 20 23:13:09.136223 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.136010 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/62634ee5-058f-4d46-b82d-dd9f66d64c4a-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6c6d98c679-rmhvv\" (UID: \"62634ee5-058f-4d46-b82d-dd9f66d64c4a\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c6d98c679-rmhvv"
Apr 20 23:13:09.157777 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.157427 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27wkn\" (UniqueName: \"kubernetes.io/projected/89cf5f9d-c0a2-489e-996c-ec985537fa6d-kube-api-access-27wkn\") pod \"cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq\" (UID: \"89cf5f9d-c0a2-489e-996c-ec985537fa6d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq"
Apr 20 23:13:09.157777 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.157728 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwc55\" (UniqueName: \"kubernetes.io/projected/c293a8dc-a84c-49ef-a91c-fc1f64604bbe-kube-api-access-kwc55\") pod \"volume-data-source-validator-7c6cbb6c87-xwfnk\" (UID: \"c293a8dc-a84c-49ef-a91c-fc1f64604bbe\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xwfnk"
Apr 20 23:13:09.158019 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.157995 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzvbs\" (UniqueName: \"kubernetes.io/projected/62634ee5-058f-4d46-b82d-dd9f66d64c4a-kube-api-access-gzvbs\") pod \"managed-serviceaccount-addon-agent-6c6d98c679-rmhvv\" (UID: \"62634ee5-058f-4d46-b82d-dd9f66d64c4a\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c6d98c679-rmhvv"
Apr 20 23:13:09.169700 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.169672 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6zx46"
Apr 20 23:13:09.182976 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.182805 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5ddcfbbf7-vxn24"
Apr 20 23:13:09.199552 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.199516 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xwfnk"
Apr 20 23:13:09.238407 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.236948 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-czz8s\" (UniqueName: \"kubernetes.io/projected/ee9b8fd0-f305-4d93-a360-f2430b0b36fb-kube-api-access-czz8s\") pod \"console-operator-9d4b6777b-8w8m8\" (UID: \"ee9b8fd0-f305-4d93-a360-f2430b0b36fb\") " pod="openshift-console-operator/console-operator-9d4b6777b-8w8m8"
Apr 20 23:13:09.238407 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.236992 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6cmln\" (UniqueName: \"kubernetes.io/projected/dc35f408-9d03-4b9f-b1fd-a285d5b9d26b-kube-api-access-6cmln\") pod \"cluster-monitoring-operator-75587bd455-fm6cg\" (UID: \"dc35f408-9d03-4b9f-b1fd-a285d5b9d26b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fm6cg"
Apr 20 23:13:09.238407 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.237024 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/0337f295-0371-4489-b8f0-7d5728373a37-snapshots\") pod \"insights-operator-585dfdc468-d764h\" (UID: \"0337f295-0371-4489-b8f0-7d5728373a37\") " pod="openshift-insights/insights-operator-585dfdc468-d764h"
Apr 20 23:13:09.238407 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.237049 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0337f295-0371-4489-b8f0-7d5728373a37-tmp\") pod \"insights-operator-585dfdc468-d764h\" (UID: \"0337f295-0371-4489-b8f0-7d5728373a37\") " pod="openshift-insights/insights-operator-585dfdc468-d764h"
Apr 20 23:13:09.238407 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.237075 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc35f408-9d03-4b9f-b1fd-a285d5b9d26b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fm6cg\" (UID: \"dc35f408-9d03-4b9f-b1fd-a285d5b9d26b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fm6cg"
Apr 20 23:13:09.238407 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.237106 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0337f295-0371-4489-b8f0-7d5728373a37-service-ca-bundle\") pod \"insights-operator-585dfdc468-d764h\" (UID: \"0337f295-0371-4489-b8f0-7d5728373a37\") " pod="openshift-insights/insights-operator-585dfdc468-d764h"
Apr 20 23:13:09.238407 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.237155 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0337f295-0371-4489-b8f0-7d5728373a37-serving-cert\") pod \"insights-operator-585dfdc468-d764h\" (UID: \"0337f295-0371-4489-b8f0-7d5728373a37\") " pod="openshift-insights/insights-operator-585dfdc468-d764h"
Apr 20 23:13:09.238407 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.237195 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/dc35f408-9d03-4b9f-b1fd-a285d5b9d26b-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-fm6cg\" (UID: \"dc35f408-9d03-4b9f-b1fd-a285d5b9d26b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fm6cg"
Apr 20 23:13:09.238407 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.237220 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/923ad550-9c0c-4c01-bd51-00fb990a6a8d-stats-auth\") pod \"router-default-799dc56db8-g6d5c\" (UID: \"923ad550-9c0c-4c01-bd51-00fb990a6a8d\") " pod="openshift-ingress/router-default-799dc56db8-g6d5c"
Apr 20 23:13:09.238407 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.237247 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/923ad550-9c0c-4c01-bd51-00fb990a6a8d-default-certificate\") pod \"router-default-799dc56db8-g6d5c\" (UID: \"923ad550-9c0c-4c01-bd51-00fb990a6a8d\") " pod="openshift-ingress/router-default-799dc56db8-g6d5c"
Apr 20 23:13:09.238407 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.237272 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/923ad550-9c0c-4c01-bd51-00fb990a6a8d-metrics-certs\") pod \"router-default-799dc56db8-g6d5c\" (UID: \"923ad550-9c0c-4c01-bd51-00fb990a6a8d\") " pod="openshift-ingress/router-default-799dc56db8-g6d5c"
Apr 20 23:13:09.238407 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.237299 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee9b8fd0-f305-4d93-a360-f2430b0b36fb-serving-cert\") pod \"console-operator-9d4b6777b-8w8m8\" (UID: \"ee9b8fd0-f305-4d93-a360-f2430b0b36fb\") " pod="openshift-console-operator/console-operator-9d4b6777b-8w8m8"
Apr 20 23:13:09.238407 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.237329 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5x728\" (UniqueName: \"kubernetes.io/projected/0337f295-0371-4489-b8f0-7d5728373a37-kube-api-access-5x728\") pod \"insights-operator-585dfdc468-d764h\" (UID: \"0337f295-0371-4489-b8f0-7d5728373a37\") " pod="openshift-insights/insights-operator-585dfdc468-d764h"
Apr 20 23:13:09.238407 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.237358 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee9b8fd0-f305-4d93-a360-f2430b0b36fb-trusted-ca\") pod \"console-operator-9d4b6777b-8w8m8\" (UID: \"ee9b8fd0-f305-4d93-a360-f2430b0b36fb\") " pod="openshift-console-operator/console-operator-9d4b6777b-8w8m8"
Apr 20 23:13:09.238407 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.237384 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0337f295-0371-4489-b8f0-7d5728373a37-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-d764h\" (UID: \"0337f295-0371-4489-b8f0-7d5728373a37\") " pod="openshift-insights/insights-operator-585dfdc468-d764h"
Apr 20 23:13:09.238407 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.237475 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee9b8fd0-f305-4d93-a360-f2430b0b36fb-config\") pod \"console-operator-9d4b6777b-8w8m8\" (UID: \"ee9b8fd0-f305-4d93-a360-f2430b0b36fb\") " pod="openshift-console-operator/console-operator-9d4b6777b-8w8m8"
Apr 20 23:13:09.239219 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.237503 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46tss\" (UniqueName: \"kubernetes.io/projected/923ad550-9c0c-4c01-bd51-00fb990a6a8d-kube-api-access-46tss\") pod \"router-default-799dc56db8-g6d5c\" (UID: \"923ad550-9c0c-4c01-bd51-00fb990a6a8d\") " pod="openshift-ingress/router-default-799dc56db8-g6d5c"
Apr 20 23:13:09.239219 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.237551 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/923ad550-9c0c-4c01-bd51-00fb990a6a8d-service-ca-bundle\") pod \"router-default-799dc56db8-g6d5c\" (UID: \"923ad550-9c0c-4c01-bd51-00fb990a6a8d\") " pod="openshift-ingress/router-default-799dc56db8-g6d5c"
Apr 20 23:13:09.239219 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.238377 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/923ad550-9c0c-4c01-bd51-00fb990a6a8d-service-ca-bundle\") pod \"router-default-799dc56db8-g6d5c\" (UID: \"923ad550-9c0c-4c01-bd51-00fb990a6a8d\") " pod="openshift-ingress/router-default-799dc56db8-g6d5c"
Apr 20 23:13:09.241381 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.239447 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/0337f295-0371-4489-b8f0-7d5728373a37-snapshots\") pod \"insights-operator-585dfdc468-d764h\" (UID: \"0337f295-0371-4489-b8f0-7d5728373a37\") " pod="openshift-insights/insights-operator-585dfdc468-d764h"
Apr 20 23:13:09.241381 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.239703 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0337f295-0371-4489-b8f0-7d5728373a37-tmp\") pod \"insights-operator-585dfdc468-d764h\" (UID: \"0337f295-0371-4489-b8f0-7d5728373a37\") " pod="openshift-insights/insights-operator-585dfdc468-d764h"
Apr 20 23:13:09.246099 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.245549 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc35f408-9d03-4b9f-b1fd-a285d5b9d26b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fm6cg\" (UID: \"dc35f408-9d03-4b9f-b1fd-a285d5b9d26b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fm6cg"
Apr 20 23:13:09.268662 ip-10-0-137-139 kubenswrapper[2576]: I0420
23:13:09.259889 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/923ad550-9c0c-4c01-bd51-00fb990a6a8d-metrics-certs\") pod \"router-default-799dc56db8-g6d5c\" (UID: \"923ad550-9c0c-4c01-bd51-00fb990a6a8d\") " pod="openshift-ingress/router-default-799dc56db8-g6d5c" Apr 20 23:13:09.268662 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.260478 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq" Apr 20 23:13:09.268662 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.262700 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/dc35f408-9d03-4b9f-b1fd-a285d5b9d26b-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-fm6cg\" (UID: \"dc35f408-9d03-4b9f-b1fd-a285d5b9d26b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fm6cg" Apr 20 23:13:09.268662 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.263935 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c6d98c679-rmhvv" Apr 20 23:13:09.268662 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.264570 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0337f295-0371-4489-b8f0-7d5728373a37-service-ca-bundle\") pod \"insights-operator-585dfdc468-d764h\" (UID: \"0337f295-0371-4489-b8f0-7d5728373a37\") " pod="openshift-insights/insights-operator-585dfdc468-d764h" Apr 20 23:13:09.268662 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.265718 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee9b8fd0-f305-4d93-a360-f2430b0b36fb-trusted-ca\") pod \"console-operator-9d4b6777b-8w8m8\" (UID: \"ee9b8fd0-f305-4d93-a360-f2430b0b36fb\") " pod="openshift-console-operator/console-operator-9d4b6777b-8w8m8" Apr 20 23:13:09.268662 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.266687 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee9b8fd0-f305-4d93-a360-f2430b0b36fb-config\") pod \"console-operator-9d4b6777b-8w8m8\" (UID: \"ee9b8fd0-f305-4d93-a360-f2430b0b36fb\") " pod="openshift-console-operator/console-operator-9d4b6777b-8w8m8" Apr 20 23:13:09.268662 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.267467 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0337f295-0371-4489-b8f0-7d5728373a37-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-d764h\" (UID: \"0337f295-0371-4489-b8f0-7d5728373a37\") " pod="openshift-insights/insights-operator-585dfdc468-d764h" Apr 20 23:13:09.279884 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.271539 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ee9b8fd0-f305-4d93-a360-f2430b0b36fb-serving-cert\") pod \"console-operator-9d4b6777b-8w8m8\" (UID: \"ee9b8fd0-f305-4d93-a360-f2430b0b36fb\") " pod="openshift-console-operator/console-operator-9d4b6777b-8w8m8" Apr 20 23:13:09.279884 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.277886 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0337f295-0371-4489-b8f0-7d5728373a37-serving-cert\") pod \"insights-operator-585dfdc468-d764h\" (UID: \"0337f295-0371-4489-b8f0-7d5728373a37\") " pod="openshift-insights/insights-operator-585dfdc468-d764h" Apr 20 23:13:09.279884 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.278971 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-czz8s\" (UniqueName: \"kubernetes.io/projected/ee9b8fd0-f305-4d93-a360-f2430b0b36fb-kube-api-access-czz8s\") pod \"console-operator-9d4b6777b-8w8m8\" (UID: \"ee9b8fd0-f305-4d93-a360-f2430b0b36fb\") " pod="openshift-console-operator/console-operator-9d4b6777b-8w8m8" Apr 20 23:13:09.279884 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.279030 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/923ad550-9c0c-4c01-bd51-00fb990a6a8d-stats-auth\") pod \"router-default-799dc56db8-g6d5c\" (UID: \"923ad550-9c0c-4c01-bd51-00fb990a6a8d\") " pod="openshift-ingress/router-default-799dc56db8-g6d5c" Apr 20 23:13:09.280239 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.279961 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cmln\" (UniqueName: \"kubernetes.io/projected/dc35f408-9d03-4b9f-b1fd-a285d5b9d26b-kube-api-access-6cmln\") pod \"cluster-monitoring-operator-75587bd455-fm6cg\" (UID: \"dc35f408-9d03-4b9f-b1fd-a285d5b9d26b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fm6cg" Apr 20 23:13:09.280239 ip-10-0-137-139 
kubenswrapper[2576]: I0420 23:13:09.280125 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/923ad550-9c0c-4c01-bd51-00fb990a6a8d-default-certificate\") pod \"router-default-799dc56db8-g6d5c\" (UID: \"923ad550-9c0c-4c01-bd51-00fb990a6a8d\") " pod="openshift-ingress/router-default-799dc56db8-g6d5c" Apr 20 23:13:09.282110 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.282068 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x728\" (UniqueName: \"kubernetes.io/projected/0337f295-0371-4489-b8f0-7d5728373a37-kube-api-access-5x728\") pod \"insights-operator-585dfdc468-d764h\" (UID: \"0337f295-0371-4489-b8f0-7d5728373a37\") " pod="openshift-insights/insights-operator-585dfdc468-d764h" Apr 20 23:13:09.286172 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.284901 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46tss\" (UniqueName: \"kubernetes.io/projected/923ad550-9c0c-4c01-bd51-00fb990a6a8d-kube-api-access-46tss\") pod \"router-default-799dc56db8-g6d5c\" (UID: \"923ad550-9c0c-4c01-bd51-00fb990a6a8d\") " pod="openshift-ingress/router-default-799dc56db8-g6d5c" Apr 20 23:13:09.328969 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.328935 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-d764h" Apr 20 23:13:09.346291 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.345606 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fm6cg" Apr 20 23:13:09.353810 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.352642 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-8w8m8" Apr 20 23:13:09.379925 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.378402 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-799dc56db8-g6d5c" Apr 20 23:13:09.379925 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.378891 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6zx46"] Apr 20 23:13:09.400273 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.400243 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5ddcfbbf7-vxn24"] Apr 20 23:13:09.409396 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:13:09.408904 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3f6f889_19ce_4920_b698_e386acd26110.slice/crio-05d282aaf1e76aedc213dc479f5f560b07b6db929ea8028488997011b2c12231 WatchSource:0}: Error finding container 05d282aaf1e76aedc213dc479f5f560b07b6db929ea8028488997011b2c12231: Status 404 returned error can't find the container with id 05d282aaf1e76aedc213dc479f5f560b07b6db929ea8028488997011b2c12231 Apr 20 23:13:09.528665 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.522119 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq"] Apr 20 23:13:09.528665 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.523954 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c6d98c679-rmhvv"] Apr 20 23:13:09.538315 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:13:09.537327 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89cf5f9d_c0a2_489e_996c_ec985537fa6d.slice/crio-a286cbdd781fa7d67ea6d49ab3aa5bfe5da38ea6a17bc13fa4949158847a3c73 WatchSource:0}: Error finding container a286cbdd781fa7d67ea6d49ab3aa5bfe5da38ea6a17bc13fa4949158847a3c73: Status 404 returned error can't find the container with id a286cbdd781fa7d67ea6d49ab3aa5bfe5da38ea6a17bc13fa4949158847a3c73 Apr 20 23:13:09.590974 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.590951 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-8w8m8"] Apr 20 23:13:09.595083 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:13:09.595033 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee9b8fd0_f305_4d93_a360_f2430b0b36fb.slice/crio-d98529cf159b18f5060b26c723be06978927188abdf386b7500e8aada7d1453b WatchSource:0}: Error finding container d98529cf159b18f5060b26c723be06978927188abdf386b7500e8aada7d1453b: Status 404 returned error can't find the container with id d98529cf159b18f5060b26c723be06978927188abdf386b7500e8aada7d1453b Apr 20 23:13:09.613094 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.612975 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-d764h"] Apr 20 23:13:09.615597 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:13:09.615570 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0337f295_0371_4489_b8f0_7d5728373a37.slice/crio-5e5b7ab305bcf1da79e6429c798719720b66d8eb86a599bd500551a2cab55181 WatchSource:0}: Error finding container 5e5b7ab305bcf1da79e6429c798719720b66d8eb86a599bd500551a2cab55181: Status 404 returned error can't find the container with id 5e5b7ab305bcf1da79e6429c798719720b66d8eb86a599bd500551a2cab55181 Apr 20 23:13:09.643985 ip-10-0-137-139 kubenswrapper[2576]: I0420 
23:13:09.643960 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xwfnk"] Apr 20 23:13:09.646429 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:13:09.646404 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc293a8dc_a84c_49ef_a91c_fc1f64604bbe.slice/crio-272dbff249d6a51e179dc6fb92018221fab05a4073a922d440040be7918800a7 WatchSource:0}: Error finding container 272dbff249d6a51e179dc6fb92018221fab05a4073a922d440040be7918800a7: Status 404 returned error can't find the container with id 272dbff249d6a51e179dc6fb92018221fab05a4073a922d440040be7918800a7 Apr 20 23:13:09.841227 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.841196 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-fm6cg"] Apr 20 23:13:09.844295 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:13:09.844266 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc35f408_9d03_4b9f_b1fd_a285d5b9d26b.slice/crio-e05ef22ee230230b0bfd15352ee3e9f6502df2607de5e86437c10045e6a1eac9 WatchSource:0}: Error finding container e05ef22ee230230b0bfd15352ee3e9f6502df2607de5e86437c10045e6a1eac9: Status 404 returned error can't find the container with id e05ef22ee230230b0bfd15352ee3e9f6502df2607de5e86437c10045e6a1eac9 Apr 20 23:13:09.845103 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.845078 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-799dc56db8-g6d5c"] Apr 20 23:13:09.850222 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:13:09.850200 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod923ad550_9c0c_4c01_bd51_00fb990a6a8d.slice/crio-25fa36fba8275f395aaa4e8beaaa1ce63004c87a75d2935b42357084f921e364 WatchSource:0}: Error finding container 25fa36fba8275f395aaa4e8beaaa1ce63004c87a75d2935b42357084f921e364: Status 404 returned error can't find the container with id 25fa36fba8275f395aaa4e8beaaa1ce63004c87a75d2935b42357084f921e364 Apr 20 23:13:09.919070 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.919037 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rvb5h" event={"ID":"e6e1d353-f530-4ad5-a0ae-b436e227eb58","Type":"ContainerStarted","Data":"7590c98c20325925c564d1e4c3b9b112d547ab0185b63625e18584dade063b8b"} Apr 20 23:13:09.919453 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.919080 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rvb5h" event={"ID":"e6e1d353-f530-4ad5-a0ae-b436e227eb58","Type":"ContainerStarted","Data":"9490971cb557c608115c08a163b99027ecf52bbce7760fac26275871ff6c3fe2"} Apr 20 23:13:09.920152 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.920109 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8w8m8" event={"ID":"ee9b8fd0-f305-4d93-a360-f2430b0b36fb","Type":"ContainerStarted","Data":"d98529cf159b18f5060b26c723be06978927188abdf386b7500e8aada7d1453b"} Apr 20 23:13:09.921178 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.921133 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fm6cg" event={"ID":"dc35f408-9d03-4b9f-b1fd-a285d5b9d26b","Type":"ContainerStarted","Data":"e05ef22ee230230b0bfd15352ee3e9f6502df2607de5e86437c10045e6a1eac9"} Apr 20 23:13:09.922081 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.922057 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-operator-585dfdc468-d764h" event={"ID":"0337f295-0371-4489-b8f0-7d5728373a37","Type":"ContainerStarted","Data":"5e5b7ab305bcf1da79e6429c798719720b66d8eb86a599bd500551a2cab55181"} Apr 20 23:13:09.922969 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.922948 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5ddcfbbf7-vxn24" event={"ID":"d3f6f889-19ce-4920-b698-e386acd26110","Type":"ContainerStarted","Data":"05d282aaf1e76aedc213dc479f5f560b07b6db929ea8028488997011b2c12231"} Apr 20 23:13:09.923850 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.923829 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6zx46" event={"ID":"2d983d22-a24e-455c-a295-79be2cc1bbf6","Type":"ContainerStarted","Data":"87d981157e2ec3077a6756e9388ab13ba96d111c988ee1ef4556693e6ce38f7c"} Apr 20 23:13:09.924745 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.924723 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xwfnk" event={"ID":"c293a8dc-a84c-49ef-a91c-fc1f64604bbe","Type":"ContainerStarted","Data":"272dbff249d6a51e179dc6fb92018221fab05a4073a922d440040be7918800a7"} Apr 20 23:13:09.925573 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.925552 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c6d98c679-rmhvv" event={"ID":"62634ee5-058f-4d46-b82d-dd9f66d64c4a","Type":"ContainerStarted","Data":"2a923cca9819e4c03b1af5851cde42dec1f4fe89a9a2468cbd0cf697e032c5c2"} Apr 20 23:13:09.926399 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.926383 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-799dc56db8-g6d5c" 
event={"ID":"923ad550-9c0c-4c01-bd51-00fb990a6a8d","Type":"ContainerStarted","Data":"25fa36fba8275f395aaa4e8beaaa1ce63004c87a75d2935b42357084f921e364"} Apr 20 23:13:09.927187 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.927169 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq" event={"ID":"89cf5f9d-c0a2-489e-996c-ec985537fa6d","Type":"ContainerStarted","Data":"a286cbdd781fa7d67ea6d49ab3aa5bfe5da38ea6a17bc13fa4949158847a3c73"} Apr 20 23:13:09.935634 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:09.935586 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-rvb5h" podStartSLOduration=67.479467525 podStartE2EDuration="1m8.93557115s" podCreationTimestamp="2026-04-20 23:12:01 +0000 UTC" firstStartedPulling="2026-04-20 23:13:07.629083802 +0000 UTC m=+67.573793244" lastFinishedPulling="2026-04-20 23:13:09.085187434 +0000 UTC m=+69.029896869" observedRunningTime="2026-04-20 23:13:09.93468328 +0000 UTC m=+69.879392732" watchObservedRunningTime="2026-04-20 23:13:09.93557115 +0000 UTC m=+69.880280603" Apr 20 23:13:10.959119 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:10.959078 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-799dc56db8-g6d5c" event={"ID":"923ad550-9c0c-4c01-bd51-00fb990a6a8d","Type":"ContainerStarted","Data":"51a4036a8d804c44bc1ab92d6a0019ade2bccb585ce0c860c22a7968c54391a3"} Apr 20 23:13:11.379555 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:11.379496 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-799dc56db8-g6d5c" Apr 20 23:13:11.383208 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:11.383151 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-799dc56db8-g6d5c" Apr 20 23:13:11.403458 ip-10-0-137-139 kubenswrapper[2576]: 
I0420 23:13:11.402728 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-799dc56db8-g6d5c" podStartSLOduration=3.402707412 podStartE2EDuration="3.402707412s" podCreationTimestamp="2026-04-20 23:13:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 23:13:10.98911907 +0000 UTC m=+70.933828522" watchObservedRunningTime="2026-04-20 23:13:11.402707412 +0000 UTC m=+71.347416866" Apr 20 23:13:11.967804 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:11.967773 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-799dc56db8-g6d5c" Apr 20 23:13:11.969330 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:11.969130 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-799dc56db8-g6d5c" Apr 20 23:13:13.832765 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:13.832720 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-plw6c" Apr 20 23:13:17.934283 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:17.934248 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-dzpv8" Apr 20 23:13:19.994188 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:19.993922 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8w8m8" event={"ID":"ee9b8fd0-f305-4d93-a360-f2430b0b36fb","Type":"ContainerStarted","Data":"95120bb7363fa8506fa96ccd1d24ea93ef8d270dd1dae504df815b21ff963d12"} Apr 20 23:13:19.994999 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:19.994968 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-8w8m8" Apr 20 23:13:19.996702 ip-10-0-137-139 
kubenswrapper[2576]: I0420 23:13:19.996667 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fm6cg" event={"ID":"dc35f408-9d03-4b9f-b1fd-a285d5b9d26b","Type":"ContainerStarted","Data":"31577553925acdff7041d4ecda645c30b3dad9dce8e208a25b70232757ab1862"} Apr 20 23:13:19.998325 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:19.998286 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-d764h" event={"ID":"0337f295-0371-4489-b8f0-7d5728373a37","Type":"ContainerStarted","Data":"62c7cbff9537b524d53a1b4de5cac15dfb69112c18ab9622257615ed7b6fd4c6"} Apr 20 23:13:20.001506 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:20.001458 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-8w8m8" Apr 20 23:13:20.002366 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:20.001975 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5ddcfbbf7-vxn24" event={"ID":"d3f6f889-19ce-4920-b698-e386acd26110","Type":"ContainerStarted","Data":"8363473feb9818ab07594338174a054345a97d0cf1e35937569bd2978cd3de0d"} Apr 20 23:13:20.002522 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:20.002500 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5ddcfbbf7-vxn24" Apr 20 23:13:20.004305 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:20.004282 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5ddcfbbf7-vxn24" Apr 20 23:13:20.005181 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:20.005108 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6zx46" 
event={"ID":"2d983d22-a24e-455c-a295-79be2cc1bbf6","Type":"ContainerStarted","Data":"2f529d36e03b39b098c5f0d7b81acbc2d70dce190eb50442ca40d8202d4139a2"} Apr 20 23:13:20.005181 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:20.005158 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6zx46" event={"ID":"2d983d22-a24e-455c-a295-79be2cc1bbf6","Type":"ContainerStarted","Data":"9f8ac5fabb79d54926d3fba12400eb592ef15c3c6c62a18ff007d9748dc3e044"} Apr 20 23:13:20.006711 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:20.006682 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xwfnk" event={"ID":"c293a8dc-a84c-49ef-a91c-fc1f64604bbe","Type":"ContainerStarted","Data":"3476e19b50a70208d719ee1d9eb79ea7eaad7677e0919cc34a329610a6a417b8"} Apr 20 23:13:20.008379 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:20.008063 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c6d98c679-rmhvv" event={"ID":"62634ee5-058f-4d46-b82d-dd9f66d64c4a","Type":"ContainerStarted","Data":"a3913159dae17cc56aeab95c386138ee547cbde4d25bec7cb55d27c6812c143e"} Apr 20 23:13:20.010706 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:20.010680 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq" event={"ID":"89cf5f9d-c0a2-489e-996c-ec985537fa6d","Type":"ContainerStarted","Data":"c193ff77ad51462bb508f3d420c2cb120f0085063a5708bb1da0d146ba6a9b4c"} Apr 20 23:13:20.013834 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:20.013788 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-8w8m8" podStartSLOduration=2.628187556 podStartE2EDuration="12.013772398s" podCreationTimestamp="2026-04-20 23:13:08 +0000 UTC" 
firstStartedPulling="2026-04-20 23:13:09.598786817 +0000 UTC m=+69.543496261" lastFinishedPulling="2026-04-20 23:13:18.984371667 +0000 UTC m=+78.929081103" observedRunningTime="2026-04-20 23:13:20.012273518 +0000 UTC m=+79.956982982" watchObservedRunningTime="2026-04-20 23:13:20.013772398 +0000 UTC m=+79.958481851" Apr 20 23:13:20.036112 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:20.032892 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fm6cg" podStartSLOduration=2.8926971999999997 podStartE2EDuration="12.032874283s" podCreationTimestamp="2026-04-20 23:13:08 +0000 UTC" firstStartedPulling="2026-04-20 23:13:09.845989542 +0000 UTC m=+69.790698972" lastFinishedPulling="2026-04-20 23:13:18.986166612 +0000 UTC m=+78.930876055" observedRunningTime="2026-04-20 23:13:20.031077354 +0000 UTC m=+79.975786805" watchObservedRunningTime="2026-04-20 23:13:20.032874283 +0000 UTC m=+79.977583736" Apr 20 23:13:20.054955 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:20.053527 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-d764h" podStartSLOduration=2.686849288 podStartE2EDuration="12.053509812s" podCreationTimestamp="2026-04-20 23:13:08 +0000 UTC" firstStartedPulling="2026-04-20 23:13:09.617753781 +0000 UTC m=+69.562463227" lastFinishedPulling="2026-04-20 23:13:18.984414314 +0000 UTC m=+78.929123751" observedRunningTime="2026-04-20 23:13:20.052177197 +0000 UTC m=+79.996886649" watchObservedRunningTime="2026-04-20 23:13:20.053509812 +0000 UTC m=+79.998219277" Apr 20 23:13:20.099543 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:20.099019 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5ddcfbbf7-vxn24" podStartSLOduration=2.523209211 podStartE2EDuration="12.098971299s" podCreationTimestamp="2026-04-20 23:13:08 +0000 
UTC" firstStartedPulling="2026-04-20 23:13:09.420693917 +0000 UTC m=+69.365403351" lastFinishedPulling="2026-04-20 23:13:18.996455994 +0000 UTC m=+78.941165439" observedRunningTime="2026-04-20 23:13:20.097555986 +0000 UTC m=+80.042265449" watchObservedRunningTime="2026-04-20 23:13:20.098971299 +0000 UTC m=+80.043680754" Apr 20 23:13:20.116603 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:20.116471 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6zx46" podStartSLOduration=2.637472169 podStartE2EDuration="12.116449915s" podCreationTimestamp="2026-04-20 23:13:08 +0000 UTC" firstStartedPulling="2026-04-20 23:13:09.491024525 +0000 UTC m=+69.435733955" lastFinishedPulling="2026-04-20 23:13:18.970002271 +0000 UTC m=+78.914711701" observedRunningTime="2026-04-20 23:13:20.113736396 +0000 UTC m=+80.058445848" watchObservedRunningTime="2026-04-20 23:13:20.116449915 +0000 UTC m=+80.061159367" Apr 20 23:13:20.131288 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:20.131238 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c6d98c679-rmhvv" podStartSLOduration=2.688933153 podStartE2EDuration="12.131225204s" podCreationTimestamp="2026-04-20 23:13:08 +0000 UTC" firstStartedPulling="2026-04-20 23:13:09.5490268 +0000 UTC m=+69.493736233" lastFinishedPulling="2026-04-20 23:13:18.99131884 +0000 UTC m=+78.936028284" observedRunningTime="2026-04-20 23:13:20.128921659 +0000 UTC m=+80.073631110" watchObservedRunningTime="2026-04-20 23:13:20.131225204 +0000 UTC m=+80.075934656" Apr 20 23:13:20.146737 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:20.146678 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xwfnk" podStartSLOduration=2.811155078 podStartE2EDuration="12.146660005s" 
podCreationTimestamp="2026-04-20 23:13:08 +0000 UTC" firstStartedPulling="2026-04-20 23:13:09.648910575 +0000 UTC m=+69.593620004" lastFinishedPulling="2026-04-20 23:13:18.9844155 +0000 UTC m=+78.929124931" observedRunningTime="2026-04-20 23:13:20.145745479 +0000 UTC m=+80.090454922" watchObservedRunningTime="2026-04-20 23:13:20.146660005 +0000 UTC m=+80.091369459" Apr 20 23:13:21.099162 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:21.099118 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dzpv8_40424f06-d2df-4a03-bc2c-0af9d8b4e184/dns/0.log" Apr 20 23:13:21.282510 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:21.282483 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dzpv8_40424f06-d2df-4a03-bc2c-0af9d8b4e184/kube-rbac-proxy/0.log" Apr 20 23:13:21.481793 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:21.481717 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jsb2w_392462d6-20a2-4842-bcbc-129ba91961ef/dns-node-resolver/0.log" Apr 20 23:13:22.019310 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:22.019272 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq" event={"ID":"89cf5f9d-c0a2-489e-996c-ec985537fa6d","Type":"ContainerStarted","Data":"b614071c6626af1dd587324f254c7e234e9e06d9f775542e717835174a36b134"} Apr 20 23:13:22.019310 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:22.019312 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq" event={"ID":"89cf5f9d-c0a2-489e-996c-ec985537fa6d","Type":"ContainerStarted","Data":"fd483ea3d9e442ec0b5a8d7190553a2acb6f6f754a7551adee96b09f171446a7"} Apr 20 23:13:22.037603 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:22.037553 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f4f5b7dc7-nr4jq" podStartSLOduration=2.498151724 podStartE2EDuration="14.037539543s" podCreationTimestamp="2026-04-20 23:13:08 +0000 UTC" firstStartedPulling="2026-04-20 23:13:09.55051135 +0000 UTC m=+69.495220791" lastFinishedPulling="2026-04-20 23:13:21.089899178 +0000 UTC m=+81.034608610" observedRunningTime="2026-04-20 23:13:22.036953707 +0000 UTC m=+81.981663158" watchObservedRunningTime="2026-04-20 23:13:22.037539543 +0000 UTC m=+81.982249030" Apr 20 23:13:22.281759 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:22.281683 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6c46bdb5c9-xflgd_e672b75e-249e-4dd5-928c-641987109f81/registry/0.log" Apr 20 23:13:22.331602 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:22.331565 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-4szd2"] Apr 20 23:13:22.335250 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:22.335223 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4szd2" Apr 20 23:13:22.337663 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:22.337639 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 23:13:22.338178 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:22.338155 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-clbx8\"" Apr 20 23:13:22.338356 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:22.338341 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 23:13:22.346315 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:22.346292 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4szd2"] Apr 20 23:13:22.457435 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:22.457407 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/227b4df1-51fd-4833-bed6-8b6c220bea00-crio-socket\") pod \"insights-runtime-extractor-4szd2\" (UID: \"227b4df1-51fd-4833-bed6-8b6c220bea00\") " pod="openshift-insights/insights-runtime-extractor-4szd2" Apr 20 23:13:22.457625 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:22.457443 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/227b4df1-51fd-4833-bed6-8b6c220bea00-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4szd2\" (UID: \"227b4df1-51fd-4833-bed6-8b6c220bea00\") " pod="openshift-insights/insights-runtime-extractor-4szd2" Apr 20 23:13:22.457625 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:22.457552 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/227b4df1-51fd-4833-bed6-8b6c220bea00-data-volume\") pod \"insights-runtime-extractor-4szd2\" (UID: \"227b4df1-51fd-4833-bed6-8b6c220bea00\") " pod="openshift-insights/insights-runtime-extractor-4szd2" Apr 20 23:13:22.457625 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:22.457602 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/227b4df1-51fd-4833-bed6-8b6c220bea00-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4szd2\" (UID: \"227b4df1-51fd-4833-bed6-8b6c220bea00\") " pod="openshift-insights/insights-runtime-extractor-4szd2" Apr 20 23:13:22.457770 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:22.457667 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drptb\" (UniqueName: \"kubernetes.io/projected/227b4df1-51fd-4833-bed6-8b6c220bea00-kube-api-access-drptb\") pod \"insights-runtime-extractor-4szd2\" (UID: \"227b4df1-51fd-4833-bed6-8b6c220bea00\") " pod="openshift-insights/insights-runtime-extractor-4szd2" Apr 20 23:13:22.558410 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:22.558316 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/227b4df1-51fd-4833-bed6-8b6c220bea00-data-volume\") pod \"insights-runtime-extractor-4szd2\" (UID: \"227b4df1-51fd-4833-bed6-8b6c220bea00\") " pod="openshift-insights/insights-runtime-extractor-4szd2" Apr 20 23:13:22.558410 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:22.558367 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/227b4df1-51fd-4833-bed6-8b6c220bea00-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4szd2\" (UID: 
\"227b4df1-51fd-4833-bed6-8b6c220bea00\") " pod="openshift-insights/insights-runtime-extractor-4szd2" Apr 20 23:13:22.558644 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:22.558415 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-drptb\" (UniqueName: \"kubernetes.io/projected/227b4df1-51fd-4833-bed6-8b6c220bea00-kube-api-access-drptb\") pod \"insights-runtime-extractor-4szd2\" (UID: \"227b4df1-51fd-4833-bed6-8b6c220bea00\") " pod="openshift-insights/insights-runtime-extractor-4szd2" Apr 20 23:13:22.558644 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:22.558446 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/227b4df1-51fd-4833-bed6-8b6c220bea00-crio-socket\") pod \"insights-runtime-extractor-4szd2\" (UID: \"227b4df1-51fd-4833-bed6-8b6c220bea00\") " pod="openshift-insights/insights-runtime-extractor-4szd2" Apr 20 23:13:22.558644 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:22.558475 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/227b4df1-51fd-4833-bed6-8b6c220bea00-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4szd2\" (UID: \"227b4df1-51fd-4833-bed6-8b6c220bea00\") " pod="openshift-insights/insights-runtime-extractor-4szd2" Apr 20 23:13:22.558644 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:22.558554 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/227b4df1-51fd-4833-bed6-8b6c220bea00-crio-socket\") pod \"insights-runtime-extractor-4szd2\" (UID: \"227b4df1-51fd-4833-bed6-8b6c220bea00\") " pod="openshift-insights/insights-runtime-extractor-4szd2" Apr 20 23:13:22.558644 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:13:22.558611 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: 
secret "insights-runtime-extractor-tls" not found Apr 20 23:13:22.558923 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:22.558677 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/227b4df1-51fd-4833-bed6-8b6c220bea00-data-volume\") pod \"insights-runtime-extractor-4szd2\" (UID: \"227b4df1-51fd-4833-bed6-8b6c220bea00\") " pod="openshift-insights/insights-runtime-extractor-4szd2" Apr 20 23:13:22.558923 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:13:22.558684 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/227b4df1-51fd-4833-bed6-8b6c220bea00-insights-runtime-extractor-tls podName:227b4df1-51fd-4833-bed6-8b6c220bea00 nodeName:}" failed. No retries permitted until 2026-04-20 23:13:23.058662831 +0000 UTC m=+83.003372277 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/227b4df1-51fd-4833-bed6-8b6c220bea00-insights-runtime-extractor-tls") pod "insights-runtime-extractor-4szd2" (UID: "227b4df1-51fd-4833-bed6-8b6c220bea00") : secret "insights-runtime-extractor-tls" not found Apr 20 23:13:22.558923 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:22.558895 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/227b4df1-51fd-4833-bed6-8b6c220bea00-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4szd2\" (UID: \"227b4df1-51fd-4833-bed6-8b6c220bea00\") " pod="openshift-insights/insights-runtime-extractor-4szd2" Apr 20 23:13:22.569599 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:22.569566 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-drptb\" (UniqueName: \"kubernetes.io/projected/227b4df1-51fd-4833-bed6-8b6c220bea00-kube-api-access-drptb\") pod \"insights-runtime-extractor-4szd2\" (UID: \"227b4df1-51fd-4833-bed6-8b6c220bea00\") " 
pod="openshift-insights/insights-runtime-extractor-4szd2" Apr 20 23:13:22.881481 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:22.881452 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-s6hls_1652a387-e617-4579-bb1d-4fab03dacaed/node-ca/0.log" Apr 20 23:13:23.063326 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:23.063293 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/227b4df1-51fd-4833-bed6-8b6c220bea00-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4szd2\" (UID: \"227b4df1-51fd-4833-bed6-8b6c220bea00\") " pod="openshift-insights/insights-runtime-extractor-4szd2" Apr 20 23:13:23.065583 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:23.065558 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/227b4df1-51fd-4833-bed6-8b6c220bea00-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4szd2\" (UID: \"227b4df1-51fd-4833-bed6-8b6c220bea00\") " pod="openshift-insights/insights-runtime-extractor-4szd2" Apr 20 23:13:23.081723 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:23.081703 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-799dc56db8-g6d5c_923ad550-9c0c-4c01-bd51-00fb990a6a8d/router/0.log" Apr 20 23:13:23.243828 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:23.243736 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4szd2" Apr 20 23:13:23.281923 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:23.281900 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2mlnp_204aec3c-9787-4646-abd7-68cf7063e0c5/serve-healthcheck-canary/0.log" Apr 20 23:13:23.362744 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:23.362712 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4szd2"] Apr 20 23:13:23.366071 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:13:23.366039 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod227b4df1_51fd_4833_bed6_8b6c220bea00.slice/crio-d87267e1b0af644e44d67e0264b464f6188e9bb59577f647819c311a7c23cffb WatchSource:0}: Error finding container d87267e1b0af644e44d67e0264b464f6188e9bb59577f647819c311a7c23cffb: Status 404 returned error can't find the container with id d87267e1b0af644e44d67e0264b464f6188e9bb59577f647819c311a7c23cffb Apr 20 23:13:24.026403 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:24.026355 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4szd2" event={"ID":"227b4df1-51fd-4833-bed6-8b6c220bea00","Type":"ContainerStarted","Data":"5a9f6330cf94c737b37f4e4e69f92e69ada6083b6692b3d9b04a8050a2864e48"} Apr 20 23:13:24.026403 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:24.026393 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4szd2" event={"ID":"227b4df1-51fd-4833-bed6-8b6c220bea00","Type":"ContainerStarted","Data":"d87267e1b0af644e44d67e0264b464f6188e9bb59577f647819c311a7c23cffb"} Apr 20 23:13:25.032283 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:25.032246 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4szd2" 
event={"ID":"227b4df1-51fd-4833-bed6-8b6c220bea00","Type":"ContainerStarted","Data":"f1ba2c33977a0267ebfe830c3fc302f9732459331e0cf69e9e0ff8a09350dc90"} Apr 20 23:13:25.087794 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:25.087750 2576 patch_prober.go:28] interesting pod/image-registry-6c46bdb5c9-xflgd container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 20 23:13:25.087955 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:25.087819 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" podUID="e672b75e-249e-4dd5-928c-641987109f81" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 23:13:26.037206 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:26.037106 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4szd2" event={"ID":"227b4df1-51fd-4833-bed6-8b6c220bea00","Type":"ContainerStarted","Data":"8be0bfa2c97727edd16e9625342f52ab8bd7e993cbcd8ad5e9d56db9b0342733"} Apr 20 23:13:26.057031 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:26.056984 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-4szd2" podStartSLOduration=1.71816305 podStartE2EDuration="4.05697061s" podCreationTimestamp="2026-04-20 23:13:22 +0000 UTC" firstStartedPulling="2026-04-20 23:13:23.422614668 +0000 UTC m=+83.367324104" lastFinishedPulling="2026-04-20 23:13:25.761422232 +0000 UTC m=+85.706131664" observedRunningTime="2026-04-20 23:13:26.055547702 +0000 UTC m=+86.000257152" watchObservedRunningTime="2026-04-20 23:13:26.05697061 +0000 UTC m=+86.001680061" Apr 20 23:13:26.904775 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:26.904746 
2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" Apr 20 23:13:30.085206 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.085158 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-dpkzg"] Apr 20 23:13:30.097730 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.097701 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-dpkzg" Apr 20 23:13:30.101180 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.101128 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 23:13:30.101756 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.101719 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 23:13:30.101756 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.101731 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 23:13:30.102056 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.102041 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 23:13:30.102621 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.102603 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-kdhph\"" Apr 20 23:13:30.114507 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.114467 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/02a27fe0-0637-4911-90ef-32cb592ad9f4-sys\") pod \"node-exporter-dpkzg\" (UID: \"02a27fe0-0637-4911-90ef-32cb592ad9f4\") " 
pod="openshift-monitoring/node-exporter-dpkzg" Apr 20 23:13:30.114621 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.114547 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/02a27fe0-0637-4911-90ef-32cb592ad9f4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dpkzg\" (UID: \"02a27fe0-0637-4911-90ef-32cb592ad9f4\") " pod="openshift-monitoring/node-exporter-dpkzg" Apr 20 23:13:30.114621 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.114613 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/02a27fe0-0637-4911-90ef-32cb592ad9f4-node-exporter-textfile\") pod \"node-exporter-dpkzg\" (UID: \"02a27fe0-0637-4911-90ef-32cb592ad9f4\") " pod="openshift-monitoring/node-exporter-dpkzg" Apr 20 23:13:30.114762 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.114641 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/02a27fe0-0637-4911-90ef-32cb592ad9f4-node-exporter-accelerators-collector-config\") pod \"node-exporter-dpkzg\" (UID: \"02a27fe0-0637-4911-90ef-32cb592ad9f4\") " pod="openshift-monitoring/node-exporter-dpkzg" Apr 20 23:13:30.114762 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.114683 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55jn8\" (UniqueName: \"kubernetes.io/projected/02a27fe0-0637-4911-90ef-32cb592ad9f4-kube-api-access-55jn8\") pod \"node-exporter-dpkzg\" (UID: \"02a27fe0-0637-4911-90ef-32cb592ad9f4\") " pod="openshift-monitoring/node-exporter-dpkzg" Apr 20 23:13:30.114762 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.114709 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/02a27fe0-0637-4911-90ef-32cb592ad9f4-node-exporter-wtmp\") pod \"node-exporter-dpkzg\" (UID: \"02a27fe0-0637-4911-90ef-32cb592ad9f4\") " pod="openshift-monitoring/node-exporter-dpkzg" Apr 20 23:13:30.114762 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.114739 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/02a27fe0-0637-4911-90ef-32cb592ad9f4-root\") pod \"node-exporter-dpkzg\" (UID: \"02a27fe0-0637-4911-90ef-32cb592ad9f4\") " pod="openshift-monitoring/node-exporter-dpkzg" Apr 20 23:13:30.114973 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.114775 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/02a27fe0-0637-4911-90ef-32cb592ad9f4-node-exporter-tls\") pod \"node-exporter-dpkzg\" (UID: \"02a27fe0-0637-4911-90ef-32cb592ad9f4\") " pod="openshift-monitoring/node-exporter-dpkzg" Apr 20 23:13:30.114973 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.114804 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/02a27fe0-0637-4911-90ef-32cb592ad9f4-metrics-client-ca\") pod \"node-exporter-dpkzg\" (UID: \"02a27fe0-0637-4911-90ef-32cb592ad9f4\") " pod="openshift-monitoring/node-exporter-dpkzg" Apr 20 23:13:30.215919 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.215870 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55jn8\" (UniqueName: \"kubernetes.io/projected/02a27fe0-0637-4911-90ef-32cb592ad9f4-kube-api-access-55jn8\") pod \"node-exporter-dpkzg\" (UID: \"02a27fe0-0637-4911-90ef-32cb592ad9f4\") " pod="openshift-monitoring/node-exporter-dpkzg" Apr 20 
23:13:30.215919 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.215919 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/02a27fe0-0637-4911-90ef-32cb592ad9f4-node-exporter-wtmp\") pod \"node-exporter-dpkzg\" (UID: \"02a27fe0-0637-4911-90ef-32cb592ad9f4\") " pod="openshift-monitoring/node-exporter-dpkzg" Apr 20 23:13:30.216176 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.215958 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/02a27fe0-0637-4911-90ef-32cb592ad9f4-root\") pod \"node-exporter-dpkzg\" (UID: \"02a27fe0-0637-4911-90ef-32cb592ad9f4\") " pod="openshift-monitoring/node-exporter-dpkzg" Apr 20 23:13:30.216176 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.215990 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/02a27fe0-0637-4911-90ef-32cb592ad9f4-node-exporter-tls\") pod \"node-exporter-dpkzg\" (UID: \"02a27fe0-0637-4911-90ef-32cb592ad9f4\") " pod="openshift-monitoring/node-exporter-dpkzg" Apr 20 23:13:30.216176 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.216015 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/02a27fe0-0637-4911-90ef-32cb592ad9f4-metrics-client-ca\") pod \"node-exporter-dpkzg\" (UID: \"02a27fe0-0637-4911-90ef-32cb592ad9f4\") " pod="openshift-monitoring/node-exporter-dpkzg" Apr 20 23:13:30.216176 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.216066 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/02a27fe0-0637-4911-90ef-32cb592ad9f4-sys\") pod \"node-exporter-dpkzg\" (UID: \"02a27fe0-0637-4911-90ef-32cb592ad9f4\") " pod="openshift-monitoring/node-exporter-dpkzg" Apr 20 23:13:30.216176 
ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.216091 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/02a27fe0-0637-4911-90ef-32cb592ad9f4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dpkzg\" (UID: \"02a27fe0-0637-4911-90ef-32cb592ad9f4\") " pod="openshift-monitoring/node-exporter-dpkzg" Apr 20 23:13:30.216176 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.216159 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/02a27fe0-0637-4911-90ef-32cb592ad9f4-node-exporter-textfile\") pod \"node-exporter-dpkzg\" (UID: \"02a27fe0-0637-4911-90ef-32cb592ad9f4\") " pod="openshift-monitoring/node-exporter-dpkzg" Apr 20 23:13:30.216505 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.216196 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/02a27fe0-0637-4911-90ef-32cb592ad9f4-node-exporter-accelerators-collector-config\") pod \"node-exporter-dpkzg\" (UID: \"02a27fe0-0637-4911-90ef-32cb592ad9f4\") " pod="openshift-monitoring/node-exporter-dpkzg" Apr 20 23:13:30.216505 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.216259 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/02a27fe0-0637-4911-90ef-32cb592ad9f4-node-exporter-wtmp\") pod \"node-exporter-dpkzg\" (UID: \"02a27fe0-0637-4911-90ef-32cb592ad9f4\") " pod="openshift-monitoring/node-exporter-dpkzg" Apr 20 23:13:30.216505 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.216313 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/02a27fe0-0637-4911-90ef-32cb592ad9f4-sys\") pod \"node-exporter-dpkzg\" (UID: 
\"02a27fe0-0637-4911-90ef-32cb592ad9f4\") " pod="openshift-monitoring/node-exporter-dpkzg" Apr 20 23:13:30.216718 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.216695 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/02a27fe0-0637-4911-90ef-32cb592ad9f4-root\") pod \"node-exporter-dpkzg\" (UID: \"02a27fe0-0637-4911-90ef-32cb592ad9f4\") " pod="openshift-monitoring/node-exporter-dpkzg" Apr 20 23:13:30.216970 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.216926 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/02a27fe0-0637-4911-90ef-32cb592ad9f4-node-exporter-textfile\") pod \"node-exporter-dpkzg\" (UID: \"02a27fe0-0637-4911-90ef-32cb592ad9f4\") " pod="openshift-monitoring/node-exporter-dpkzg" Apr 20 23:13:30.217055 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.217016 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/02a27fe0-0637-4911-90ef-32cb592ad9f4-node-exporter-accelerators-collector-config\") pod \"node-exporter-dpkzg\" (UID: \"02a27fe0-0637-4911-90ef-32cb592ad9f4\") " pod="openshift-monitoring/node-exporter-dpkzg" Apr 20 23:13:30.217055 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.217018 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/02a27fe0-0637-4911-90ef-32cb592ad9f4-metrics-client-ca\") pod \"node-exporter-dpkzg\" (UID: \"02a27fe0-0637-4911-90ef-32cb592ad9f4\") " pod="openshift-monitoring/node-exporter-dpkzg" Apr 20 23:13:30.218909 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.218888 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/02a27fe0-0637-4911-90ef-32cb592ad9f4-node-exporter-tls\") pod 
\"node-exporter-dpkzg\" (UID: \"02a27fe0-0637-4911-90ef-32cb592ad9f4\") " pod="openshift-monitoring/node-exporter-dpkzg" Apr 20 23:13:30.219226 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.219206 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/02a27fe0-0637-4911-90ef-32cb592ad9f4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dpkzg\" (UID: \"02a27fe0-0637-4911-90ef-32cb592ad9f4\") " pod="openshift-monitoring/node-exporter-dpkzg" Apr 20 23:13:30.225009 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.224989 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-55jn8\" (UniqueName: \"kubernetes.io/projected/02a27fe0-0637-4911-90ef-32cb592ad9f4-kube-api-access-55jn8\") pod \"node-exporter-dpkzg\" (UID: \"02a27fe0-0637-4911-90ef-32cb592ad9f4\") " pod="openshift-monitoring/node-exporter-dpkzg" Apr 20 23:13:30.410653 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:30.410568 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-dpkzg" Apr 20 23:13:30.422224 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:13:30.421959 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02a27fe0_0637_4911_90ef_32cb592ad9f4.slice/crio-cd8cd5994a1e5a180e97ca441b86421a44c32ec5e7cc494c658e11d229a5d91f WatchSource:0}: Error finding container cd8cd5994a1e5a180e97ca441b86421a44c32ec5e7cc494c658e11d229a5d91f: Status 404 returned error can't find the container with id cd8cd5994a1e5a180e97ca441b86421a44c32ec5e7cc494c658e11d229a5d91f Apr 20 23:13:31.055331 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:31.055238 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dpkzg" event={"ID":"02a27fe0-0637-4911-90ef-32cb592ad9f4","Type":"ContainerStarted","Data":"cd8cd5994a1e5a180e97ca441b86421a44c32ec5e7cc494c658e11d229a5d91f"} Apr 20 23:13:32.060347 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:32.060302 2576 generic.go:358] "Generic (PLEG): container finished" podID="02a27fe0-0637-4911-90ef-32cb592ad9f4" containerID="1ed926967d134a6f777c07a33021db252ced16acbbd8b9d7f6bbb6852165a2a1" exitCode=0 Apr 20 23:13:32.060841 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:32.060393 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dpkzg" event={"ID":"02a27fe0-0637-4911-90ef-32cb592ad9f4","Type":"ContainerDied","Data":"1ed926967d134a6f777c07a33021db252ced16acbbd8b9d7f6bbb6852165a2a1"} Apr 20 23:13:33.067081 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:33.067030 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dpkzg" event={"ID":"02a27fe0-0637-4911-90ef-32cb592ad9f4","Type":"ContainerStarted","Data":"f05682543a8f2d59e36fa8bd18c7bc460356529c8d49a20f369ddde97ed9a708"} Apr 20 23:13:33.067564 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:33.067091 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dpkzg" event={"ID":"02a27fe0-0637-4911-90ef-32cb592ad9f4","Type":"ContainerStarted","Data":"3158d0c575f0cae67cae215112601f2faabe0b5d03da8b5bc5b72d3d9e166f13"}
Apr 20 23:13:33.119776 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:33.119713 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-dpkzg" podStartSLOduration=2.262375003 podStartE2EDuration="3.119696454s" podCreationTimestamp="2026-04-20 23:13:30 +0000 UTC" firstStartedPulling="2026-04-20 23:13:30.423345055 +0000 UTC m=+90.368054486" lastFinishedPulling="2026-04-20 23:13:31.280666504 +0000 UTC m=+91.225375937" observedRunningTime="2026-04-20 23:13:33.117428563 +0000 UTC m=+93.062138037" watchObservedRunningTime="2026-04-20 23:13:33.119696454 +0000 UTC m=+93.064405904"
Apr 20 23:13:36.998604 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:36.998560 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6c46bdb5c9-xflgd"]
Apr 20 23:13:49.115523 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:49.115487 2576 generic.go:358] "Generic (PLEG): container finished" podID="d5f05570-1fb6-4cf8-b42b-05a9a4e6ac9e" containerID="cafa95223d7d4e7f82190382bd4eb35d0d7ccfa9e35ca3d1c12da9e11e987797" exitCode=0
Apr 20 23:13:49.115944 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:49.115558 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9hxw9" event={"ID":"d5f05570-1fb6-4cf8-b42b-05a9a4e6ac9e","Type":"ContainerDied","Data":"cafa95223d7d4e7f82190382bd4eb35d0d7ccfa9e35ca3d1c12da9e11e987797"}
Apr 20 23:13:49.115944 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:49.115896 2576 scope.go:117] "RemoveContainer" containerID="cafa95223d7d4e7f82190382bd4eb35d0d7ccfa9e35ca3d1c12da9e11e987797"
Apr 20 23:13:50.120603 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:50.120566 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9hxw9" event={"ID":"d5f05570-1fb6-4cf8-b42b-05a9a4e6ac9e","Type":"ContainerStarted","Data":"139e122f7ff886b51d76b63bd268d46806cdda929da437ccdecdbf88165c8204"}
Apr 20 23:13:59.147369 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:59.147334 2576 generic.go:358] "Generic (PLEG): container finished" podID="fa5675ad-e501-4f6b-a732-4d1db6454f9a" containerID="bb464a9df8660d01f83ac634b4aa521d7904b48602369ad18edc42bec05a0640" exitCode=0
Apr 20 23:13:59.147747 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:59.147396 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xg99l" event={"ID":"fa5675ad-e501-4f6b-a732-4d1db6454f9a","Type":"ContainerDied","Data":"bb464a9df8660d01f83ac634b4aa521d7904b48602369ad18edc42bec05a0640"}
Apr 20 23:13:59.147747 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:13:59.147699 2576 scope.go:117] "RemoveContainer" containerID="bb464a9df8660d01f83ac634b4aa521d7904b48602369ad18edc42bec05a0640"
Apr 20 23:14:00.151809 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:00.151765 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xg99l" event={"ID":"fa5675ad-e501-4f6b-a732-4d1db6454f9a","Type":"ContainerStarted","Data":"1c211030283e630208c5b21037934b92a34b5e6e957061de12fc5fe3c74bab5f"}
Apr 20 23:14:02.017556 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:02.017486 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" podUID="e672b75e-249e-4dd5-928c-641987109f81" containerName="registry" containerID="cri-o://ae8872eb233a41f6602c97117dc815506dea70445b5da8fabc28b459ab54c217" gracePeriod=30
Apr 20 23:14:02.163804 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:02.163756 2576 generic.go:358] "Generic (PLEG): container finished" podID="e672b75e-249e-4dd5-928c-641987109f81" containerID="ae8872eb233a41f6602c97117dc815506dea70445b5da8fabc28b459ab54c217" exitCode=0
Apr 20 23:14:02.163961 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:02.163829 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" event={"ID":"e672b75e-249e-4dd5-928c-641987109f81","Type":"ContainerDied","Data":"ae8872eb233a41f6602c97117dc815506dea70445b5da8fabc28b459ab54c217"}
Apr 20 23:14:02.263673 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:02.263646 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd"
Apr 20 23:14:02.393921 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:02.393885 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e672b75e-249e-4dd5-928c-641987109f81-trusted-ca\") pod \"e672b75e-249e-4dd5-928c-641987109f81\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") "
Apr 20 23:14:02.394117 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:02.393928 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e672b75e-249e-4dd5-928c-641987109f81-installation-pull-secrets\") pod \"e672b75e-249e-4dd5-928c-641987109f81\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") "
Apr 20 23:14:02.394117 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:02.393960 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlclr\" (UniqueName: \"kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-kube-api-access-rlclr\") pod \"e672b75e-249e-4dd5-928c-641987109f81\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") "
Apr 20 23:14:02.394117 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:02.393987 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-registry-tls\") pod \"e672b75e-249e-4dd5-928c-641987109f81\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") "
Apr 20 23:14:02.394117 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:02.394025 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e672b75e-249e-4dd5-928c-641987109f81-image-registry-private-configuration\") pod \"e672b75e-249e-4dd5-928c-641987109f81\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") "
Apr 20 23:14:02.394117 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:02.394074 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e672b75e-249e-4dd5-928c-641987109f81-ca-trust-extracted\") pod \"e672b75e-249e-4dd5-928c-641987109f81\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") "
Apr 20 23:14:02.394117 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:02.394104 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-bound-sa-token\") pod \"e672b75e-249e-4dd5-928c-641987109f81\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") "
Apr 20 23:14:02.394465 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:02.394130 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e672b75e-249e-4dd5-928c-641987109f81-registry-certificates\") pod \"e672b75e-249e-4dd5-928c-641987109f81\" (UID: \"e672b75e-249e-4dd5-928c-641987109f81\") "
Apr 20 23:14:02.394465 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:02.394364 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e672b75e-249e-4dd5-928c-641987109f81-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e672b75e-249e-4dd5-928c-641987109f81" (UID: "e672b75e-249e-4dd5-928c-641987109f81"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 23:14:02.394566 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:02.394469 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e672b75e-249e-4dd5-928c-641987109f81-trusted-ca\") on node \"ip-10-0-137-139.ec2.internal\" DevicePath \"\""
Apr 20 23:14:02.394687 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:02.394652 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e672b75e-249e-4dd5-928c-641987109f81-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e672b75e-249e-4dd5-928c-641987109f81" (UID: "e672b75e-249e-4dd5-928c-641987109f81"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 23:14:02.396728 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:02.396685 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e672b75e-249e-4dd5-928c-641987109f81-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e672b75e-249e-4dd5-928c-641987109f81" (UID: "e672b75e-249e-4dd5-928c-641987109f81"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 23:14:02.396891 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:02.396799 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e672b75e-249e-4dd5-928c-641987109f81-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "e672b75e-249e-4dd5-928c-641987109f81" (UID: "e672b75e-249e-4dd5-928c-641987109f81"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 23:14:02.396956 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:02.396881 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-kube-api-access-rlclr" (OuterVolumeSpecName: "kube-api-access-rlclr") pod "e672b75e-249e-4dd5-928c-641987109f81" (UID: "e672b75e-249e-4dd5-928c-641987109f81"). InnerVolumeSpecName "kube-api-access-rlclr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 23:14:02.397010 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:02.396984 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e672b75e-249e-4dd5-928c-641987109f81" (UID: "e672b75e-249e-4dd5-928c-641987109f81"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 23:14:02.397063 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:02.397019 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e672b75e-249e-4dd5-928c-641987109f81" (UID: "e672b75e-249e-4dd5-928c-641987109f81"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 23:14:02.403917 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:02.403892 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e672b75e-249e-4dd5-928c-641987109f81-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e672b75e-249e-4dd5-928c-641987109f81" (UID: "e672b75e-249e-4dd5-928c-641987109f81"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 23:14:02.495623 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:02.495586 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e672b75e-249e-4dd5-928c-641987109f81-ca-trust-extracted\") on node \"ip-10-0-137-139.ec2.internal\" DevicePath \"\""
Apr 20 23:14:02.495623 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:02.495621 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-bound-sa-token\") on node \"ip-10-0-137-139.ec2.internal\" DevicePath \"\""
Apr 20 23:14:02.495623 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:02.495631 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e672b75e-249e-4dd5-928c-641987109f81-registry-certificates\") on node \"ip-10-0-137-139.ec2.internal\" DevicePath \"\""
Apr 20 23:14:02.495872 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:02.495641 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e672b75e-249e-4dd5-928c-641987109f81-installation-pull-secrets\") on node \"ip-10-0-137-139.ec2.internal\" DevicePath \"\""
Apr 20 23:14:02.495872 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:02.495651 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rlclr\" (UniqueName: \"kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-kube-api-access-rlclr\") on node \"ip-10-0-137-139.ec2.internal\" DevicePath \"\""
Apr 20 23:14:02.495872 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:02.495660 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e672b75e-249e-4dd5-928c-641987109f81-registry-tls\") on node \"ip-10-0-137-139.ec2.internal\" DevicePath \"\""
Apr 20 23:14:02.495872 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:02.495670 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e672b75e-249e-4dd5-928c-641987109f81-image-registry-private-configuration\") on node \"ip-10-0-137-139.ec2.internal\" DevicePath \"\""
Apr 20 23:14:03.167908 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:03.167879 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd"
Apr 20 23:14:03.168363 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:03.167878 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c46bdb5c9-xflgd" event={"ID":"e672b75e-249e-4dd5-928c-641987109f81","Type":"ContainerDied","Data":"365a50740812d3dc6462a61245d29cde32a65e6a7965e7777034454d161c4e07"}
Apr 20 23:14:03.168363 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:03.168004 2576 scope.go:117] "RemoveContainer" containerID="ae8872eb233a41f6602c97117dc815506dea70445b5da8fabc28b459ab54c217"
Apr 20 23:14:03.188832 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:03.188805 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6c46bdb5c9-xflgd"]
Apr 20 23:14:03.201642 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:03.201607 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6c46bdb5c9-xflgd"]
Apr 20 23:14:04.639862 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:14:04.639830 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e672b75e-249e-4dd5-928c-641987109f81" path="/var/lib/kubelet/pods/e672b75e-249e-4dd5-928c-641987109f81/volumes"
Apr 20 23:17:00.596922 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:17:00.596889 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6675l_65a2a89a-c0bf-4140-bd21-e8249221ca05/ovn-acl-logging/0.log"
Apr 20 23:17:00.597923 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:17:00.597889 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6675l_65a2a89a-c0bf-4140-bd21-e8249221ca05/ovn-acl-logging/0.log"
Apr 20 23:17:00.601718 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:17:00.601694 2576 kubelet.go:1628] "Image garbage collection succeeded"
Apr 20 23:18:03.869246 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:03.869166 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5d79c565b7-cfmkq"]
Apr 20 23:18:03.869658 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:03.869457 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e672b75e-249e-4dd5-928c-641987109f81" containerName="registry"
Apr 20 23:18:03.869658 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:03.869467 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e672b75e-249e-4dd5-928c-641987109f81" containerName="registry"
Apr 20 23:18:03.869658 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:03.869518 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e672b75e-249e-4dd5-928c-641987109f81" containerName="registry"
Apr 20 23:18:03.872307 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:03.872291 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-cfmkq"
Apr 20 23:18:03.886191 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:03.886172 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 20 23:18:03.888219 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:03.888197 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 20 23:18:03.888327 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:03.888243 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 20 23:18:03.888327 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:03.888255 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 20 23:18:03.888694 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:03.888676 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-c9lhx\""
Apr 20 23:18:03.914613 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:03.914580 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5d79c565b7-cfmkq"]
Apr 20 23:18:03.961852 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:03.961815 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a377e8d5-018f-4c51-8383-d70685804f26-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5d79c565b7-cfmkq\" (UID: \"a377e8d5-018f-4c51-8383-d70685804f26\") " pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-cfmkq"
Apr 20 23:18:03.962008 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:03.961855 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a377e8d5-018f-4c51-8383-d70685804f26-webhook-cert\") pod \"opendatahub-operator-controller-manager-5d79c565b7-cfmkq\" (UID: \"a377e8d5-018f-4c51-8383-d70685804f26\") " pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-cfmkq"
Apr 20 23:18:03.962008 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:03.961895 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr5th\" (UniqueName: \"kubernetes.io/projected/a377e8d5-018f-4c51-8383-d70685804f26-kube-api-access-sr5th\") pod \"opendatahub-operator-controller-manager-5d79c565b7-cfmkq\" (UID: \"a377e8d5-018f-4c51-8383-d70685804f26\") " pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-cfmkq"
Apr 20 23:18:04.062755 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:04.062722 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sr5th\" (UniqueName: \"kubernetes.io/projected/a377e8d5-018f-4c51-8383-d70685804f26-kube-api-access-sr5th\") pod \"opendatahub-operator-controller-manager-5d79c565b7-cfmkq\" (UID: \"a377e8d5-018f-4c51-8383-d70685804f26\") " pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-cfmkq"
Apr 20 23:18:04.062930 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:04.062790 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a377e8d5-018f-4c51-8383-d70685804f26-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5d79c565b7-cfmkq\" (UID: \"a377e8d5-018f-4c51-8383-d70685804f26\") " pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-cfmkq"
Apr 20 23:18:04.062930 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:04.062811 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a377e8d5-018f-4c51-8383-d70685804f26-webhook-cert\") pod \"opendatahub-operator-controller-manager-5d79c565b7-cfmkq\" (UID: \"a377e8d5-018f-4c51-8383-d70685804f26\") " pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-cfmkq"
Apr 20 23:18:04.065210 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:04.065186 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a377e8d5-018f-4c51-8383-d70685804f26-webhook-cert\") pod \"opendatahub-operator-controller-manager-5d79c565b7-cfmkq\" (UID: \"a377e8d5-018f-4c51-8383-d70685804f26\") " pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-cfmkq"
Apr 20 23:18:04.065333 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:04.065300 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a377e8d5-018f-4c51-8383-d70685804f26-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5d79c565b7-cfmkq\" (UID: \"a377e8d5-018f-4c51-8383-d70685804f26\") " pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-cfmkq"
Apr 20 23:18:04.070533 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:04.070515 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr5th\" (UniqueName: \"kubernetes.io/projected/a377e8d5-018f-4c51-8383-d70685804f26-kube-api-access-sr5th\") pod \"opendatahub-operator-controller-manager-5d79c565b7-cfmkq\" (UID: \"a377e8d5-018f-4c51-8383-d70685804f26\") " pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-cfmkq"
Apr 20 23:18:04.181827 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:04.181735 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-cfmkq"
Apr 20 23:18:04.307376 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:04.307353 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5d79c565b7-cfmkq"]
Apr 20 23:18:04.309537 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:18:04.309507 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda377e8d5_018f_4c51_8383_d70685804f26.slice/crio-4a25a7a52b23eadfb81105f7c25e3717b3a9d67f1e3184058ab1c52e8d6b093e WatchSource:0}: Error finding container 4a25a7a52b23eadfb81105f7c25e3717b3a9d67f1e3184058ab1c52e8d6b093e: Status 404 returned error can't find the container with id 4a25a7a52b23eadfb81105f7c25e3717b3a9d67f1e3184058ab1c52e8d6b093e
Apr 20 23:18:04.311086 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:04.311068 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 23:18:04.842570 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:04.842532 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-cfmkq" event={"ID":"a377e8d5-018f-4c51-8383-d70685804f26","Type":"ContainerStarted","Data":"4a25a7a52b23eadfb81105f7c25e3717b3a9d67f1e3184058ab1c52e8d6b093e"}
Apr 20 23:18:07.858867 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:07.858826 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-cfmkq" event={"ID":"a377e8d5-018f-4c51-8383-d70685804f26","Type":"ContainerStarted","Data":"5c5cd809a817fbffa6fbbc5e1aff2914c93f7ebb0f18a1982a13a10a259a74ae"}
Apr 20 23:18:07.859286 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:07.858945 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-cfmkq"
Apr 20 23:18:07.884846 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:07.884790 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-cfmkq" podStartSLOduration=2.253895304 podStartE2EDuration="4.884778891s" podCreationTimestamp="2026-04-20 23:18:03 +0000 UTC" firstStartedPulling="2026-04-20 23:18:04.31120141 +0000 UTC m=+364.255910838" lastFinishedPulling="2026-04-20 23:18:06.942084983 +0000 UTC m=+366.886794425" observedRunningTime="2026-04-20 23:18:07.883468774 +0000 UTC m=+367.828178223" watchObservedRunningTime="2026-04-20 23:18:07.884778891 +0000 UTC m=+367.829488342"
Apr 20 23:18:18.864576 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:18.864541 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-cfmkq"
Apr 20 23:18:26.365565 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:26.365531 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-vx2sj"]
Apr 20 23:18:26.372709 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:26.372688 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-vx2sj"
Apr 20 23:18:26.375121 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:26.375085 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-m82rd\""
Apr 20 23:18:26.375282 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:26.375240 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\""
Apr 20 23:18:26.376106 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:26.376083 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-vx2sj"]
Apr 20 23:18:26.443267 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:26.443232 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nsrz\" (UniqueName: \"kubernetes.io/projected/12eabfbe-7953-48de-bfb9-65826a0c3846-kube-api-access-6nsrz\") pod \"odh-model-controller-858dbf95b8-vx2sj\" (UID: \"12eabfbe-7953-48de-bfb9-65826a0c3846\") " pod="opendatahub/odh-model-controller-858dbf95b8-vx2sj"
Apr 20 23:18:26.443267 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:26.443269 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12eabfbe-7953-48de-bfb9-65826a0c3846-cert\") pod \"odh-model-controller-858dbf95b8-vx2sj\" (UID: \"12eabfbe-7953-48de-bfb9-65826a0c3846\") " pod="opendatahub/odh-model-controller-858dbf95b8-vx2sj"
Apr 20 23:18:26.544710 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:26.544676 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6nsrz\" (UniqueName: \"kubernetes.io/projected/12eabfbe-7953-48de-bfb9-65826a0c3846-kube-api-access-6nsrz\") pod \"odh-model-controller-858dbf95b8-vx2sj\" (UID: \"12eabfbe-7953-48de-bfb9-65826a0c3846\") " pod="opendatahub/odh-model-controller-858dbf95b8-vx2sj"
Apr 20 23:18:26.544710 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:26.544716 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12eabfbe-7953-48de-bfb9-65826a0c3846-cert\") pod \"odh-model-controller-858dbf95b8-vx2sj\" (UID: \"12eabfbe-7953-48de-bfb9-65826a0c3846\") " pod="opendatahub/odh-model-controller-858dbf95b8-vx2sj"
Apr 20 23:18:26.544919 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:18:26.544831 2576 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 20 23:18:26.544919 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:18:26.544886 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12eabfbe-7953-48de-bfb9-65826a0c3846-cert podName:12eabfbe-7953-48de-bfb9-65826a0c3846 nodeName:}" failed. No retries permitted until 2026-04-20 23:18:27.044869099 +0000 UTC m=+386.989578529 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/12eabfbe-7953-48de-bfb9-65826a0c3846-cert") pod "odh-model-controller-858dbf95b8-vx2sj" (UID: "12eabfbe-7953-48de-bfb9-65826a0c3846") : secret "odh-model-controller-webhook-cert" not found
Apr 20 23:18:26.553561 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:26.553522 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nsrz\" (UniqueName: \"kubernetes.io/projected/12eabfbe-7953-48de-bfb9-65826a0c3846-kube-api-access-6nsrz\") pod \"odh-model-controller-858dbf95b8-vx2sj\" (UID: \"12eabfbe-7953-48de-bfb9-65826a0c3846\") " pod="opendatahub/odh-model-controller-858dbf95b8-vx2sj"
Apr 20 23:18:27.048371 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:27.048339 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12eabfbe-7953-48de-bfb9-65826a0c3846-cert\") pod \"odh-model-controller-858dbf95b8-vx2sj\" (UID: \"12eabfbe-7953-48de-bfb9-65826a0c3846\") " pod="opendatahub/odh-model-controller-858dbf95b8-vx2sj"
Apr 20 23:18:27.050653 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:27.050624 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12eabfbe-7953-48de-bfb9-65826a0c3846-cert\") pod \"odh-model-controller-858dbf95b8-vx2sj\" (UID: \"12eabfbe-7953-48de-bfb9-65826a0c3846\") " pod="opendatahub/odh-model-controller-858dbf95b8-vx2sj"
Apr 20 23:18:27.284718 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:27.284682 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-vx2sj"
Apr 20 23:18:27.422688 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:27.422655 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-vx2sj"]
Apr 20 23:18:27.426414 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:18:27.426378 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12eabfbe_7953_48de_bfb9_65826a0c3846.slice/crio-4c3f589c30d2a1f6616bc625f9860f824e88e9ec9205abd0e70a1c57e6b816ae WatchSource:0}: Error finding container 4c3f589c30d2a1f6616bc625f9860f824e88e9ec9205abd0e70a1c57e6b816ae: Status 404 returned error can't find the container with id 4c3f589c30d2a1f6616bc625f9860f824e88e9ec9205abd0e70a1c57e6b816ae
Apr 20 23:18:27.921821 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:27.921781 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-vx2sj" event={"ID":"12eabfbe-7953-48de-bfb9-65826a0c3846","Type":"ContainerStarted","Data":"4c3f589c30d2a1f6616bc625f9860f824e88e9ec9205abd0e70a1c57e6b816ae"}
Apr 20 23:18:30.935634 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:30.935596 2576 generic.go:358] "Generic (PLEG): container finished" podID="12eabfbe-7953-48de-bfb9-65826a0c3846" containerID="54f7e50a413677c687346341f8dbad3f0a87f947a548178437636a5145c1135d" exitCode=1
Apr 20 23:18:30.936094 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:30.935659 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-vx2sj" event={"ID":"12eabfbe-7953-48de-bfb9-65826a0c3846","Type":"ContainerDied","Data":"54f7e50a413677c687346341f8dbad3f0a87f947a548178437636a5145c1135d"}
Apr 20 23:18:30.936094 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:30.935870 2576 scope.go:117] "RemoveContainer" containerID="54f7e50a413677c687346341f8dbad3f0a87f947a548178437636a5145c1135d"
Apr 20 23:18:31.662311 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:31.662273 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-d85d4"]
Apr 20 23:18:31.665386 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:31.665364 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-d85d4"
Apr 20 23:18:31.669110 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:31.669089 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-lbqgb\""
Apr 20 23:18:31.669534 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:31.669503 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\""
Apr 20 23:18:31.691671 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:31.691644 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7540190a-b508-4374-afff-643e7631a11f-cert\") pod \"kserve-controller-manager-856948b99f-d85d4\" (UID: \"7540190a-b508-4374-afff-643e7631a11f\") " pod="opendatahub/kserve-controller-manager-856948b99f-d85d4"
Apr 20 23:18:31.691839 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:31.691806 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-424t7\" (UniqueName: \"kubernetes.io/projected/7540190a-b508-4374-afff-643e7631a11f-kube-api-access-424t7\") pod \"kserve-controller-manager-856948b99f-d85d4\" (UID: \"7540190a-b508-4374-afff-643e7631a11f\") " pod="opendatahub/kserve-controller-manager-856948b99f-d85d4"
Apr 20 23:18:31.713561 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:31.713534 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-d85d4"]
Apr 20 23:18:31.793222 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:31.793185 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-424t7\" (UniqueName: \"kubernetes.io/projected/7540190a-b508-4374-afff-643e7631a11f-kube-api-access-424t7\") pod \"kserve-controller-manager-856948b99f-d85d4\" (UID: \"7540190a-b508-4374-afff-643e7631a11f\") " pod="opendatahub/kserve-controller-manager-856948b99f-d85d4"
Apr 20 23:18:31.793379 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:31.793242 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7540190a-b508-4374-afff-643e7631a11f-cert\") pod \"kserve-controller-manager-856948b99f-d85d4\" (UID: \"7540190a-b508-4374-afff-643e7631a11f\") " pod="opendatahub/kserve-controller-manager-856948b99f-d85d4"
Apr 20 23:18:31.793379 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:18:31.793333 2576 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 20 23:18:31.793465 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:18:31.793388 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7540190a-b508-4374-afff-643e7631a11f-cert podName:7540190a-b508-4374-afff-643e7631a11f nodeName:}" failed. No retries permitted until 2026-04-20 23:18:32.29337167 +0000 UTC m=+392.238081101 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7540190a-b508-4374-afff-643e7631a11f-cert") pod "kserve-controller-manager-856948b99f-d85d4" (UID: "7540190a-b508-4374-afff-643e7631a11f") : secret "kserve-webhook-server-cert" not found
Apr 20 23:18:31.804425 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:31.804401 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-424t7\" (UniqueName: \"kubernetes.io/projected/7540190a-b508-4374-afff-643e7631a11f-kube-api-access-424t7\") pod \"kserve-controller-manager-856948b99f-d85d4\" (UID: \"7540190a-b508-4374-afff-643e7631a11f\") " pod="opendatahub/kserve-controller-manager-856948b99f-d85d4"
Apr 20 23:18:31.940955 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:31.940864 2576 generic.go:358] "Generic (PLEG): container finished" podID="12eabfbe-7953-48de-bfb9-65826a0c3846" containerID="ca2d92af39da57438b042c90f41ccb97472853bfc84150f85303d26e4bb65df0" exitCode=1
Apr 20 23:18:31.941428 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:31.940953 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-vx2sj" event={"ID":"12eabfbe-7953-48de-bfb9-65826a0c3846","Type":"ContainerDied","Data":"ca2d92af39da57438b042c90f41ccb97472853bfc84150f85303d26e4bb65df0"}
Apr 20 23:18:31.941428 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:31.941002 2576 scope.go:117] "RemoveContainer" containerID="54f7e50a413677c687346341f8dbad3f0a87f947a548178437636a5145c1135d"
Apr 20 23:18:31.941428 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:31.941179 2576 scope.go:117] "RemoveContainer" containerID="ca2d92af39da57438b042c90f41ccb97472853bfc84150f85303d26e4bb65df0"
Apr 20 23:18:31.941428 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:18:31.941383 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-vx2sj_opendatahub(12eabfbe-7953-48de-bfb9-65826a0c3846)\"" pod="opendatahub/odh-model-controller-858dbf95b8-vx2sj" podUID="12eabfbe-7953-48de-bfb9-65826a0c3846"
Apr 20 23:18:32.296527 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:32.296429 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7540190a-b508-4374-afff-643e7631a11f-cert\") pod \"kserve-controller-manager-856948b99f-d85d4\" (UID: \"7540190a-b508-4374-afff-643e7631a11f\") " pod="opendatahub/kserve-controller-manager-856948b99f-d85d4"
Apr 20 23:18:32.298778 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:32.298756 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7540190a-b508-4374-afff-643e7631a11f-cert\") pod \"kserve-controller-manager-856948b99f-d85d4\" (UID: \"7540190a-b508-4374-afff-643e7631a11f\") " pod="opendatahub/kserve-controller-manager-856948b99f-d85d4"
Apr 20 23:18:32.575463 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:32.575381 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-d85d4" Apr 20 23:18:32.694316 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:32.694218 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-d85d4"] Apr 20 23:18:32.696903 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:18:32.696874 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7540190a_b508_4374_afff_643e7631a11f.slice/crio-e9a0573337b32ad1cd3997c8b02e5373b2369c8c5350c04ded6088e1510ccf45 WatchSource:0}: Error finding container e9a0573337b32ad1cd3997c8b02e5373b2369c8c5350c04ded6088e1510ccf45: Status 404 returned error can't find the container with id e9a0573337b32ad1cd3997c8b02e5373b2369c8c5350c04ded6088e1510ccf45 Apr 20 23:18:32.946360 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:32.946330 2576 scope.go:117] "RemoveContainer" containerID="ca2d92af39da57438b042c90f41ccb97472853bfc84150f85303d26e4bb65df0" Apr 20 23:18:32.946796 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:18:32.946521 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-vx2sj_opendatahub(12eabfbe-7953-48de-bfb9-65826a0c3846)\"" pod="opendatahub/odh-model-controller-858dbf95b8-vx2sj" podUID="12eabfbe-7953-48de-bfb9-65826a0c3846" Apr 20 23:18:32.947113 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:32.947091 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-d85d4" event={"ID":"7540190a-b508-4374-afff-643e7631a11f","Type":"ContainerStarted","Data":"e9a0573337b32ad1cd3997c8b02e5373b2369c8c5350c04ded6088e1510ccf45"} Apr 20 23:18:33.570075 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:33.570035 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-ingress/kube-auth-proxy-74c79b5d98-tlqs4"] Apr 20 23:18:33.573890 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:33.573867 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-74c79b5d98-tlqs4" Apr 20 23:18:33.576082 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:33.576051 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 20 23:18:33.576212 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:33.576114 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 20 23:18:33.581158 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:33.581115 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-74c79b5d98-tlqs4"] Apr 20 23:18:33.606784 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:33.606756 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/79ab33d4-b8d4-4741-9be4-b52e8a781c2f-tls-certs\") pod \"kube-auth-proxy-74c79b5d98-tlqs4\" (UID: \"79ab33d4-b8d4-4741-9be4-b52e8a781c2f\") " pod="openshift-ingress/kube-auth-proxy-74c79b5d98-tlqs4" Apr 20 23:18:33.606929 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:33.606838 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmqvc\" (UniqueName: \"kubernetes.io/projected/79ab33d4-b8d4-4741-9be4-b52e8a781c2f-kube-api-access-fmqvc\") pod \"kube-auth-proxy-74c79b5d98-tlqs4\" (UID: \"79ab33d4-b8d4-4741-9be4-b52e8a781c2f\") " pod="openshift-ingress/kube-auth-proxy-74c79b5d98-tlqs4" Apr 20 23:18:33.606991 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:33.606949 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/79ab33d4-b8d4-4741-9be4-b52e8a781c2f-tmp\") pod \"kube-auth-proxy-74c79b5d98-tlqs4\" (UID: \"79ab33d4-b8d4-4741-9be4-b52e8a781c2f\") " pod="openshift-ingress/kube-auth-proxy-74c79b5d98-tlqs4" Apr 20 23:18:33.708487 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:33.708451 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/79ab33d4-b8d4-4741-9be4-b52e8a781c2f-tmp\") pod \"kube-auth-proxy-74c79b5d98-tlqs4\" (UID: \"79ab33d4-b8d4-4741-9be4-b52e8a781c2f\") " pod="openshift-ingress/kube-auth-proxy-74c79b5d98-tlqs4" Apr 20 23:18:33.708655 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:33.708519 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/79ab33d4-b8d4-4741-9be4-b52e8a781c2f-tls-certs\") pod \"kube-auth-proxy-74c79b5d98-tlqs4\" (UID: \"79ab33d4-b8d4-4741-9be4-b52e8a781c2f\") " pod="openshift-ingress/kube-auth-proxy-74c79b5d98-tlqs4" Apr 20 23:18:33.708712 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:33.708692 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmqvc\" (UniqueName: \"kubernetes.io/projected/79ab33d4-b8d4-4741-9be4-b52e8a781c2f-kube-api-access-fmqvc\") pod \"kube-auth-proxy-74c79b5d98-tlqs4\" (UID: \"79ab33d4-b8d4-4741-9be4-b52e8a781c2f\") " pod="openshift-ingress/kube-auth-proxy-74c79b5d98-tlqs4" Apr 20 23:18:33.711171 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:33.711127 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/79ab33d4-b8d4-4741-9be4-b52e8a781c2f-tmp\") pod \"kube-auth-proxy-74c79b5d98-tlqs4\" (UID: \"79ab33d4-b8d4-4741-9be4-b52e8a781c2f\") " pod="openshift-ingress/kube-auth-proxy-74c79b5d98-tlqs4" Apr 20 23:18:33.711517 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:33.711494 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/79ab33d4-b8d4-4741-9be4-b52e8a781c2f-tls-certs\") pod \"kube-auth-proxy-74c79b5d98-tlqs4\" (UID: \"79ab33d4-b8d4-4741-9be4-b52e8a781c2f\") " pod="openshift-ingress/kube-auth-proxy-74c79b5d98-tlqs4" Apr 20 23:18:33.718011 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:33.717989 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmqvc\" (UniqueName: \"kubernetes.io/projected/79ab33d4-b8d4-4741-9be4-b52e8a781c2f-kube-api-access-fmqvc\") pod \"kube-auth-proxy-74c79b5d98-tlqs4\" (UID: \"79ab33d4-b8d4-4741-9be4-b52e8a781c2f\") " pod="openshift-ingress/kube-auth-proxy-74c79b5d98-tlqs4" Apr 20 23:18:33.886521 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:33.886482 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-74c79b5d98-tlqs4" Apr 20 23:18:34.011918 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:34.011864 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-74c79b5d98-tlqs4"] Apr 20 23:18:34.015835 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:18:34.015806 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79ab33d4_b8d4_4741_9be4_b52e8a781c2f.slice/crio-67071628fa96a58bb402a8cb9f8120a5d3fa43fb5fb5f6e380d9e8b02d7daf13 WatchSource:0}: Error finding container 67071628fa96a58bb402a8cb9f8120a5d3fa43fb5fb5f6e380d9e8b02d7daf13: Status 404 returned error can't find the container with id 67071628fa96a58bb402a8cb9f8120a5d3fa43fb5fb5f6e380d9e8b02d7daf13 Apr 20 23:18:34.955039 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:34.954998 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-74c79b5d98-tlqs4" 
event={"ID":"79ab33d4-b8d4-4741-9be4-b52e8a781c2f","Type":"ContainerStarted","Data":"67071628fa96a58bb402a8cb9f8120a5d3fa43fb5fb5f6e380d9e8b02d7daf13"} Apr 20 23:18:36.964343 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:36.964307 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-d85d4" event={"ID":"7540190a-b508-4374-afff-643e7631a11f","Type":"ContainerStarted","Data":"c8be20e02580712bb3fabbd5eb2e9bdf3da3a5feaeb23f49cc6f33adf2bae6c2"} Apr 20 23:18:36.964343 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:36.964360 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-d85d4" Apr 20 23:18:36.981954 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:36.981894 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-d85d4" podStartSLOduration=2.439801609 podStartE2EDuration="5.981874346s" podCreationTimestamp="2026-04-20 23:18:31 +0000 UTC" firstStartedPulling="2026-04-20 23:18:32.698222026 +0000 UTC m=+392.642931455" lastFinishedPulling="2026-04-20 23:18:36.240294751 +0000 UTC m=+396.185004192" observedRunningTime="2026-04-20 23:18:36.97928558 +0000 UTC m=+396.923995032" watchObservedRunningTime="2026-04-20 23:18:36.981874346 +0000 UTC m=+396.926583799" Apr 20 23:18:37.285488 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:37.285400 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-vx2sj" Apr 20 23:18:37.285879 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:37.285862 2576 scope.go:117] "RemoveContainer" containerID="ca2d92af39da57438b042c90f41ccb97472853bfc84150f85303d26e4bb65df0" Apr 20 23:18:37.286110 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:18:37.286089 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-vx2sj_opendatahub(12eabfbe-7953-48de-bfb9-65826a0c3846)\"" pod="opendatahub/odh-model-controller-858dbf95b8-vx2sj" podUID="12eabfbe-7953-48de-bfb9-65826a0c3846" Apr 20 23:18:37.968986 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:37.968950 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-74c79b5d98-tlqs4" event={"ID":"79ab33d4-b8d4-4741-9be4-b52e8a781c2f","Type":"ContainerStarted","Data":"4bfb5063cd6e2744112050bf2bab6d3a26fbb778ecf9bce2efd3381320267d99"} Apr 20 23:18:37.985217 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:37.985171 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-74c79b5d98-tlqs4" podStartSLOduration=1.558239231 podStartE2EDuration="4.985155418s" podCreationTimestamp="2026-04-20 23:18:33 +0000 UTC" firstStartedPulling="2026-04-20 23:18:34.017899257 +0000 UTC m=+393.962608692" lastFinishedPulling="2026-04-20 23:18:37.444815436 +0000 UTC m=+397.389524879" observedRunningTime="2026-04-20 23:18:37.983664055 +0000 UTC m=+397.928373506" watchObservedRunningTime="2026-04-20 23:18:37.985155418 +0000 UTC m=+397.929864867" Apr 20 23:18:47.285755 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:47.285717 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-vx2sj" Apr 20 23:18:47.286176 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:47.286132 2576 scope.go:117] "RemoveContainer" containerID="ca2d92af39da57438b042c90f41ccb97472853bfc84150f85303d26e4bb65df0" Apr 20 23:18:48.000804 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:48.000770 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-vx2sj" 
event={"ID":"12eabfbe-7953-48de-bfb9-65826a0c3846","Type":"ContainerStarted","Data":"a0974a2ccd827322ee641786d30d4658037348317c6c670b31eed25850c9a507"} Apr 20 23:18:48.000991 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:48.000963 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-vx2sj" Apr 20 23:18:48.952232 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:48.952174 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-vx2sj" podStartSLOduration=2.816527938 podStartE2EDuration="22.952130201s" podCreationTimestamp="2026-04-20 23:18:26 +0000 UTC" firstStartedPulling="2026-04-20 23:18:27.427663169 +0000 UTC m=+387.372372602" lastFinishedPulling="2026-04-20 23:18:47.563265423 +0000 UTC m=+407.507974865" observedRunningTime="2026-04-20 23:18:48.024117979 +0000 UTC m=+407.968827433" watchObservedRunningTime="2026-04-20 23:18:48.952130201 +0000 UTC m=+408.896839653" Apr 20 23:18:48.953319 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:48.953299 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-skqcv"] Apr 20 23:18:48.956695 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:48.956680 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-skqcv" Apr 20 23:18:48.959386 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:48.959364 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 20 23:18:48.959692 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:48.959670 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 20 23:18:48.959764 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:48.959722 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-wn2wm\"" Apr 20 23:18:48.968569 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:48.968548 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-skqcv"] Apr 20 23:18:49.041925 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:49.041899 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a721baf1-655b-4551-bfe9-9d96f8d96558-operator-config\") pod \"servicemesh-operator3-55f49c5f94-skqcv\" (UID: \"a721baf1-655b-4551-bfe9-9d96f8d96558\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-skqcv" Apr 20 23:18:49.042072 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:49.041950 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfjx2\" (UniqueName: \"kubernetes.io/projected/a721baf1-655b-4551-bfe9-9d96f8d96558-kube-api-access-gfjx2\") pod \"servicemesh-operator3-55f49c5f94-skqcv\" (UID: \"a721baf1-655b-4551-bfe9-9d96f8d96558\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-skqcv" Apr 20 23:18:49.142991 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:49.142958 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gfjx2\" (UniqueName: \"kubernetes.io/projected/a721baf1-655b-4551-bfe9-9d96f8d96558-kube-api-access-gfjx2\") pod \"servicemesh-operator3-55f49c5f94-skqcv\" (UID: \"a721baf1-655b-4551-bfe9-9d96f8d96558\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-skqcv" Apr 20 23:18:49.143192 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:49.143099 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a721baf1-655b-4551-bfe9-9d96f8d96558-operator-config\") pod \"servicemesh-operator3-55f49c5f94-skqcv\" (UID: \"a721baf1-655b-4551-bfe9-9d96f8d96558\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-skqcv" Apr 20 23:18:49.145996 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:49.145964 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a721baf1-655b-4551-bfe9-9d96f8d96558-operator-config\") pod \"servicemesh-operator3-55f49c5f94-skqcv\" (UID: \"a721baf1-655b-4551-bfe9-9d96f8d96558\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-skqcv" Apr 20 23:18:49.151334 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:49.151307 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfjx2\" (UniqueName: \"kubernetes.io/projected/a721baf1-655b-4551-bfe9-9d96f8d96558-kube-api-access-gfjx2\") pod \"servicemesh-operator3-55f49c5f94-skqcv\" (UID: \"a721baf1-655b-4551-bfe9-9d96f8d96558\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-skqcv" Apr 20 23:18:49.265540 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:49.265450 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-skqcv" Apr 20 23:18:49.409151 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:49.409105 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-skqcv"] Apr 20 23:18:49.412185 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:18:49.412151 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda721baf1_655b_4551_bfe9_9d96f8d96558.slice/crio-4a1f2261c6bd310592f734a4a461ffe4aedbb59a319e825e7ba233975c6c9105 WatchSource:0}: Error finding container 4a1f2261c6bd310592f734a4a461ffe4aedbb59a319e825e7ba233975c6c9105: Status 404 returned error can't find the container with id 4a1f2261c6bd310592f734a4a461ffe4aedbb59a319e825e7ba233975c6c9105 Apr 20 23:18:50.008493 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:50.008462 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-skqcv" event={"ID":"a721baf1-655b-4551-bfe9-9d96f8d96558","Type":"ContainerStarted","Data":"4a1f2261c6bd310592f734a4a461ffe4aedbb59a319e825e7ba233975c6c9105"} Apr 20 23:18:53.021035 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:53.020998 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-skqcv" event={"ID":"a721baf1-655b-4551-bfe9-9d96f8d96558","Type":"ContainerStarted","Data":"67b72555225e18dff9171d79225eb4fd84dfd7e6ecf01c3c1025858ae4a9c2da"} Apr 20 23:18:53.021484 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:53.021057 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-skqcv" Apr 20 23:18:53.043113 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:53.043060 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/servicemesh-operator3-55f49c5f94-skqcv" podStartSLOduration=1.94607888 podStartE2EDuration="5.043042218s" podCreationTimestamp="2026-04-20 23:18:48 +0000 UTC" firstStartedPulling="2026-04-20 23:18:49.414610647 +0000 UTC m=+409.359320080" lastFinishedPulling="2026-04-20 23:18:52.511573987 +0000 UTC m=+412.456283418" observedRunningTime="2026-04-20 23:18:53.042520329 +0000 UTC m=+412.987229778" watchObservedRunningTime="2026-04-20 23:18:53.043042218 +0000 UTC m=+412.987751670" Apr 20 23:18:59.006116 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:18:59.006083 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-vx2sj" Apr 20 23:19:04.027186 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:04.027155 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-skqcv" Apr 20 23:19:05.370874 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.370835 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq"] Apr 20 23:19:05.378599 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.378572 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq" Apr 20 23:19:05.382461 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.382429 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-nzvxk\"" Apr 20 23:19:05.382461 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.382460 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 20 23:19:05.382680 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.382531 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 20 23:19:05.382791 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.382775 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 20 23:19:05.383285 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.383271 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 20 23:19:05.393198 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.393176 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq"] Apr 20 23:19:05.482392 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.482349 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4d45\" (UniqueName: \"kubernetes.io/projected/4d7994aa-5e1d-4085-bed6-389da1901c3c-kube-api-access-s4d45\") pod \"istiod-openshift-gateway-55ff986f96-rhjnq\" (UID: \"4d7994aa-5e1d-4085-bed6-389da1901c3c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq" Apr 20 23:19:05.482392 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.482394 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" 
(UniqueName: \"kubernetes.io/secret/4d7994aa-5e1d-4085-bed6-389da1901c3c-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-rhjnq\" (UID: \"4d7994aa-5e1d-4085-bed6-389da1901c3c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq" Apr 20 23:19:05.482620 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.482431 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/4d7994aa-5e1d-4085-bed6-389da1901c3c-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-rhjnq\" (UID: \"4d7994aa-5e1d-4085-bed6-389da1901c3c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq" Apr 20 23:19:05.482620 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.482449 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4d7994aa-5e1d-4085-bed6-389da1901c3c-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-rhjnq\" (UID: \"4d7994aa-5e1d-4085-bed6-389da1901c3c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq" Apr 20 23:19:05.482620 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.482470 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/4d7994aa-5e1d-4085-bed6-389da1901c3c-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-rhjnq\" (UID: \"4d7994aa-5e1d-4085-bed6-389da1901c3c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq" Apr 20 23:19:05.482620 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.482514 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/4d7994aa-5e1d-4085-bed6-389da1901c3c-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-rhjnq\" (UID: 
\"4d7994aa-5e1d-4085-bed6-389da1901c3c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq" Apr 20 23:19:05.482620 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.482598 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/4d7994aa-5e1d-4085-bed6-389da1901c3c-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-rhjnq\" (UID: \"4d7994aa-5e1d-4085-bed6-389da1901c3c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq" Apr 20 23:19:05.583652 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.583606 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/4d7994aa-5e1d-4085-bed6-389da1901c3c-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-rhjnq\" (UID: \"4d7994aa-5e1d-4085-bed6-389da1901c3c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq" Apr 20 23:19:05.583652 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.583647 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s4d45\" (UniqueName: \"kubernetes.io/projected/4d7994aa-5e1d-4085-bed6-389da1901c3c-kube-api-access-s4d45\") pod \"istiod-openshift-gateway-55ff986f96-rhjnq\" (UID: \"4d7994aa-5e1d-4085-bed6-389da1901c3c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq" Apr 20 23:19:05.583910 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.583669 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/4d7994aa-5e1d-4085-bed6-389da1901c3c-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-rhjnq\" (UID: \"4d7994aa-5e1d-4085-bed6-389da1901c3c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq" Apr 20 23:19:05.583910 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.583709 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/4d7994aa-5e1d-4085-bed6-389da1901c3c-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-rhjnq\" (UID: \"4d7994aa-5e1d-4085-bed6-389da1901c3c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq" Apr 20 23:19:05.583910 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.583725 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4d7994aa-5e1d-4085-bed6-389da1901c3c-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-rhjnq\" (UID: \"4d7994aa-5e1d-4085-bed6-389da1901c3c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq" Apr 20 23:19:05.583910 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.583749 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/4d7994aa-5e1d-4085-bed6-389da1901c3c-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-rhjnq\" (UID: \"4d7994aa-5e1d-4085-bed6-389da1901c3c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq" Apr 20 23:19:05.583910 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.583801 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/4d7994aa-5e1d-4085-bed6-389da1901c3c-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-rhjnq\" (UID: \"4d7994aa-5e1d-4085-bed6-389da1901c3c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq" Apr 20 23:19:05.584472 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.584448 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/4d7994aa-5e1d-4085-bed6-389da1901c3c-istio-csr-ca-configmap\") pod 
\"istiod-openshift-gateway-55ff986f96-rhjnq\" (UID: \"4d7994aa-5e1d-4085-bed6-389da1901c3c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq" Apr 20 23:19:05.586610 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.586585 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/4d7994aa-5e1d-4085-bed6-389da1901c3c-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-rhjnq\" (UID: \"4d7994aa-5e1d-4085-bed6-389da1901c3c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq" Apr 20 23:19:05.586712 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.586667 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/4d7994aa-5e1d-4085-bed6-389da1901c3c-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-rhjnq\" (UID: \"4d7994aa-5e1d-4085-bed6-389da1901c3c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq" Apr 20 23:19:05.586798 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.586778 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/4d7994aa-5e1d-4085-bed6-389da1901c3c-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-rhjnq\" (UID: \"4d7994aa-5e1d-4085-bed6-389da1901c3c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq" Apr 20 23:19:05.586902 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.586886 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4d7994aa-5e1d-4085-bed6-389da1901c3c-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-rhjnq\" (UID: \"4d7994aa-5e1d-4085-bed6-389da1901c3c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq" Apr 20 23:19:05.597121 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.597092 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/4d7994aa-5e1d-4085-bed6-389da1901c3c-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-rhjnq\" (UID: \"4d7994aa-5e1d-4085-bed6-389da1901c3c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq" Apr 20 23:19:05.603383 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.603359 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4d45\" (UniqueName: \"kubernetes.io/projected/4d7994aa-5e1d-4085-bed6-389da1901c3c-kube-api-access-s4d45\") pod \"istiod-openshift-gateway-55ff986f96-rhjnq\" (UID: \"4d7994aa-5e1d-4085-bed6-389da1901c3c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq" Apr 20 23:19:05.688587 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.688484 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq" Apr 20 23:19:05.823343 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:05.823301 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq"] Apr 20 23:19:05.825816 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:19:05.825786 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d7994aa_5e1d_4085_bed6_389da1901c3c.slice/crio-4c5ef69bf57f270d22050a3b2070a6c22aa05dfabeb493b8d6a8711fede63d04 WatchSource:0}: Error finding container 4c5ef69bf57f270d22050a3b2070a6c22aa05dfabeb493b8d6a8711fede63d04: Status 404 returned error can't find the container with id 4c5ef69bf57f270d22050a3b2070a6c22aa05dfabeb493b8d6a8711fede63d04 Apr 20 23:19:06.066572 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:06.066483 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq" 
event={"ID":"4d7994aa-5e1d-4085-bed6-389da1901c3c","Type":"ContainerStarted","Data":"4c5ef69bf57f270d22050a3b2070a6c22aa05dfabeb493b8d6a8711fede63d04"} Apr 20 23:19:07.975436 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:07.975401 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-d85d4" Apr 20 23:19:08.741258 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:08.741211 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 20 23:19:08.741376 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:08.741326 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 20 23:19:09.080061 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:09.078969 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq" event={"ID":"4d7994aa-5e1d-4085-bed6-389da1901c3c","Type":"ContainerStarted","Data":"e82ac228a1809a8260d8d7b339f949751f84e080ce8759827d5665d4fb826403"} Apr 20 23:19:09.080061 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:09.079957 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq" Apr 20 23:19:09.081763 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:09.081738 2576 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-rhjnq container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 20 23:19:09.081883 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:09.081782 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq" 
podUID="4d7994aa-5e1d-4085-bed6-389da1901c3c" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 23:19:09.103805 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:09.103762 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq" podStartSLOduration=1.190445382 podStartE2EDuration="4.103751097s" podCreationTimestamp="2026-04-20 23:19:05 +0000 UTC" firstStartedPulling="2026-04-20 23:19:05.82761755 +0000 UTC m=+425.772326978" lastFinishedPulling="2026-04-20 23:19:08.740923251 +0000 UTC m=+428.685632693" observedRunningTime="2026-04-20 23:19:09.102389518 +0000 UTC m=+429.047098970" watchObservedRunningTime="2026-04-20 23:19:09.103751097 +0000 UTC m=+429.048460548" Apr 20 23:19:10.084658 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:19:10.084625 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rhjnq" Apr 20 23:20:02.926741 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:02.926704 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-b2dp8"] Apr 20 23:20:02.930158 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:02.930126 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-b2dp8" Apr 20 23:20:02.932686 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:02.932661 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 23:20:02.932775 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:02.932713 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-h7dds\"" Apr 20 23:20:02.933635 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:02.933621 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 23:20:02.946889 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:02.946868 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-b2dp8"] Apr 20 23:20:03.084968 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:03.084930 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcdln\" (UniqueName: \"kubernetes.io/projected/ae6ac25b-f6ca-4112-a0ad-89e1521d85b9-kube-api-access-jcdln\") pod \"authorino-operator-657f44b778-b2dp8\" (UID: \"ae6ac25b-f6ca-4112-a0ad-89e1521d85b9\") " pod="kuadrant-system/authorino-operator-657f44b778-b2dp8" Apr 20 23:20:03.186353 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:03.186272 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jcdln\" (UniqueName: \"kubernetes.io/projected/ae6ac25b-f6ca-4112-a0ad-89e1521d85b9-kube-api-access-jcdln\") pod \"authorino-operator-657f44b778-b2dp8\" (UID: \"ae6ac25b-f6ca-4112-a0ad-89e1521d85b9\") " pod="kuadrant-system/authorino-operator-657f44b778-b2dp8" Apr 20 23:20:03.197608 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:03.197577 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jcdln\" (UniqueName: \"kubernetes.io/projected/ae6ac25b-f6ca-4112-a0ad-89e1521d85b9-kube-api-access-jcdln\") pod \"authorino-operator-657f44b778-b2dp8\" (UID: \"ae6ac25b-f6ca-4112-a0ad-89e1521d85b9\") " pod="kuadrant-system/authorino-operator-657f44b778-b2dp8" Apr 20 23:20:03.241077 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:03.241054 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-b2dp8" Apr 20 23:20:03.574189 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:03.574119 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-b2dp8"] Apr 20 23:20:03.577171 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:20:03.577125 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae6ac25b_f6ca_4112_a0ad_89e1521d85b9.slice/crio-74fdd0c8856b9eb35a3a2509e1dbb60b5cddb3a86eb81e5640a5acef68da9927 WatchSource:0}: Error finding container 74fdd0c8856b9eb35a3a2509e1dbb60b5cddb3a86eb81e5640a5acef68da9927: Status 404 returned error can't find the container with id 74fdd0c8856b9eb35a3a2509e1dbb60b5cddb3a86eb81e5640a5acef68da9927 Apr 20 23:20:04.259420 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:04.259383 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-b2dp8" event={"ID":"ae6ac25b-f6ca-4112-a0ad-89e1521d85b9","Type":"ContainerStarted","Data":"74fdd0c8856b9eb35a3a2509e1dbb60b5cddb3a86eb81e5640a5acef68da9927"} Apr 20 23:20:06.266677 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:06.266643 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-b2dp8" event={"ID":"ae6ac25b-f6ca-4112-a0ad-89e1521d85b9","Type":"ContainerStarted","Data":"496645b7d0bce9df52a72215c35c00bccaf73e605f920133991fe5c9844ac00b"} Apr 20 23:20:06.267099 ip-10-0-137-139 
kubenswrapper[2576]: I0420 23:20:06.266728 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-b2dp8" Apr 20 23:20:06.283131 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:06.283088 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-b2dp8" podStartSLOduration=2.391238865 podStartE2EDuration="4.283074861s" podCreationTimestamp="2026-04-20 23:20:02 +0000 UTC" firstStartedPulling="2026-04-20 23:20:03.579614575 +0000 UTC m=+483.524324007" lastFinishedPulling="2026-04-20 23:20:05.47145056 +0000 UTC m=+485.416160003" observedRunningTime="2026-04-20 23:20:06.281047689 +0000 UTC m=+486.225757139" watchObservedRunningTime="2026-04-20 23:20:06.283074861 +0000 UTC m=+486.227784311" Apr 20 23:20:17.273315 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:17.273280 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-b2dp8" Apr 20 23:20:19.285067 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:19.285031 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cnjr8"] Apr 20 23:20:19.290938 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:19.290920 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cnjr8" Apr 20 23:20:19.293621 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:19.293602 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-ws4dn\"" Apr 20 23:20:19.300249 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:19.300229 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cnjr8"] Apr 20 23:20:19.309248 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:19.309224 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgrt9\" (UniqueName: \"kubernetes.io/projected/6ea38dfe-6703-470e-9164-c5c7d99f6587-kube-api-access-vgrt9\") pod \"kuadrant-operator-controller-manager-55c7f4c975-cnjr8\" (UID: \"6ea38dfe-6703-470e-9164-c5c7d99f6587\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cnjr8" Apr 20 23:20:19.309332 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:19.309277 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6ea38dfe-6703-470e-9164-c5c7d99f6587-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-cnjr8\" (UID: \"6ea38dfe-6703-470e-9164-c5c7d99f6587\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cnjr8" Apr 20 23:20:19.337273 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:19.337243 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cnjr8"] Apr 20 23:20:19.337465 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:20:19.337449 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[extensions-socket-volume kube-api-access-vgrt9], unattached 
volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cnjr8" podUID="6ea38dfe-6703-470e-9164-c5c7d99f6587" Apr 20 23:20:19.341526 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:19.341498 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cnjr8"] Apr 20 23:20:19.410716 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:19.410684 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgrt9\" (UniqueName: \"kubernetes.io/projected/6ea38dfe-6703-470e-9164-c5c7d99f6587-kube-api-access-vgrt9\") pod \"kuadrant-operator-controller-manager-55c7f4c975-cnjr8\" (UID: \"6ea38dfe-6703-470e-9164-c5c7d99f6587\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cnjr8" Apr 20 23:20:19.410942 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:19.410739 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6ea38dfe-6703-470e-9164-c5c7d99f6587-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-cnjr8\" (UID: \"6ea38dfe-6703-470e-9164-c5c7d99f6587\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cnjr8" Apr 20 23:20:19.411199 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:19.411160 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6ea38dfe-6703-470e-9164-c5c7d99f6587-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-cnjr8\" (UID: \"6ea38dfe-6703-470e-9164-c5c7d99f6587\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cnjr8" Apr 20 23:20:19.413414 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:20:19.413394 2576 projected.go:194] Error preparing data for projected volume 
kube-api-access-vgrt9 for pod kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cnjr8: failed to fetch token: serviceaccounts "kuadrant-operator-controller-manager" is forbidden: User "system:node:ip-10-0-137-139.ec2.internal" cannot create resource "serviceaccounts/token" in API group "" in the namespace "kuadrant-system": no relationship found between node 'ip-10-0-137-139.ec2.internal' and this object Apr 20 23:20:19.413503 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:20:19.413467 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6ea38dfe-6703-470e-9164-c5c7d99f6587-kube-api-access-vgrt9 podName:6ea38dfe-6703-470e-9164-c5c7d99f6587 nodeName:}" failed. No retries permitted until 2026-04-20 23:20:19.913448453 +0000 UTC m=+499.858157882 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-vgrt9" (UniqueName: "kubernetes.io/projected/6ea38dfe-6703-470e-9164-c5c7d99f6587-kube-api-access-vgrt9") pod "kuadrant-operator-controller-manager-55c7f4c975-cnjr8" (UID: "6ea38dfe-6703-470e-9164-c5c7d99f6587") : failed to fetch token: serviceaccounts "kuadrant-operator-controller-manager" is forbidden: User "system:node:ip-10-0-137-139.ec2.internal" cannot create resource "serviceaccounts/token" in API group "" in the namespace "kuadrant-system": no relationship found between node 'ip-10-0-137-139.ec2.internal' and this object Apr 20 23:20:19.913902 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:19.913863 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgrt9\" (UniqueName: \"kubernetes.io/projected/6ea38dfe-6703-470e-9164-c5c7d99f6587-kube-api-access-vgrt9\") pod \"kuadrant-operator-controller-manager-55c7f4c975-cnjr8\" (UID: \"6ea38dfe-6703-470e-9164-c5c7d99f6587\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cnjr8" Apr 20 23:20:19.916696 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:20:19.916674 2576 
projected.go:194] Error preparing data for projected volume kube-api-access-vgrt9 for pod kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cnjr8: failed to fetch token: serviceaccounts "kuadrant-operator-controller-manager" is forbidden: User "system:node:ip-10-0-137-139.ec2.internal" cannot create resource "serviceaccounts/token" in API group "" in the namespace "kuadrant-system": no relationship found between node 'ip-10-0-137-139.ec2.internal' and this object Apr 20 23:20:19.916786 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:20:19.916753 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6ea38dfe-6703-470e-9164-c5c7d99f6587-kube-api-access-vgrt9 podName:6ea38dfe-6703-470e-9164-c5c7d99f6587 nodeName:}" failed. No retries permitted until 2026-04-20 23:20:20.916735914 +0000 UTC m=+500.861445343 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-vgrt9" (UniqueName: "kubernetes.io/projected/6ea38dfe-6703-470e-9164-c5c7d99f6587-kube-api-access-vgrt9") pod "kuadrant-operator-controller-manager-55c7f4c975-cnjr8" (UID: "6ea38dfe-6703-470e-9164-c5c7d99f6587") : failed to fetch token: serviceaccounts "kuadrant-operator-controller-manager" is forbidden: User "system:node:ip-10-0-137-139.ec2.internal" cannot create resource "serviceaccounts/token" in API group "" in the namespace "kuadrant-system": no relationship found between node 'ip-10-0-137-139.ec2.internal' and this object Apr 20 23:20:20.314389 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:20.314315 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cnjr8" Apr 20 23:20:20.316600 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:20.316570 2576 status_manager.go:895] "Failed to get status for pod" podUID="6ea38dfe-6703-470e-9164-c5c7d99f6587" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cnjr8" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-cnjr8\" is forbidden: User \"system:node:ip-10-0-137-139.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-139.ec2.internal' and this object" Apr 20 23:20:20.319424 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:20.319406 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cnjr8" Apr 20 23:20:20.321650 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:20.321618 2576 status_manager.go:895] "Failed to get status for pod" podUID="6ea38dfe-6703-470e-9164-c5c7d99f6587" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cnjr8" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-cnjr8\" is forbidden: User \"system:node:ip-10-0-137-139.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-139.ec2.internal' and this object" Apr 20 23:20:20.417919 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:20.417886 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6ea38dfe-6703-470e-9164-c5c7d99f6587-extensions-socket-volume\") pod \"6ea38dfe-6703-470e-9164-c5c7d99f6587\" (UID: \"6ea38dfe-6703-470e-9164-c5c7d99f6587\") " Apr 20 23:20:20.418073 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:20.418030 2576 reconciler_common.go:299] "Volume detached for volume 
\"kube-api-access-vgrt9\" (UniqueName: \"kubernetes.io/projected/6ea38dfe-6703-470e-9164-c5c7d99f6587-kube-api-access-vgrt9\") on node \"ip-10-0-137-139.ec2.internal\" DevicePath \"\"" Apr 20 23:20:20.418354 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:20.418322 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ea38dfe-6703-470e-9164-c5c7d99f6587-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "6ea38dfe-6703-470e-9164-c5c7d99f6587" (UID: "6ea38dfe-6703-470e-9164-c5c7d99f6587"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 23:20:20.519106 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:20.519069 2576 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6ea38dfe-6703-470e-9164-c5c7d99f6587-extensions-socket-volume\") on node \"ip-10-0-137-139.ec2.internal\" DevicePath \"\"" Apr 20 23:20:20.639895 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:20.639861 2576 status_manager.go:895] "Failed to get status for pod" podUID="6ea38dfe-6703-470e-9164-c5c7d99f6587" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cnjr8" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-cnjr8\" is forbidden: User \"system:node:ip-10-0-137-139.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-139.ec2.internal' and this object" Apr 20 23:20:20.640794 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:20.640773 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea38dfe-6703-470e-9164-c5c7d99f6587" path="/var/lib/kubelet/pods/6ea38dfe-6703-470e-9164-c5c7d99f6587/volumes" Apr 20 23:20:21.317174 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:21.317128 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cnjr8" Apr 20 23:20:21.321758 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:20:21.321731 2576 status_manager.go:895] "Failed to get status for pod" podUID="6ea38dfe-6703-470e-9164-c5c7d99f6587" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-cnjr8" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-cnjr8\" is forbidden: User \"system:node:ip-10-0-137-139.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-139.ec2.internal' and this object" Apr 20 23:22:00.618678 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:22:00.618638 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6675l_65a2a89a-c0bf-4140-bd21-e8249221ca05/ovn-acl-logging/0.log" Apr 20 23:22:00.620472 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:22:00.620449 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6675l_65a2a89a-c0bf-4140-bd21-e8249221ca05/ovn-acl-logging/0.log" Apr 20 23:27:00.641916 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:00.641843 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6675l_65a2a89a-c0bf-4140-bd21-e8249221ca05/ovn-acl-logging/0.log" Apr 20 23:27:00.644409 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:00.644381 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6675l_65a2a89a-c0bf-4140-bd21-e8249221ca05/ovn-acl-logging/0.log" Apr 20 23:27:32.558591 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:32.558557 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-69865ddf88-85vq7"] Apr 20 23:27:32.561938 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:32.561917 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-69865ddf88-85vq7" Apr 20 23:27:32.564235 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:32.564210 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 20 23:27:32.564336 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:32.564214 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 20 23:27:32.572819 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:32.572798 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-69865ddf88-85vq7"] Apr 20 23:27:32.691871 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:32.691831 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j6hm\" (UniqueName: \"kubernetes.io/projected/9105584c-7c88-41a4-a76b-782957561dc1-kube-api-access-9j6hm\") pod \"maas-api-69865ddf88-85vq7\" (UID: \"9105584c-7c88-41a4-a76b-782957561dc1\") " pod="opendatahub/maas-api-69865ddf88-85vq7" Apr 20 23:27:32.692035 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:32.691890 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/9105584c-7c88-41a4-a76b-782957561dc1-maas-api-tls\") pod \"maas-api-69865ddf88-85vq7\" (UID: \"9105584c-7c88-41a4-a76b-782957561dc1\") " pod="opendatahub/maas-api-69865ddf88-85vq7" Apr 20 23:27:32.793240 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:32.793204 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9j6hm\" (UniqueName: \"kubernetes.io/projected/9105584c-7c88-41a4-a76b-782957561dc1-kube-api-access-9j6hm\") pod \"maas-api-69865ddf88-85vq7\" (UID: \"9105584c-7c88-41a4-a76b-782957561dc1\") " pod="opendatahub/maas-api-69865ddf88-85vq7" Apr 20 23:27:32.793391 ip-10-0-137-139 kubenswrapper[2576]: I0420 
23:27:32.793260 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/9105584c-7c88-41a4-a76b-782957561dc1-maas-api-tls\") pod \"maas-api-69865ddf88-85vq7\" (UID: \"9105584c-7c88-41a4-a76b-782957561dc1\") " pod="opendatahub/maas-api-69865ddf88-85vq7" Apr 20 23:27:32.795634 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:32.795606 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/9105584c-7c88-41a4-a76b-782957561dc1-maas-api-tls\") pod \"maas-api-69865ddf88-85vq7\" (UID: \"9105584c-7c88-41a4-a76b-782957561dc1\") " pod="opendatahub/maas-api-69865ddf88-85vq7" Apr 20 23:27:32.801738 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:32.801719 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j6hm\" (UniqueName: \"kubernetes.io/projected/9105584c-7c88-41a4-a76b-782957561dc1-kube-api-access-9j6hm\") pod \"maas-api-69865ddf88-85vq7\" (UID: \"9105584c-7c88-41a4-a76b-782957561dc1\") " pod="opendatahub/maas-api-69865ddf88-85vq7" Apr 20 23:27:32.873702 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:32.873670 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-69865ddf88-85vq7" Apr 20 23:27:33.000456 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:33.000431 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-69865ddf88-85vq7"] Apr 20 23:27:33.003178 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:27:33.003148 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9105584c_7c88_41a4_a76b_782957561dc1.slice/crio-105650c01e658f91a0907f199eae4f8af0e8351722a50cb10beeea5e80df6716 WatchSource:0}: Error finding container 105650c01e658f91a0907f199eae4f8af0e8351722a50cb10beeea5e80df6716: Status 404 returned error can't find the container with id 105650c01e658f91a0907f199eae4f8af0e8351722a50cb10beeea5e80df6716 Apr 20 23:27:33.004636 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:33.004615 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 23:27:33.696053 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:33.695790 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxsdb"] Apr 20 23:27:33.699986 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:33.699689 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-jxsdb"
Apr 20 23:27:33.702355 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:33.702333 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 20 23:27:33.702505 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:33.702334 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-8v2gg\""
Apr 20 23:27:33.707677 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:33.707655 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxsdb"]
Apr 20 23:27:33.733296 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:33.733271 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxsdb"]
Apr 20 23:27:33.773184 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:33.773084 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-69865ddf88-85vq7" event={"ID":"9105584c-7c88-41a4-a76b-782957561dc1","Type":"ContainerStarted","Data":"105650c01e658f91a0907f199eae4f8af0e8351722a50cb10beeea5e80df6716"}
Apr 20 23:27:33.803273 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:33.803236 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/40639205-16c3-47ce-82b9-58cf7c3c6924-config-file\") pod \"limitador-limitador-78c99df468-jxsdb\" (UID: \"40639205-16c3-47ce-82b9-58cf7c3c6924\") " pod="kuadrant-system/limitador-limitador-78c99df468-jxsdb"
Apr 20 23:27:33.803434 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:33.803358 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq97f\" (UniqueName: \"kubernetes.io/projected/40639205-16c3-47ce-82b9-58cf7c3c6924-kube-api-access-lq97f\") pod \"limitador-limitador-78c99df468-jxsdb\" (UID: \"40639205-16c3-47ce-82b9-58cf7c3c6924\") " pod="kuadrant-system/limitador-limitador-78c99df468-jxsdb"
Apr 20 23:27:33.904628 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:33.904586 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/40639205-16c3-47ce-82b9-58cf7c3c6924-config-file\") pod \"limitador-limitador-78c99df468-jxsdb\" (UID: \"40639205-16c3-47ce-82b9-58cf7c3c6924\") " pod="kuadrant-system/limitador-limitador-78c99df468-jxsdb"
Apr 20 23:27:33.904835 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:33.904679 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lq97f\" (UniqueName: \"kubernetes.io/projected/40639205-16c3-47ce-82b9-58cf7c3c6924-kube-api-access-lq97f\") pod \"limitador-limitador-78c99df468-jxsdb\" (UID: \"40639205-16c3-47ce-82b9-58cf7c3c6924\") " pod="kuadrant-system/limitador-limitador-78c99df468-jxsdb"
Apr 20 23:27:33.905353 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:33.905324 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/40639205-16c3-47ce-82b9-58cf7c3c6924-config-file\") pod \"limitador-limitador-78c99df468-jxsdb\" (UID: \"40639205-16c3-47ce-82b9-58cf7c3c6924\") " pod="kuadrant-system/limitador-limitador-78c99df468-jxsdb"
Apr 20 23:27:33.912886 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:33.912863 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq97f\" (UniqueName: \"kubernetes.io/projected/40639205-16c3-47ce-82b9-58cf7c3c6924-kube-api-access-lq97f\") pod \"limitador-limitador-78c99df468-jxsdb\" (UID: \"40639205-16c3-47ce-82b9-58cf7c3c6924\") " pod="kuadrant-system/limitador-limitador-78c99df468-jxsdb"
Apr 20 23:27:34.014823 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:34.014738 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-jxsdb"
Apr 20 23:27:34.156957 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:34.156816 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxsdb"]
Apr 20 23:27:34.159614 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:27:34.159579 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40639205_16c3_47ce_82b9_58cf7c3c6924.slice/crio-1c2fba6201fccabd1f5c96518a3bb3a5a3cb7a6b0f0c55b0a197ad8bede06916 WatchSource:0}: Error finding container 1c2fba6201fccabd1f5c96518a3bb3a5a3cb7a6b0f0c55b0a197ad8bede06916: Status 404 returned error can't find the container with id 1c2fba6201fccabd1f5c96518a3bb3a5a3cb7a6b0f0c55b0a197ad8bede06916
Apr 20 23:27:34.778632 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:34.778596 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-jxsdb" event={"ID":"40639205-16c3-47ce-82b9-58cf7c3c6924","Type":"ContainerStarted","Data":"1c2fba6201fccabd1f5c96518a3bb3a5a3cb7a6b0f0c55b0a197ad8bede06916"}
Apr 20 23:27:35.784393 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:35.784352 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-69865ddf88-85vq7" event={"ID":"9105584c-7c88-41a4-a76b-782957561dc1","Type":"ContainerStarted","Data":"152681b1b4909fd178b810c058b8495e9254779bd2a49a52e95789d7da2f87ea"}
Apr 20 23:27:35.784843 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:35.784424 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-69865ddf88-85vq7"
Apr 20 23:27:35.801591 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:35.801540 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-69865ddf88-85vq7" podStartSLOduration=1.573071283 podStartE2EDuration="3.801527153s" podCreationTimestamp="2026-04-20 23:27:32 +0000 UTC" firstStartedPulling="2026-04-20 23:27:33.004759549 +0000 UTC m=+932.949468980" lastFinishedPulling="2026-04-20 23:27:35.233215418 +0000 UTC m=+935.177924850" observedRunningTime="2026-04-20 23:27:35.800282223 +0000 UTC m=+935.744991678" watchObservedRunningTime="2026-04-20 23:27:35.801527153 +0000 UTC m=+935.746236604"
Apr 20 23:27:37.792909 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:37.792873 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-jxsdb" event={"ID":"40639205-16c3-47ce-82b9-58cf7c3c6924","Type":"ContainerStarted","Data":"c7e890022276c214a498a19ed0783c5ebf648b3ac6260ae4bc992a5e0737fdd1"}
Apr 20 23:27:37.793331 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:37.792983 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-jxsdb"
Apr 20 23:27:37.811288 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:37.811244 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-jxsdb" podStartSLOduration=1.6687754479999999 podStartE2EDuration="4.81123015s" podCreationTimestamp="2026-04-20 23:27:33 +0000 UTC" firstStartedPulling="2026-04-20 23:27:34.161824332 +0000 UTC m=+934.106533775" lastFinishedPulling="2026-04-20 23:27:37.304279044 +0000 UTC m=+937.248988477" observedRunningTime="2026-04-20 23:27:37.80962113 +0000 UTC m=+937.754330582" watchObservedRunningTime="2026-04-20 23:27:37.81123015 +0000 UTC m=+937.755939602"
Apr 20 23:27:41.793275 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:41.793246 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-69865ddf88-85vq7"
Apr 20 23:27:48.796816 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:27:48.796787 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-jxsdb"
Apr 20 23:28:09.179363 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:28:09.179322 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-69865ddf88-85vq7"]
Apr 20 23:28:09.179932 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:28:09.179591 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-69865ddf88-85vq7" podUID="9105584c-7c88-41a4-a76b-782957561dc1" containerName="maas-api" containerID="cri-o://152681b1b4909fd178b810c058b8495e9254779bd2a49a52e95789d7da2f87ea" gracePeriod=30
Apr 20 23:28:09.212079 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:28:09.212043 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxsdb"]
Apr 20 23:28:09.436114 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:28:09.436051 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-69865ddf88-85vq7"
Apr 20 23:28:09.510183 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:28:09.510123 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/9105584c-7c88-41a4-a76b-782957561dc1-maas-api-tls\") pod \"9105584c-7c88-41a4-a76b-782957561dc1\" (UID: \"9105584c-7c88-41a4-a76b-782957561dc1\") "
Apr 20 23:28:09.510331 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:28:09.510198 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j6hm\" (UniqueName: \"kubernetes.io/projected/9105584c-7c88-41a4-a76b-782957561dc1-kube-api-access-9j6hm\") pod \"9105584c-7c88-41a4-a76b-782957561dc1\" (UID: \"9105584c-7c88-41a4-a76b-782957561dc1\") "
Apr 20 23:28:09.512379 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:28:09.512355 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9105584c-7c88-41a4-a76b-782957561dc1-kube-api-access-9j6hm" (OuterVolumeSpecName: "kube-api-access-9j6hm") pod "9105584c-7c88-41a4-a76b-782957561dc1" (UID: "9105584c-7c88-41a4-a76b-782957561dc1"). InnerVolumeSpecName "kube-api-access-9j6hm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 23:28:09.512499 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:28:09.512389 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9105584c-7c88-41a4-a76b-782957561dc1-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "9105584c-7c88-41a4-a76b-782957561dc1" (UID: "9105584c-7c88-41a4-a76b-782957561dc1"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 23:28:09.611354 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:28:09.611320 2576 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/9105584c-7c88-41a4-a76b-782957561dc1-maas-api-tls\") on node \"ip-10-0-137-139.ec2.internal\" DevicePath \"\""
Apr 20 23:28:09.611354 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:28:09.611350 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9j6hm\" (UniqueName: \"kubernetes.io/projected/9105584c-7c88-41a4-a76b-782957561dc1-kube-api-access-9j6hm\") on node \"ip-10-0-137-139.ec2.internal\" DevicePath \"\""
Apr 20 23:28:09.903076 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:28:09.903040 2576 generic.go:358] "Generic (PLEG): container finished" podID="9105584c-7c88-41a4-a76b-782957561dc1" containerID="152681b1b4909fd178b810c058b8495e9254779bd2a49a52e95789d7da2f87ea" exitCode=0
Apr 20 23:28:09.903262 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:28:09.903109 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-69865ddf88-85vq7"
Apr 20 23:28:09.903262 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:28:09.903124 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-69865ddf88-85vq7" event={"ID":"9105584c-7c88-41a4-a76b-782957561dc1","Type":"ContainerDied","Data":"152681b1b4909fd178b810c058b8495e9254779bd2a49a52e95789d7da2f87ea"}
Apr 20 23:28:09.903262 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:28:09.903182 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-69865ddf88-85vq7" event={"ID":"9105584c-7c88-41a4-a76b-782957561dc1","Type":"ContainerDied","Data":"105650c01e658f91a0907f199eae4f8af0e8351722a50cb10beeea5e80df6716"}
Apr 20 23:28:09.903262 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:28:09.903197 2576 scope.go:117] "RemoveContainer" containerID="152681b1b4909fd178b810c058b8495e9254779bd2a49a52e95789d7da2f87ea"
Apr 20 23:28:09.912287 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:28:09.912270 2576 scope.go:117] "RemoveContainer" containerID="152681b1b4909fd178b810c058b8495e9254779bd2a49a52e95789d7da2f87ea"
Apr 20 23:28:09.912552 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:28:09.912532 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"152681b1b4909fd178b810c058b8495e9254779bd2a49a52e95789d7da2f87ea\": container with ID starting with 152681b1b4909fd178b810c058b8495e9254779bd2a49a52e95789d7da2f87ea not found: ID does not exist" containerID="152681b1b4909fd178b810c058b8495e9254779bd2a49a52e95789d7da2f87ea"
Apr 20 23:28:09.912601 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:28:09.912560 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"152681b1b4909fd178b810c058b8495e9254779bd2a49a52e95789d7da2f87ea"} err="failed to get container status \"152681b1b4909fd178b810c058b8495e9254779bd2a49a52e95789d7da2f87ea\": rpc error: code = NotFound desc = could not find container \"152681b1b4909fd178b810c058b8495e9254779bd2a49a52e95789d7da2f87ea\": container with ID starting with 152681b1b4909fd178b810c058b8495e9254779bd2a49a52e95789d7da2f87ea not found: ID does not exist"
Apr 20 23:28:09.925264 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:28:09.925230 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-69865ddf88-85vq7"]
Apr 20 23:28:09.928399 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:28:09.928375 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-69865ddf88-85vq7"]
Apr 20 23:28:10.641010 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:28:10.640978 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9105584c-7c88-41a4-a76b-782957561dc1" path="/var/lib/kubelet/pods/9105584c-7c88-41a4-a76b-782957561dc1/volumes"
Apr 20 23:28:19.482929 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:28:19.482892 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxsdb"]
Apr 20 23:28:23.775953 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:28:23.775916 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxsdb"]
Apr 20 23:28:35.385259 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:28:35.385219 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxsdb"]
Apr 20 23:28:51.091462 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:28:51.091428 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxsdb"]
Apr 20 23:29:14.383951 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:29:14.383908 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxsdb"]
Apr 20 23:30:00.136473 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:00.136392 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29612130-nw4js"]
Apr 20 23:30:00.136868 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:00.136737 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9105584c-7c88-41a4-a76b-782957561dc1" containerName="maas-api"
Apr 20 23:30:00.136868 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:00.136748 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9105584c-7c88-41a4-a76b-782957561dc1" containerName="maas-api"
Apr 20 23:30:00.136868 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:00.136819 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9105584c-7c88-41a4-a76b-782957561dc1" containerName="maas-api"
Apr 20 23:30:00.139743 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:00.139725 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29612130-nw4js"
Apr 20 23:30:00.141967 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:00.141945 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-kgbkh\""
Apr 20 23:30:00.145070 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:00.145048 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612130-nw4js"]
Apr 20 23:30:00.236725 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:00.236685 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr9pp\" (UniqueName: \"kubernetes.io/projected/187014df-c1ff-4b46-b151-76489adc4fcd-kube-api-access-gr9pp\") pod \"maas-api-key-cleanup-29612130-nw4js\" (UID: \"187014df-c1ff-4b46-b151-76489adc4fcd\") " pod="opendatahub/maas-api-key-cleanup-29612130-nw4js"
Apr 20 23:30:00.337632 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:00.337590 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gr9pp\" (UniqueName: \"kubernetes.io/projected/187014df-c1ff-4b46-b151-76489adc4fcd-kube-api-access-gr9pp\") pod \"maas-api-key-cleanup-29612130-nw4js\" (UID: \"187014df-c1ff-4b46-b151-76489adc4fcd\") " pod="opendatahub/maas-api-key-cleanup-29612130-nw4js"
Apr 20 23:30:00.345541 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:00.345515 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr9pp\" (UniqueName: \"kubernetes.io/projected/187014df-c1ff-4b46-b151-76489adc4fcd-kube-api-access-gr9pp\") pod \"maas-api-key-cleanup-29612130-nw4js\" (UID: \"187014df-c1ff-4b46-b151-76489adc4fcd\") " pod="opendatahub/maas-api-key-cleanup-29612130-nw4js"
Apr 20 23:30:00.452660 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:00.452581 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29612130-nw4js"
Apr 20 23:30:00.571558 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:00.571525 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612130-nw4js"]
Apr 20 23:30:00.573768 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:30:00.573741 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod187014df_c1ff_4b46_b151_76489adc4fcd.slice/crio-ed126c952f3509217af68680f508b90668c43b3defc9f01401fc893ac880623d WatchSource:0}: Error finding container ed126c952f3509217af68680f508b90668c43b3defc9f01401fc893ac880623d: Status 404 returned error can't find the container with id ed126c952f3509217af68680f508b90668c43b3defc9f01401fc893ac880623d
Apr 20 23:30:01.284212 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:01.284182 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612130-nw4js" event={"ID":"187014df-c1ff-4b46-b151-76489adc4fcd","Type":"ContainerStarted","Data":"ed126c952f3509217af68680f508b90668c43b3defc9f01401fc893ac880623d"}
Apr 20 23:30:01.784940 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:01.784859 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxsdb"]
Apr 20 23:30:02.289849 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:02.289818 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612130-nw4js" event={"ID":"187014df-c1ff-4b46-b151-76489adc4fcd","Type":"ContainerStarted","Data":"86e1416c36104daae4b6e571c96945ecd7672c3dbeb3b3b762e9403d19c80483"}
Apr 20 23:30:02.308320 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:02.308276 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29612130-nw4js" podStartSLOduration=1.618452652 podStartE2EDuration="2.308262661s" podCreationTimestamp="2026-04-20 23:30:00 +0000 UTC" firstStartedPulling="2026-04-20 23:30:00.575404176 +0000 UTC m=+1080.520113605" lastFinishedPulling="2026-04-20 23:30:01.265214185 +0000 UTC m=+1081.209923614" observedRunningTime="2026-04-20 23:30:02.306797694 +0000 UTC m=+1082.251507174" watchObservedRunningTime="2026-04-20 23:30:02.308262661 +0000 UTC m=+1082.252972111"
Apr 20 23:30:11.583565 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:11.583522 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxsdb"]
Apr 20 23:30:20.781681 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:20.781645 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxsdb"]
Apr 20 23:30:22.355868 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:22.355834 2576 generic.go:358] "Generic (PLEG): container finished" podID="187014df-c1ff-4b46-b151-76489adc4fcd" containerID="86e1416c36104daae4b6e571c96945ecd7672c3dbeb3b3b762e9403d19c80483" exitCode=6
Apr 20 23:30:22.356298 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:22.355873 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612130-nw4js" event={"ID":"187014df-c1ff-4b46-b151-76489adc4fcd","Type":"ContainerDied","Data":"86e1416c36104daae4b6e571c96945ecd7672c3dbeb3b3b762e9403d19c80483"}
Apr 20 23:30:22.356298 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:22.356205 2576 scope.go:117] "RemoveContainer" containerID="86e1416c36104daae4b6e571c96945ecd7672c3dbeb3b3b762e9403d19c80483"
Apr 20 23:30:23.359866 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:23.359834 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612130-nw4js" event={"ID":"187014df-c1ff-4b46-b151-76489adc4fcd","Type":"ContainerStarted","Data":"ac32f2309d09ca31338ec2ce55fe4805e4c2933435f11168ac4eeb8eb316de21"}
Apr 20 23:30:30.795865 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:30.795829 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxsdb"]
Apr 20 23:30:40.279312 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:40.279277 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxsdb"]
Apr 20 23:30:43.431479 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:43.431446 2576 generic.go:358] "Generic (PLEG): container finished" podID="187014df-c1ff-4b46-b151-76489adc4fcd" containerID="ac32f2309d09ca31338ec2ce55fe4805e4c2933435f11168ac4eeb8eb316de21" exitCode=6
Apr 20 23:30:43.431952 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:43.431515 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612130-nw4js" event={"ID":"187014df-c1ff-4b46-b151-76489adc4fcd","Type":"ContainerDied","Data":"ac32f2309d09ca31338ec2ce55fe4805e4c2933435f11168ac4eeb8eb316de21"}
Apr 20 23:30:43.431952 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:43.431569 2576 scope.go:117] "RemoveContainer" containerID="86e1416c36104daae4b6e571c96945ecd7672c3dbeb3b3b762e9403d19c80483"
Apr 20 23:30:43.431952 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:43.431871 2576 scope.go:117] "RemoveContainer" containerID="ac32f2309d09ca31338ec2ce55fe4805e4c2933435f11168ac4eeb8eb316de21"
Apr 20 23:30:43.432100 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:30:43.432078 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29612130-nw4js_opendatahub(187014df-c1ff-4b46-b151-76489adc4fcd)\"" pod="opendatahub/maas-api-key-cleanup-29612130-nw4js" podUID="187014df-c1ff-4b46-b151-76489adc4fcd"
Apr 20 23:30:50.177181 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:50.177133 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxsdb"]
Apr 20 23:30:56.636698 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:56.636671 2576 scope.go:117] "RemoveContainer" containerID="ac32f2309d09ca31338ec2ce55fe4805e4c2933435f11168ac4eeb8eb316de21"
Apr 20 23:30:57.482335 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:57.482296 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612130-nw4js" event={"ID":"187014df-c1ff-4b46-b151-76489adc4fcd","Type":"ContainerStarted","Data":"c8d16aafebb7cb4ecacb7e7967e44d9036500af3951201f7f3c80fd3b5090a14"}
Apr 20 23:30:57.660359 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:57.660326 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612130-nw4js"]
Apr 20 23:30:58.486167 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:30:58.486108 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29612130-nw4js" podUID="187014df-c1ff-4b46-b151-76489adc4fcd" containerName="cleanup" containerID="cri-o://c8d16aafebb7cb4ecacb7e7967e44d9036500af3951201f7f3c80fd3b5090a14" gracePeriod=30
Apr 20 23:31:17.431436 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:31:17.431410 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29612130-nw4js"
Apr 20 23:31:17.486982 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:31:17.486951 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr9pp\" (UniqueName: \"kubernetes.io/projected/187014df-c1ff-4b46-b151-76489adc4fcd-kube-api-access-gr9pp\") pod \"187014df-c1ff-4b46-b151-76489adc4fcd\" (UID: \"187014df-c1ff-4b46-b151-76489adc4fcd\") "
Apr 20 23:31:17.488964 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:31:17.488932 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/187014df-c1ff-4b46-b151-76489adc4fcd-kube-api-access-gr9pp" (OuterVolumeSpecName: "kube-api-access-gr9pp") pod "187014df-c1ff-4b46-b151-76489adc4fcd" (UID: "187014df-c1ff-4b46-b151-76489adc4fcd"). InnerVolumeSpecName "kube-api-access-gr9pp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 23:31:17.551197 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:31:17.551162 2576 generic.go:358] "Generic (PLEG): container finished" podID="187014df-c1ff-4b46-b151-76489adc4fcd" containerID="c8d16aafebb7cb4ecacb7e7967e44d9036500af3951201f7f3c80fd3b5090a14" exitCode=6
Apr 20 23:31:17.551355 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:31:17.551271 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612130-nw4js" event={"ID":"187014df-c1ff-4b46-b151-76489adc4fcd","Type":"ContainerDied","Data":"c8d16aafebb7cb4ecacb7e7967e44d9036500af3951201f7f3c80fd3b5090a14"}
Apr 20 23:31:17.551355 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:31:17.551295 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29612130-nw4js"
Apr 20 23:31:17.551355 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:31:17.551308 2576 scope.go:117] "RemoveContainer" containerID="c8d16aafebb7cb4ecacb7e7967e44d9036500af3951201f7f3c80fd3b5090a14"
Apr 20 23:31:17.551465 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:31:17.551299 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29612130-nw4js" event={"ID":"187014df-c1ff-4b46-b151-76489adc4fcd","Type":"ContainerDied","Data":"ed126c952f3509217af68680f508b90668c43b3defc9f01401fc893ac880623d"}
Apr 20 23:31:17.559347 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:31:17.559296 2576 scope.go:117] "RemoveContainer" containerID="ac32f2309d09ca31338ec2ce55fe4805e4c2933435f11168ac4eeb8eb316de21"
Apr 20 23:31:17.566444 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:31:17.566425 2576 scope.go:117] "RemoveContainer" containerID="c8d16aafebb7cb4ecacb7e7967e44d9036500af3951201f7f3c80fd3b5090a14"
Apr 20 23:31:17.566676 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:31:17.566656 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8d16aafebb7cb4ecacb7e7967e44d9036500af3951201f7f3c80fd3b5090a14\": container with ID starting with c8d16aafebb7cb4ecacb7e7967e44d9036500af3951201f7f3c80fd3b5090a14 not found: ID does not exist" containerID="c8d16aafebb7cb4ecacb7e7967e44d9036500af3951201f7f3c80fd3b5090a14"
Apr 20 23:31:17.566732 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:31:17.566685 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8d16aafebb7cb4ecacb7e7967e44d9036500af3951201f7f3c80fd3b5090a14"} err="failed to get container status \"c8d16aafebb7cb4ecacb7e7967e44d9036500af3951201f7f3c80fd3b5090a14\": rpc error: code = NotFound desc = could not find container \"c8d16aafebb7cb4ecacb7e7967e44d9036500af3951201f7f3c80fd3b5090a14\": container with ID starting with c8d16aafebb7cb4ecacb7e7967e44d9036500af3951201f7f3c80fd3b5090a14 not found: ID does not exist"
Apr 20 23:31:17.566732 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:31:17.566704 2576 scope.go:117] "RemoveContainer" containerID="ac32f2309d09ca31338ec2ce55fe4805e4c2933435f11168ac4eeb8eb316de21"
Apr 20 23:31:17.566935 ip-10-0-137-139 kubenswrapper[2576]: E0420 23:31:17.566917 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac32f2309d09ca31338ec2ce55fe4805e4c2933435f11168ac4eeb8eb316de21\": container with ID starting with ac32f2309d09ca31338ec2ce55fe4805e4c2933435f11168ac4eeb8eb316de21 not found: ID does not exist" containerID="ac32f2309d09ca31338ec2ce55fe4805e4c2933435f11168ac4eeb8eb316de21"
Apr 20 23:31:17.566977 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:31:17.566941 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac32f2309d09ca31338ec2ce55fe4805e4c2933435f11168ac4eeb8eb316de21"} err="failed to get container status \"ac32f2309d09ca31338ec2ce55fe4805e4c2933435f11168ac4eeb8eb316de21\": rpc error: code = NotFound desc = could not find container \"ac32f2309d09ca31338ec2ce55fe4805e4c2933435f11168ac4eeb8eb316de21\": container with ID starting with ac32f2309d09ca31338ec2ce55fe4805e4c2933435f11168ac4eeb8eb316de21 not found: ID does not exist"
Apr 20 23:31:17.571768 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:31:17.571743 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612130-nw4js"]
Apr 20 23:31:17.573436 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:31:17.573416 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29612130-nw4js"]
Apr 20 23:31:17.587924 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:31:17.587906 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gr9pp\" (UniqueName: \"kubernetes.io/projected/187014df-c1ff-4b46-b151-76489adc4fcd-kube-api-access-gr9pp\") on node \"ip-10-0-137-139.ec2.internal\" DevicePath \"\""
Apr 20 23:31:18.640395 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:31:18.640365 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="187014df-c1ff-4b46-b151-76489adc4fcd" path="/var/lib/kubelet/pods/187014df-c1ff-4b46-b151-76489adc4fcd/volumes"
Apr 20 23:31:20.888676 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:31:20.888597 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxsdb"]
Apr 20 23:31:24.787894 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:31:24.787859 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxsdb"]
Apr 20 23:31:56.924543 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:31:56.924504 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-d85d4_7540190a-b508-4374-afff-643e7631a11f/manager/0.log"
Apr 20 23:31:57.274426 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:31:57.274346 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-vx2sj_12eabfbe-7953-48de-bfb9-65826a0c3846/manager/2.log"
Apr 20 23:31:57.393882 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:31:57.393853 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5d79c565b7-cfmkq_a377e8d5-018f-4c51-8383-d70685804f26/manager/0.log"
Apr 20 23:31:59.108075 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:31:59.108034 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-b2dp8_ae6ac25b-f6ca-4112-a0ad-89e1521d85b9/manager/0.log"
Apr 20 23:31:59.679113 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:31:59.679083 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-jxsdb_40639205-16c3-47ce-82b9-58cf7c3c6924/limitador/0.log"
Apr 20 23:32:00.275571 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:00.275535 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-rhjnq_4d7994aa-5e1d-4085-bed6-389da1901c3c/discovery/0.log"
Apr 20 23:32:00.493057 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:00.493024 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-74c79b5d98-tlqs4_79ab33d4-b8d4-4741-9be4-b52e8a781c2f/kube-auth-proxy/0.log"
Apr 20 23:32:00.663968 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:00.663945 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6675l_65a2a89a-c0bf-4140-bd21-e8249221ca05/ovn-acl-logging/0.log"
Apr 20 23:32:00.667283 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:00.667262 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6675l_65a2a89a-c0bf-4140-bd21-e8249221ca05/ovn-acl-logging/0.log"
Apr 20 23:32:00.724363 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:00.724339 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-799dc56db8-g6d5c_923ad550-9c0c-4c01-bd51-00fb990a6a8d/router/0.log"
Apr 20 23:32:08.156956 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:08.156920 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-c6jf8_9dc3bbec-6557-4eef-8534-77d4570a546d/global-pull-secret-syncer/0.log"
Apr 20 23:32:08.268155 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:08.268115 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-9pvbx_c89ec650-7091-4fbe-a329-ba849fc5e589/konnectivity-agent/0.log"
Apr 20 23:32:08.364099 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:08.364071 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-139.ec2.internal_ff565a32ff02ffe7c9262acb131f7f92/haproxy/0.log"
Apr 20 23:32:12.776528 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:12.776478 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-b2dp8_ae6ac25b-f6ca-4112-a0ad-89e1521d85b9/manager/0.log"
Apr 20 23:32:12.913400 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:12.913366 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-jxsdb_40639205-16c3-47ce-82b9-58cf7c3c6924/limitador/0.log"
Apr 20 23:32:14.492503 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:14.492424 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-fm6cg_dc35f408-9d03-4b9f-b1fd-a285d5b9d26b/cluster-monitoring-operator/0.log"
Apr 20 23:32:14.716520 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:14.716490 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dpkzg_02a27fe0-0637-4911-90ef-32cb592ad9f4/node-exporter/0.log"
Apr 20 23:32:14.734655 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:14.734632 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dpkzg_02a27fe0-0637-4911-90ef-32cb592ad9f4/kube-rbac-proxy/0.log"
Apr 20 23:32:14.757531 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:14.757470 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dpkzg_02a27fe0-0637-4911-90ef-32cb592ad9f4/init-textfile/0.log"
Apr 20 23:32:16.351781 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.351746 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-h242g_b4b471b6-c5bc-4c1c-b7da-3a4ce9aaa726/networking-console-plugin/0.log"
Apr 20 23:32:16.438565 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.438533 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d2tt9/perf-node-gather-daemonset-bqqjv"]
Apr 20 23:32:16.438871 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.438859 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="187014df-c1ff-4b46-b151-76489adc4fcd" containerName="cleanup"
Apr 20 23:32:16.438923 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.438874 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="187014df-c1ff-4b46-b151-76489adc4fcd" containerName="cleanup"
Apr 20 23:32:16.438923 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.438882 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="187014df-c1ff-4b46-b151-76489adc4fcd" containerName="cleanup"
Apr 20 23:32:16.438923 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.438888 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="187014df-c1ff-4b46-b151-76489adc4fcd" containerName="cleanup"
Apr 20 23:32:16.439012 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.438954 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="187014df-c1ff-4b46-b151-76489adc4fcd" containerName="cleanup"
Apr 20 23:32:16.439012 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.438963 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="187014df-c1ff-4b46-b151-76489adc4fcd" containerName="cleanup"
Apr 20 23:32:16.443038 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.443020 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-bqqjv"
Apr 20 23:32:16.445450 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.445425 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-d2tt9\"/\"kube-root-ca.crt\""
Apr 20 23:32:16.446646 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.446620 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-d2tt9\"/\"default-dockercfg-5mhdc\""
Apr 20 23:32:16.446646 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.446620 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-d2tt9\"/\"openshift-service-ca.crt\""
Apr 20 23:32:16.452113 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.452087 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d2tt9/perf-node-gather-daemonset-bqqjv"]
Apr 20 23:32:16.487960 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.487928 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f073bee4-6ccc-40cd-9941-251249609b81-podres\") pod \"perf-node-gather-daemonset-bqqjv\" (UID: \"f073bee4-6ccc-40cd-9941-251249609b81\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-bqqjv"
Apr 20 23:32:16.488127 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.487971 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbvwd\" (UniqueName: \"kubernetes.io/projected/f073bee4-6ccc-40cd-9941-251249609b81-kube-api-access-vbvwd\") pod \"perf-node-gather-daemonset-bqqjv\" (UID: \"f073bee4-6ccc-40cd-9941-251249609b81\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-bqqjv"
Apr 20 23:32:16.488127 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.488039 2576 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f073bee4-6ccc-40cd-9941-251249609b81-proc\") pod \"perf-node-gather-daemonset-bqqjv\" (UID: \"f073bee4-6ccc-40cd-9941-251249609b81\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-bqqjv" Apr 20 23:32:16.488127 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.488094 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f073bee4-6ccc-40cd-9941-251249609b81-lib-modules\") pod \"perf-node-gather-daemonset-bqqjv\" (UID: \"f073bee4-6ccc-40cd-9941-251249609b81\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-bqqjv" Apr 20 23:32:16.488127 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.488113 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f073bee4-6ccc-40cd-9941-251249609b81-sys\") pod \"perf-node-gather-daemonset-bqqjv\" (UID: \"f073bee4-6ccc-40cd-9941-251249609b81\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-bqqjv" Apr 20 23:32:16.589157 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.589113 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f073bee4-6ccc-40cd-9941-251249609b81-podres\") pod \"perf-node-gather-daemonset-bqqjv\" (UID: \"f073bee4-6ccc-40cd-9941-251249609b81\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-bqqjv" Apr 20 23:32:16.589300 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.589173 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vbvwd\" (UniqueName: \"kubernetes.io/projected/f073bee4-6ccc-40cd-9941-251249609b81-kube-api-access-vbvwd\") pod \"perf-node-gather-daemonset-bqqjv\" (UID: \"f073bee4-6ccc-40cd-9941-251249609b81\") " 
pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-bqqjv" Apr 20 23:32:16.589300 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.589203 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f073bee4-6ccc-40cd-9941-251249609b81-proc\") pod \"perf-node-gather-daemonset-bqqjv\" (UID: \"f073bee4-6ccc-40cd-9941-251249609b81\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-bqqjv" Apr 20 23:32:16.589300 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.589220 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f073bee4-6ccc-40cd-9941-251249609b81-lib-modules\") pod \"perf-node-gather-daemonset-bqqjv\" (UID: \"f073bee4-6ccc-40cd-9941-251249609b81\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-bqqjv" Apr 20 23:32:16.589300 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.589238 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f073bee4-6ccc-40cd-9941-251249609b81-sys\") pod \"perf-node-gather-daemonset-bqqjv\" (UID: \"f073bee4-6ccc-40cd-9941-251249609b81\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-bqqjv" Apr 20 23:32:16.589462 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.589301 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f073bee4-6ccc-40cd-9941-251249609b81-podres\") pod \"perf-node-gather-daemonset-bqqjv\" (UID: \"f073bee4-6ccc-40cd-9941-251249609b81\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-bqqjv" Apr 20 23:32:16.589462 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.589319 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f073bee4-6ccc-40cd-9941-251249609b81-sys\") pod 
\"perf-node-gather-daemonset-bqqjv\" (UID: \"f073bee4-6ccc-40cd-9941-251249609b81\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-bqqjv" Apr 20 23:32:16.589462 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.589323 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f073bee4-6ccc-40cd-9941-251249609b81-proc\") pod \"perf-node-gather-daemonset-bqqjv\" (UID: \"f073bee4-6ccc-40cd-9941-251249609b81\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-bqqjv" Apr 20 23:32:16.589462 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.589381 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f073bee4-6ccc-40cd-9941-251249609b81-lib-modules\") pod \"perf-node-gather-daemonset-bqqjv\" (UID: \"f073bee4-6ccc-40cd-9941-251249609b81\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-bqqjv" Apr 20 23:32:16.596829 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.596802 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbvwd\" (UniqueName: \"kubernetes.io/projected/f073bee4-6ccc-40cd-9941-251249609b81-kube-api-access-vbvwd\") pod \"perf-node-gather-daemonset-bqqjv\" (UID: \"f073bee4-6ccc-40cd-9941-251249609b81\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-bqqjv" Apr 20 23:32:16.754451 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.754371 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-bqqjv" Apr 20 23:32:16.876290 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.876258 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d2tt9/perf-node-gather-daemonset-bqqjv"] Apr 20 23:32:16.879320 ip-10-0-137-139 kubenswrapper[2576]: W0420 23:32:16.879290 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf073bee4_6ccc_40cd_9941_251249609b81.slice/crio-ee6b2d18ef57b590c8c830f5994c404ca8b464f0e5f68725b80e223112694061 WatchSource:0}: Error finding container ee6b2d18ef57b590c8c830f5994c404ca8b464f0e5f68725b80e223112694061: Status 404 returned error can't find the container with id ee6b2d18ef57b590c8c830f5994c404ca8b464f0e5f68725b80e223112694061 Apr 20 23:32:16.909704 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:16.909681 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8w8m8_ee9b8fd0-f305-4d93-a360-f2430b0b36fb/console-operator/0.log" Apr 20 23:32:17.752154 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:17.752107 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-bqqjv" event={"ID":"f073bee4-6ccc-40cd-9941-251249609b81","Type":"ContainerStarted","Data":"d60506f83372f2c5d58f3834be0e5102ab1bbce9aefa0cca7b4c1637cc919a54"} Apr 20 23:32:17.752154 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:17.752157 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-bqqjv" event={"ID":"f073bee4-6ccc-40cd-9941-251249609b81","Type":"ContainerStarted","Data":"ee6b2d18ef57b590c8c830f5994c404ca8b464f0e5f68725b80e223112694061"} Apr 20 23:32:17.752604 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:17.752234 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-bqqjv" Apr 20 23:32:17.769567 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:17.769524 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-bqqjv" podStartSLOduration=1.769510438 podStartE2EDuration="1.769510438s" podCreationTimestamp="2026-04-20 23:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 23:32:17.767878236 +0000 UTC m=+1217.712587713" watchObservedRunningTime="2026-04-20 23:32:17.769510438 +0000 UTC m=+1217.714219889" Apr 20 23:32:17.986001 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:17.985967 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-xwfnk_c293a8dc-a84c-49ef-a91c-fc1f64604bbe/volume-data-source-validator/0.log" Apr 20 23:32:18.827823 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:18.827791 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dzpv8_40424f06-d2df-4a03-bc2c-0af9d8b4e184/dns/0.log" Apr 20 23:32:18.848384 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:18.848358 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dzpv8_40424f06-d2df-4a03-bc2c-0af9d8b4e184/kube-rbac-proxy/0.log" Apr 20 23:32:18.914285 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:18.914251 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jsb2w_392462d6-20a2-4842-bcbc-129ba91961ef/dns-node-resolver/0.log" Apr 20 23:32:19.442913 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:19.442890 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-s6hls_1652a387-e617-4579-bb1d-4fab03dacaed/node-ca/0.log" Apr 20 23:32:20.295839 ip-10-0-137-139 kubenswrapper[2576]: I0420 
23:32:20.295804 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-rhjnq_4d7994aa-5e1d-4085-bed6-389da1901c3c/discovery/0.log" Apr 20 23:32:20.335437 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:20.335409 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-74c79b5d98-tlqs4_79ab33d4-b8d4-4741-9be4-b52e8a781c2f/kube-auth-proxy/0.log" Apr 20 23:32:20.385851 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:20.385827 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-799dc56db8-g6d5c_923ad550-9c0c-4c01-bd51-00fb990a6a8d/router/0.log" Apr 20 23:32:20.856602 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:20.856567 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2mlnp_204aec3c-9787-4646-abd7-68cf7063e0c5/serve-healthcheck-canary/0.log" Apr 20 23:32:21.316892 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:21.316809 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-d764h_0337f295-0371-4489-b8f0-7d5728373a37/insights-operator/0.log" Apr 20 23:32:21.335658 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:21.335629 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4szd2_227b4df1-51fd-4833-bed6-8b6c220bea00/kube-rbac-proxy/0.log" Apr 20 23:32:21.356134 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:21.356108 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4szd2_227b4df1-51fd-4833-bed6-8b6c220bea00/exporter/0.log" Apr 20 23:32:21.376019 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:21.375996 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4szd2_227b4df1-51fd-4833-bed6-8b6c220bea00/extractor/0.log" Apr 20 
23:32:23.294366 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:23.294323 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-d85d4_7540190a-b508-4374-afff-643e7631a11f/manager/0.log" Apr 20 23:32:23.359068 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:23.359017 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-vx2sj_12eabfbe-7953-48de-bfb9-65826a0c3846/manager/1.log" Apr 20 23:32:23.378461 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:23.378436 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-vx2sj_12eabfbe-7953-48de-bfb9-65826a0c3846/manager/2.log" Apr 20 23:32:23.429779 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:23.429748 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5d79c565b7-cfmkq_a377e8d5-018f-4c51-8383-d70685804f26/manager/0.log" Apr 20 23:32:23.765388 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:23.765359 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-bqqjv" Apr 20 23:32:28.829324 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:28.829293 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-pr6j7_849d7e34-d847-4179-b6ac-2c2766dce9e0/migrator/0.log" Apr 20 23:32:28.848988 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:28.848958 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-pr6j7_849d7e34-d847-4179-b6ac-2c2766dce9e0/graceful-termination/0.log" Apr 20 23:32:29.187971 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:29.187933 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-xg99l_fa5675ad-e501-4f6b-a732-4d1db6454f9a/kube-storage-version-migrator-operator/1.log" Apr 20 23:32:29.189766 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:29.189742 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-xg99l_fa5675ad-e501-4f6b-a732-4d1db6454f9a/kube-storage-version-migrator-operator/0.log" Apr 20 23:32:30.411256 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:30.411223 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qdvqj_97d5b486-1141-4ce1-b800-263ccf62a8cd/kube-multus-additional-cni-plugins/0.log" Apr 20 23:32:30.430283 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:30.430254 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qdvqj_97d5b486-1141-4ce1-b800-263ccf62a8cd/egress-router-binary-copy/0.log" Apr 20 23:32:30.449884 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:30.449856 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qdvqj_97d5b486-1141-4ce1-b800-263ccf62a8cd/cni-plugins/0.log" Apr 20 23:32:30.469306 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:30.469284 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qdvqj_97d5b486-1141-4ce1-b800-263ccf62a8cd/bond-cni-plugin/0.log" Apr 20 23:32:30.487267 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:30.487245 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qdvqj_97d5b486-1141-4ce1-b800-263ccf62a8cd/routeoverride-cni/0.log" Apr 20 23:32:30.504854 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:30.504835 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qdvqj_97d5b486-1141-4ce1-b800-263ccf62a8cd/whereabouts-cni-bincopy/0.log" Apr 20 23:32:30.523660 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:30.523637 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qdvqj_97d5b486-1141-4ce1-b800-263ccf62a8cd/whereabouts-cni/0.log" Apr 20 23:32:30.556582 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:30.556554 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jnjdb_7577b13a-1450-42fb-aa2f-4374c2a72406/kube-multus/0.log" Apr 20 23:32:30.684566 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:30.684476 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rvb5h_e6e1d353-f530-4ad5-a0ae-b436e227eb58/network-metrics-daemon/0.log" Apr 20 23:32:30.700247 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:30.700217 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rvb5h_e6e1d353-f530-4ad5-a0ae-b436e227eb58/kube-rbac-proxy/0.log" Apr 20 23:32:31.481557 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:31.481526 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6675l_65a2a89a-c0bf-4140-bd21-e8249221ca05/ovn-controller/0.log" Apr 20 23:32:31.497839 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:31.497815 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6675l_65a2a89a-c0bf-4140-bd21-e8249221ca05/ovn-acl-logging/0.log" Apr 20 23:32:31.508010 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:31.507991 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6675l_65a2a89a-c0bf-4140-bd21-e8249221ca05/ovn-acl-logging/1.log" Apr 20 23:32:31.528739 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:31.528717 2576 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6675l_65a2a89a-c0bf-4140-bd21-e8249221ca05/kube-rbac-proxy-node/0.log" Apr 20 23:32:31.548641 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:31.548622 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6675l_65a2a89a-c0bf-4140-bd21-e8249221ca05/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 23:32:31.564494 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:31.564471 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6675l_65a2a89a-c0bf-4140-bd21-e8249221ca05/northd/0.log" Apr 20 23:32:31.582512 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:31.582494 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6675l_65a2a89a-c0bf-4140-bd21-e8249221ca05/nbdb/0.log" Apr 20 23:32:31.600924 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:31.600877 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6675l_65a2a89a-c0bf-4140-bd21-e8249221ca05/sbdb/0.log" Apr 20 23:32:31.750931 ip-10-0-137-139 kubenswrapper[2576]: I0420 23:32:31.750853 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6675l_65a2a89a-c0bf-4140-bd21-e8249221ca05/ovnkube-controller/0.log"