Apr 16 14:50:07.635518 ip-10-0-139-101 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 14:50:07.635528 ip-10-0-139-101 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 14:50:07.635535 ip-10-0-139-101 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 14:50:07.635789 ip-10-0-139-101 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 14:50:17.780743 ip-10-0-139-101 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 14:50:17.780758 ip-10-0-139-101 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 3488c76cc1b8403ca41142c32fb30f19 --
Apr 16 14:52:30.378152 ip-10-0-139-101 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 14:52:30.890556 ip-10-0-139-101 kubenswrapper[2582]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:52:30.890556 ip-10-0-139-101 kubenswrapper[2582]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 14:52:30.890556 ip-10-0-139-101 kubenswrapper[2582]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:52:30.890556 ip-10-0-139-101 kubenswrapper[2582]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 14:52:30.890556 ip-10-0-139-101 kubenswrapper[2582]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:52:30.892176 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.892088 2582 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 14:52:30.899672 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899652 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:30.899672 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899668 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:30.899672 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899676 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:30.899893 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899681 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:30.899893 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899686 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:30.899893 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899690 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:30.899893 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899694 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:30.899893 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899699 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:30.899893 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899702 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:30.899893 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899706 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:30.899893 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899718 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:30.899893 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899723 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:30.899893 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899727 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:30.899893 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899731 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:30.899893 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899735 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:30.899893 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899739 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:30.899893 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899743 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:30.899893 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899747 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:30.899893 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899751 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:30.899893 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899755 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:30.899893 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899759 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:30.899893 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899763 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:30.899893 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899766 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:30.900661 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899770 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:30.900661 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899774 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:30.900661 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899778 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:30.900661 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899781 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:30.900661 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899785 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:30.900661 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899790 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:30.900661 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899794 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:30.900661 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899799 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:30.900661 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899803 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:30.900661 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899807 2582 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:30.900661 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899811 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:30.900661 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899815 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:30.900661 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899819 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:30.900661 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899843 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:30.900661 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899848 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:30.900661 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899852 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:30.900661 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899856 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:30.900661 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899860 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:30.900661 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899864 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:30.900661 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899868 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:30.901517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899874 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:30.901517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899878 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:30.901517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899883 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:30.901517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899887 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:30.901517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899892 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:30.901517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899896 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:30.901517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899900 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:30.901517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899905 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:30.901517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899909 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:30.901517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899913 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:30.901517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899917 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:30.901517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899921 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:30.901517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899926 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:30.901517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899932 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:30.901517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899936 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:30.901517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899940 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:30.901517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899944 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:30.901517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899949 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:30.901517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899953 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:30.902070 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899958 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:30.902070 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899961 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:30.902070 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899965 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:30.902070 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899969 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:30.902070 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899973 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:30.902070 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899978 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:30.902070 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899984 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:30.902070 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899988 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:30.902070 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899992 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:30.902070 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.899997 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:30.902070 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.900001 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:30.902070 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.900005 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:30.902070 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.900009 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:30.902070 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.900013 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:30.902070 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.900018 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:30.902070 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.900022 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:30.902070 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.900027 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:30.902070 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.900031 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:30.902070 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.900036 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:30.902070 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.900040 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:30.902552 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.900044 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:30.902552 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.900048 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:30.902552 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.900056 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:30.902552 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.900062 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:30.902552 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901493 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:30.902552 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901507 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:30.902552 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901512 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:30.902552 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901515 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:30.902552 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901518 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:30.902552 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901521 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:30.902552 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901523 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:30.902552 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901526 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:30.902552 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901529 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:30.902552 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901532 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:30.902552 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901535 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:30.902552 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901537 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:30.902552 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901540 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:30.902552 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901543 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:30.902552 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901546 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:30.902552 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901549 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:30.903041 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901551 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:30.903041 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901554 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:30.903041 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901557 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:30.903041 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901560 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:30.903041 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901563 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:30.903041 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901565 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:30.903041 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901568 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:30.903041 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901570 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:30.903041 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901576 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:30.903041 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901583 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:30.903041 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901587 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:30.903041 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901591 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:30.903041 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901594 2582 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:30.903041 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901597 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:30.903041 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901600 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:30.903041 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901603 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:30.903041 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901606 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:30.903041 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901609 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:30.903484 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901612 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:30.903484 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901615 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:30.903484 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901618 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:30.903484 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901620 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:30.903484 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901623 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:30.903484 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901627 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:30.903484 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901630 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:30.903484 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901632 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:30.903484 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901635 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:30.903484 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901637 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:30.903484 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901640 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:30.903484 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901642 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:30.903484 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901645 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:30.903484 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901647 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:30.903484 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901650 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:30.903484 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901652 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:30.903484 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901655 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:30.903484 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901657 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:30.903484 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901660 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:30.903484 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901663 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:30.904018 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901665 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:30.904018 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901668 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:30.904018 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901670 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:30.904018 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901673 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:30.904018 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901675 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:30.904018 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901677 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:30.904018 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901680 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:30.904018 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901682 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:30.904018 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901684 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:30.904018 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901687 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:30.904018 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901689 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:30.904018 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901693 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:30.904018 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901695 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:30.904018 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901698 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:30.904018 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901700 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:30.904018 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901702 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:30.904018 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901705 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:30.904018 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901707 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:30.904018 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901710 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:30.904018 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901712 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:30.904508 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901714 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:30.904508 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901717 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:30.904508 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901719 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:30.904508 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901722 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:30.904508 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901725 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:30.904508 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901727 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:30.904508 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901730 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:30.904508 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901732 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:30.904508 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901735 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:30.904508 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901737 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:30.904508 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901740 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:30.904508 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.901742 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:30.904508 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902345 2582 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 14:52:30.904508 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902354 2582 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 14:52:30.904508 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902361 2582 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 14:52:30.904508 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902365 2582 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 14:52:30.904508 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902370 2582 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 14:52:30.904508 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902373 2582 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 14:52:30.904508 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902378 2582 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 14:52:30.904508 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902382 2582 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 14:52:30.904508 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902385 2582 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 14:52:30.905035 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902388 2582 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 14:52:30.905035 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902392 2582 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 14:52:30.905035 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902396 2582 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 14:52:30.905035 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902399 2582 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 14:52:30.905035 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902402 2582 flags.go:64] FLAG: --cgroup-root=""
Apr 16 14:52:30.905035 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902405 2582 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 14:52:30.905035 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902408 2582 flags.go:64] FLAG: --client-ca-file=""
Apr 16 14:52:30.905035 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902411 2582 flags.go:64] FLAG: --cloud-config=""
Apr 16 14:52:30.905035 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902414 2582 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 14:52:30.905035 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902416 2582 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 14:52:30.905035 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902421 2582 flags.go:64] FLAG: --cluster-domain=""
Apr 16 14:52:30.905035 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902423 2582 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 14:52:30.905035 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902427 2582 flags.go:64] FLAG: --config-dir=""
Apr 16 14:52:30.905035 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902429 2582 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 14:52:30.905035 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902433 2582 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 14:52:30.905035 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902437 2582 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 14:52:30.905035 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902441 2582 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 14:52:30.905035 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902444 2582 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 14:52:30.905035 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902447 2582 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 14:52:30.905035 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902451 2582 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 14:52:30.905035 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902454 2582 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 14:52:30.905035 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902456 2582 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 14:52:30.905035 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902460 2582 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 14:52:30.905035 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902463 2582 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 14:52:30.905035 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902467 2582 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 14:52:30.905624 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902470 2582 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 14:52:30.905624 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902473 2582 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 14:52:30.905624 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902475 2582 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 14:52:30.905624 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902478 2582 flags.go:64] FLAG: --enable-server="true"
Apr 16 14:52:30.905624 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902481 2582 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 14:52:30.905624 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902486 2582 flags.go:64] FLAG: --event-burst="100"
Apr 16 14:52:30.905624 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902489 2582 flags.go:64] FLAG: --event-qps="50"
Apr 16 14:52:30.905624 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902492 2582 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 14:52:30.905624 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902495 2582 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 14:52:30.905624 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902498 2582 flags.go:64] FLAG: --eviction-hard=""
Apr 16 14:52:30.905624 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902502 2582 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 14:52:30.905624 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902505 2582 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 14:52:30.905624 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902508 2582 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 14:52:30.905624 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902511 2582 flags.go:64] FLAG: --eviction-soft=""
Apr 16 14:52:30.905624 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902514 2582 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 14:52:30.905624 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902517 2582 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 14:52:30.905624 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902520 2582 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 14:52:30.905624 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902523 2582 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 14:52:30.905624 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902525 2582 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 14:52:30.905624 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902528 2582 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 14:52:30.905624 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902531 2582 flags.go:64] FLAG: --feature-gates=""
Apr 16 14:52:30.905624 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902534 2582 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 14:52:30.905624 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902537 2582 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 14:52:30.905624 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902540 2582 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 14:52:30.905624 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902544 2582 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 14:52:30.906249 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902547 2582 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 14:52:30.906249 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902550 2582 flags.go:64] FLAG: --help="false"
Apr 16 14:52:30.906249 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902553 2582 flags.go:64] FLAG: --hostname-override="ip-10-0-139-101.ec2.internal"
Apr 16 14:52:30.906249 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902556 2582 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 14:52:30.906249 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902559 2582 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 14:52:30.906249 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902562 2582 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 14:52:30.906249 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902565 2582 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 14:52:30.906249 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902569 2582 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 14:52:30.906249 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902571 2582 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 14:52:30.906249 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902574 2582 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 14:52:30.906249 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902577 2582 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 14:52:30.906249 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902579 2582 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 14:52:30.906249 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902582 2582 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 14:52:30.906249 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902585 2582 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 14:52:30.906249 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902588 2582 flags.go:64] FLAG: --kube-reserved=""
Apr 16 14:52:30.906249 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902591 2582 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 14:52:30.906249 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902595 2582 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 14:52:30.906249 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902598 2582 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 14:52:30.906249 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902601 2582 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 14:52:30.906249 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902604 2582 flags.go:64] FLAG: --lock-file=""
Apr 16 14:52:30.906249 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902607 2582 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 14:52:30.906249 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902610 2582 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 14:52:30.906249 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902613 2582 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 14:52:30.906249 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902618 2582 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 14:52:30.906813 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902621 2582 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 14:52:30.906813 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902624 2582 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 14:52:30.906813 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902626 2582 flags.go:64] FLAG: --logging-format="text"
Apr 16 14:52:30.906813 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902629 2582 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 14:52:30.906813 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902632 2582 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 14:52:30.906813 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902635 2582 flags.go:64] FLAG: --manifest-url=""
Apr 16 14:52:30.906813 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902638 2582 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 14:52:30.906813 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902643 2582 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 14:52:30.906813 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902646 2582 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 14:52:30.906813 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902653 2582 flags.go:64] FLAG: --max-pods="110"
Apr 16 14:52:30.906813 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902657 2582 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 14:52:30.906813 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902660 2582 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 14:52:30.906813 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902662 2582 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 14:52:30.906813 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902665 2582 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 14:52:30.906813 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902668 2582 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 14:52:30.906813 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902671 2582 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 14:52:30.906813 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902674 2582 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 14:52:30.906813 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902681 2582 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 14:52:30.906813 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902685 2582 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 14:52:30.906813 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902688 2582 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 14:52:30.906813 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902695 2582 flags.go:64] FLAG: --pod-cidr=""
Apr 16 14:52:30.906813 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902698 2582 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec"
Apr 16 14:52:30.906813 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902703 2582 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 14:52:30.907426 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902706 2582 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 14:52:30.907426 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902709 2582 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 14:52:30.907426 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902712 2582 flags.go:64] FLAG: --port="10250"
Apr 16 14:52:30.907426 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902715 2582 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 14:52:30.907426 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902718 2582 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-061b8734f5fbd5605"
Apr 16 14:52:30.907426 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902721 2582 flags.go:64] FLAG: --qos-reserved=""
Apr 16 14:52:30.907426 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902724 2582 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 14:52:30.907426 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902727 2582 flags.go:64] FLAG: --register-node="true"
Apr 16 14:52:30.907426 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902730 2582 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 14:52:30.907426 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902733 2582 flags.go:64] FLAG: --register-with-taints=""
Apr 16 14:52:30.907426 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902737 2582 flags.go:64] FLAG: --registry-burst="10"
Apr 16 14:52:30.907426 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902739 2582 flags.go:64] FLAG: --registry-qps="5"
Apr 16 14:52:30.907426 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902742 2582 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 14:52:30.907426 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902745 2582 flags.go:64] FLAG: --reserved-memory=""
Apr 16 14:52:30.907426 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902748 2582 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 14:52:30.907426 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902751 2582 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 14:52:30.907426 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902754 2582 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 14:52:30.907426 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902756 2582 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 14:52:30.907426 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902760 2582 flags.go:64] FLAG: --runonce="false"
Apr 16 14:52:30.907426 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902763 2582 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 14:52:30.907426 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902766 2582 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 14:52:30.907426 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902769 2582 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 14:52:30.907426 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902772 2582 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 14:52:30.907426 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902775 2582 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 14:52:30.907426 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902778 2582 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 14:52:30.907426 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902780 2582 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 14:52:30.908062 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902783 2582 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 14:52:30.908062 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902786 2582 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 14:52:30.908062 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902789 2582 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 14:52:30.908062 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902791 2582 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 14:52:30.908062 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902796 2582 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 14:52:30.908062 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902799 2582 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 14:52:30.908062 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902801 2582 flags.go:64] FLAG: --system-cgroups=""
Apr 16 14:52:30.908062 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902804 2582 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 14:52:30.908062 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902810 2582 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 14:52:30.908062 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902813 2582 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 14:52:30.908062 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902815 2582 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 14:52:30.908062 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902820 2582 flags.go:64] FLAG: --tls-min-version=""
Apr 16 14:52:30.908062 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902838 2582 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 14:52:30.908062 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902840 2582 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 14:52:30.908062 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902843 2582 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 14:52:30.908062 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902846 2582 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 14:52:30.908062 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902849 2582 flags.go:64] FLAG: --v="2"
Apr 16 14:52:30.908062 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902853 2582 flags.go:64] FLAG: --version="false"
Apr 16 14:52:30.908062 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902857 2582 flags.go:64] FLAG: --vmodule=""
Apr 16 14:52:30.908062 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902861 2582 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 14:52:30.908062 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.902864 2582 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 14:52:30.908062 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.902954 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:30.908062 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.902958 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:30.908062 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.902961 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:30.908626 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.902964 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:30.908626 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.902967 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:30.908626 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.902970 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:30.908626 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.902972 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:30.908626 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.902975 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:30.908626 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.902977 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:30.908626 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.902980 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:30.908626 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.902983 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:30.908626 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.902985 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:30.908626 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.902988 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:30.908626 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.902991 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:30.908626 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.902993 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:30.908626 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.902997 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:30.908626 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903000 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:30.908626 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903003 2582 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:30.908626 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903005 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:30.908626 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903008 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:30.908626 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903010 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:30.908626 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903013 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:30.908626 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903015 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:30.909161 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903018 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:30.909161 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903022 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:30.909161 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903025 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:30.909161 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903028 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:30.909161 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903031 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:30.909161 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903033 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:30.909161 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903037 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:30.909161 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903040 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:30.909161 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903044 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:30.909161 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903046 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:30.909161 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903049 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:30.909161 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903052 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:30.909161 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903055 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:30.909161 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903057 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:30.909161 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903060 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:30.909161 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903062 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:30.909161 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903065 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:30.909161 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903067 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:30.909161 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903070 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:30.909650 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903072 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:30.909650 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903075 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:30.909650 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903078 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:30.909650 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903080 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:30.909650 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903083 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:30.909650 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903086 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:30.909650 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903088 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:30.909650 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903091 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:30.909650 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903093 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:30.909650 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903096 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:30.909650 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903098 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:30.909650 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903100 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:30.909650 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903103 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:30.909650 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903105 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:30.909650 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903108 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:30.909650 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903110 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:30.909650 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903112 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:30.909650 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903115 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:30.909650 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903117 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:30.909650 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903119 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:30.910238 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903122 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:30.910238 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903124 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:30.910238 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903127 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:30.910238 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903129 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:30.910238 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903133 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:30.910238 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903135 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:30.910238 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903138 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:30.910238 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903140 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:30.910238 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903143 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:30.910238 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903145 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:30.910238 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903148 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:30.910238 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903150 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:30.910238 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903152 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:30.910238 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903155 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:30.910238 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903157 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:30.910238 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903160 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:30.910238 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903162 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:30.910238 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903165 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:30.910238 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903167 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:30.910703 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903170 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:30.910703 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903173 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:30.910703 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903175 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:30.910703 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903177 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:30.910703 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.903180 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:30.910703 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.903928 2582 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 14:52:30.910878 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.910769 2582 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 14:52:30.910878 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.910786 2582 server.go:532] "Golang settings" GOGC="" GOMAXPROCS=""
GOTRACEBACK="" Apr 16 14:52:30.910878 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910854 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 14:52:30.910878 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910860 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 14:52:30.910878 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910863 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 14:52:30.910878 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910866 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 14:52:30.910878 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910869 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 14:52:30.910878 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910872 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 14:52:30.910878 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910875 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 14:52:30.910878 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910877 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 14:52:30.910878 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910880 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 14:52:30.910878 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910882 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 14:52:30.911172 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910885 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 14:52:30.911172 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910888 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 14:52:30.911172 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910891 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 14:52:30.911172 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910894 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 14:52:30.911172 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910897 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 14:52:30.911172 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910899 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 14:52:30.911172 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910902 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 14:52:30.911172 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910905 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 14:52:30.911172 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910907 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 14:52:30.911172 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910910 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 14:52:30.911172 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910913 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 14:52:30.911172 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910917 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 14:52:30.911172 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910921 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 14:52:30.911172 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910923 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 14:52:30.911172 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910926 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 14:52:30.911172 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910928 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 14:52:30.911172 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910931 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 14:52:30.911172 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910933 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 14:52:30.911172 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910936 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 14:52:30.911632 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910938 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 14:52:30.911632 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910941 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 14:52:30.911632 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910943 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 14:52:30.911632 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910946 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 14:52:30.911632 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910949 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 14:52:30.911632 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910951 2582 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 14:52:30.911632 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910954 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 14:52:30.911632 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910956 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 14:52:30.911632 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910958 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 14:52:30.911632 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910961 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 14:52:30.911632 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910963 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 14:52:30.911632 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910966 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 14:52:30.911632 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910969 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 14:52:30.911632 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910971 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 14:52:30.911632 
ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910974 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 14:52:30.911632 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910977 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 14:52:30.911632 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910979 2582 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 14:52:30.911632 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910983 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 14:52:30.911632 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910985 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 14:52:30.911632 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910988 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 14:52:30.912145 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910990 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 14:52:30.912145 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910993 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 14:52:30.912145 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910995 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 14:52:30.912145 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.910999 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 14:52:30.912145 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911002 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 14:52:30.912145 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911005 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 14:52:30.912145 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911008 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 14:52:30.912145 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911011 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 14:52:30.912145 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911013 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 14:52:30.912145 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911016 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 14:52:30.912145 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911019 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 14:52:30.912145 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911021 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 14:52:30.912145 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911024 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 14:52:30.912145 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911026 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 14:52:30.912145 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911029 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 14:52:30.912145 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911031 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 14:52:30.912145 ip-10-0-139-101 kubenswrapper[2582]: W0416 
14:52:30.911033 2582 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 14:52:30.912145 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911036 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 14:52:30.912145 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911039 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 14:52:30.912145 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911041 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 14:52:30.912648 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911043 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 14:52:30.912648 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911046 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 14:52:30.912648 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911048 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 14:52:30.912648 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911051 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 14:52:30.912648 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911053 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 14:52:30.912648 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911056 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 14:52:30.912648 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911059 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 14:52:30.912648 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911061 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 14:52:30.912648 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911064 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 14:52:30.912648 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911066 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 14:52:30.912648 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911069 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 14:52:30.912648 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911072 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 14:52:30.912648 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911074 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 14:52:30.912648 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911077 2582 feature_gate.go:328] unrecognized feature gate: Example Apr 16 14:52:30.912648 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911079 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 14:52:30.912648 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911082 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 14:52:30.912648 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911084 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 14:52:30.913128 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.911089 2582 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false 
ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 14:52:30.913128 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911178 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 14:52:30.913128 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911182 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 14:52:30.913128 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911185 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 14:52:30.913128 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911188 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 14:52:30.913128 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911191 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 14:52:30.913128 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911193 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 14:52:30.913128 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911196 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 14:52:30.913128 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911199 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 14:52:30.913128 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911201 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 14:52:30.913128 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911204 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 14:52:30.913128 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911207 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 14:52:30.913128 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911210 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 14:52:30.913128 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911212 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 14:52:30.913128 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911215 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 14:52:30.913128 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911217 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 14:52:30.913517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911220 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 14:52:30.913517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911222 2582 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 14:52:30.913517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911224 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 14:52:30.913517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911227 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 14:52:30.913517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911230 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 14:52:30.913517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911233 2582 feature_gate.go:328] unrecognized feature gate: 
AzureClusterHostedDNSInstall Apr 16 14:52:30.913517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911236 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 14:52:30.913517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911238 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 14:52:30.913517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911241 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 14:52:30.913517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911243 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 14:52:30.913517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911246 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 14:52:30.913517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911249 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 14:52:30.913517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911251 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 14:52:30.913517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911254 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 14:52:30.913517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911256 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 14:52:30.913517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911258 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 14:52:30.913517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911261 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 14:52:30.913517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911263 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 14:52:30.913517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911266 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 14:52:30.913517 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911268 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 14:52:30.914025 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911271 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 14:52:30.914025 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911273 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 14:52:30.914025 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911275 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 14:52:30.914025 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911278 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 14:52:30.914025 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911281 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 14:52:30.914025 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911283 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 14:52:30.914025 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911285 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 14:52:30.914025 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911288 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 14:52:30.914025 
ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911290 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 14:52:30.914025 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911294 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 14:52:30.914025 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911298 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 14:52:30.914025 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911302 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 14:52:30.914025 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911305 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 14:52:30.914025 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911307 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 14:52:30.914025 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911310 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 14:52:30.914025 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911313 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 14:52:30.914025 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911316 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 14:52:30.914025 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911319 2582 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 14:52:30.914025 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911322 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 14:52:30.914521 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911325 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 14:52:30.914521 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911327 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 14:52:30.914521 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911330 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 14:52:30.914521 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911332 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 14:52:30.914521 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911335 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 14:52:30.914521 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911338 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 14:52:30.914521 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911340 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 14:52:30.914521 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911342 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 14:52:30.914521 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911345 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 14:52:30.914521 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911347 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 14:52:30.914521 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911350 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 
14:52:30.914521 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911352 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 14:52:30.914521 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911354 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 14:52:30.914521 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911357 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 14:52:30.914521 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911359 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 14:52:30.914521 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911362 2582 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 14:52:30.914521 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911364 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 14:52:30.914521 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911367 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 14:52:30.914521 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911369 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 14:52:30.914521 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911371 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 14:52:30.915038 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911374 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 14:52:30.915038 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911377 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 14:52:30.915038 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911379 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 14:52:30.915038 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911381 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 14:52:30.915038 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911384 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 14:52:30.915038 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911386 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 14:52:30.915038 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911388 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 14:52:30.915038 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911391 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 14:52:30.915038 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911393 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 14:52:30.915038 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911396 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 14:52:30.915038 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911399 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 14:52:30.915038 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:30.911401 2582 feature_gate.go:328] unrecognized feature gate: Example Apr 16 14:52:30.915038 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.911406 2582 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true 
MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 14:52:30.915038 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.912097 2582 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 14:52:30.915038 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.914861 2582 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 14:52:30.915778 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.915765 2582 server.go:1019] "Starting client certificate rotation" Apr 16 14:52:30.915874 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.915852 2582 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 14:52:30.915974 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.915897 2582 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 14:52:30.941378 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.941364 2582 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 14:52:30.945104 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.945085 2582 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 14:52:30.956878 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.956862 2582 log.go:25] "Validated CRI v1 runtime API" Apr 16 14:52:30.961508 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.961493 2582 log.go:25] "Validated CRI v1 image API" Apr 16 14:52:30.962650 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.962632 2582 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 14:52:30.964773 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.964750 2582 fs.go:135] Filesystem UUIDs: map[289f421d-38ab-4f77-a4b8-284fc9364463:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 e7f6c26b-d148-49fc-9d11-f5107737ac13:/dev/nvme0n1p3] Apr 16 14:52:30.964869 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.964770 2582 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 14:52:30.967909 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.967890 2582 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 14:52:30.973579 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.973467 2582 manager.go:217] Machine: {Timestamp:2026-04-16 14:52:30.968311439 +0000 UTC m=+0.464638658 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100466 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] 
NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23a3f48006d65e256adad9ed6c77de SystemUUID:ec23a3f4-8006-d65e-256a-dad9ed6c77de BootID:3488c76c-c1b8-403c-a411-42c32fb30f19 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:f5:3e:57:82:09 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:f5:3e:57:82:09 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:6e:56:bc:33:90:12 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 14:52:30.973579 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.973572 2582 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
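[Annotation] The cAdvisor Machine dump above reports everything in raw bytes (MemoryCapacity:32812171264, the /var filesystem at 128243970048, the nvme0n1 disk at 128849018880), which is awkward to eyeball. Converted, this is roughly a 30.6 GiB, 8-vCPU instance with a 120 GiB NVMe disk; the quick check below uses the figures copied from the log:

    # Convert the raw byte counts from the Machine line into GiB.
    GIB = 1024 ** 3
    sizes = {
        "MemoryCapacity": 32_812_171_264,
        "/var (nvme0n1p4)": 128_243_970_048,
        "nvme0n1 disk": 128_849_018_880,
    }
    for name, b in sizes.items():
        print(f"{name}: {b / GIB:.1f} GiB")
    # MemoryCapacity: 30.6 GiB
    # /var (nvme0n1p4): 119.4 GiB
    # nvme0n1 disk: 120.0 GiB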
Apr 16 14:52:30.973699 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.973649 2582 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 14:52:30.980126 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.980102 2582 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 14:52:30.980263 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.980127 2582 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-101.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 14:52:30.980311 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.980273 2582 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 14:52:30.980311 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.980282 2582 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 14:52:30.980311 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.980295 2582 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 14:52:30.981038 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.981027 2582 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 14:52:30.983221 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.983211 2582 state_mem.go:36] "Initialized new in-memory state store" Apr 16 14:52:30.983323 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.983314 2582 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 14:52:30.986222 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.986213 2582 kubelet.go:491] "Attempting to sync node with API server" Apr 16 14:52:30.986294 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.986232 2582 kubelet.go:386] "Adding static pod 
path" path="/etc/kubernetes/manifests" Apr 16 14:52:30.986294 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.986243 2582 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 14:52:30.986294 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.986252 2582 kubelet.go:397] "Adding apiserver pod source" Apr 16 14:52:30.986294 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.986260 2582 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 14:52:30.987862 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.987850 2582 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 14:52:30.987915 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.987868 2582 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 14:52:30.989686 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.989668 2582 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-tgcjm" Apr 16 14:52:30.993224 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.993210 2582 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 14:52:30.995148 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.995134 2582 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 14:52:30.996293 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.996282 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 14:52:30.996337 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.996299 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 14:52:30.996337 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.996305 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 14:52:30.996337 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.996312 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 14:52:30.996337 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.996317 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 14:52:30.996337 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.996328 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 14:52:30.996337 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.996337 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 14:52:30.996491 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.996346 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 14:52:30.996491 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.996354 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 14:52:30.996491 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.996359 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 14:52:30.996491 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.996376 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 14:52:30.996491 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.996385 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 14:52:30.997262 ip-10-0-139-101 
kubenswrapper[2582]: I0416 14:52:30.997253 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 14:52:30.997262 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.997262 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 14:52:30.999600 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:30.999531 2582 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 14:52:30.999600 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:30.999570 2582 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-101.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 14:52:30.999971 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:30.999954 2582 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-tgcjm" Apr 16 14:52:31.001445 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.001431 2582 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 14:52:31.001481 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.001467 2582 server.go:1295] "Started kubelet" Apr 16 14:52:31.001586 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.001559 2582 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 14:52:31.001675 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.001562 2582 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 14:52:31.001675 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.001625 2582 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 14:52:31.003390 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.003365 2582 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 14:52:31.003538 ip-10-0-139-101 systemd[1]: Started Kubernetes Kubelet. 
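[Annotation] Note the ordering just above: the two "Failed to watch ... system:anonymous" reflector errors land at 14:52:30.999, a fraction of a second before csr-tgcjm is reported issued. That is consistent with normal TLS bootstrap, in which the informers start before the kubelet's client certificate exists, the first list calls go out unauthenticated and are rejected, and the watches recover once the certificate is in place; it would only indicate a problem if the errors kept repeating. While a node bootstraps, the pending node-client CSRs can be inspected from any machine with cluster credentials, for example (a small helper sketch; oc could equally be kubectl):

    # List node-client bootstrap CSRs; filters `oc get csr` output on the
    # kube-apiserver-client-kubelet signer shown in the kubelet log above.
    import subprocess

    out = subprocess.run(["oc", "get", "csr"], capture_output=True, text=True, check=True)
    for row in out.stdout.splitlines():
        if "kube-apiserver-client-kubelet" in row:
            print(row)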
Apr 16 14:52:31.004687 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.004672 2582 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 14:52:31.010202 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.010183 2582 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 14:52:31.010283 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.010186 2582 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 14:52:31.011583 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.010891 2582 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 14:52:31.011583 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.010917 2582 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 14:52:31.011583 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.010927 2582 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 14:52:31.011583 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.010977 2582 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 14:52:31.011583 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.011002 2582 factory.go:55] Registering systemd factory
Apr 16 14:52:31.011583 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.011012 2582 factory.go:223] Registration of the systemd container factory successfully
Apr 16 14:52:31.011819 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:31.011684 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-101.ec2.internal\" not found"
Apr 16 14:52:31.013543 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.013516 2582 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 14:52:31.013543 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.013528 2582 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 14:52:31.013671 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.013606 2582 factory.go:153] Registering CRI-O factory
Apr 16 14:52:31.013671 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.013619 2582 factory.go:223] Registration of the crio container factory successfully
Apr 16 14:52:31.013671 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.013659 2582 factory.go:103] Registering Raw factory
Apr 16 14:52:31.013805 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.013674 2582 manager.go:1196] Started watching for new ooms in manager
Apr 16 14:52:31.014690 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.014672 2582 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 14:52:31.016070 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.016050 2582 manager.go:319] Starting recovery of all containers
Apr 16 14:52:31.022004 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.021988 2582 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-101.ec2.internal" not found
Apr 16 14:52:31.022096 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:31.021983 2582 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-139-101.ec2.internal\" not found" node="ip-10-0-139-101.ec2.internal"
Apr 16 14:52:31.025814 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.025691 2582 manager.go:324] Recovery completed
Apr 16 14:52:31.030084 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.030071 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 14:52:31.032315 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.032299 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-101.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 14:52:31.032393 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.032331 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-101.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 14:52:31.032393 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.032345 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-101.ec2.internal" event="NodeHasSufficientPID"
Apr 16 14:52:31.032808 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.032793 2582 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 14:52:31.032808 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.032805 2582 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 14:52:31.032904 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.032835 2582 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 14:52:31.036569 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.036557 2582 policy_none.go:49] "None policy: Start"
Apr 16 14:52:31.036615 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.036574 2582 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 14:52:31.036615 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.036584 2582 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 14:52:31.040773 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.040758 2582 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-101.ec2.internal" not found
Apr 16 14:52:31.069384 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.069368 2582 manager.go:341] "Starting Device Plugin manager"
Apr 16 14:52:31.069448 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:31.069394 2582 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 14:52:31.069448 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.069406 2582 server.go:85] "Starting device plugin registration server"
Apr 16 14:52:31.069622 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.069607 2582 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 14:52:31.069699 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.069624 2582 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 14:52:31.069763 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.069749 2582 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 14:52:31.069857 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.069843 2582 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 14:52:31.069857 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.069858 2582 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 14:52:31.070246 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:31.070223 2582 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 14:52:31.070329 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:31.070272 2582 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-101.ec2.internal\" not found"
Apr 16 14:52:31.098964 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.098948 2582 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-101.ec2.internal" not found
Apr 16 14:52:31.155891 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.155841 2582 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 14:52:31.157159 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.157139 2582 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 14:52:31.157243 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.157164 2582 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 14:52:31.157243 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.157181 2582 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 14:52:31.157243 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.157192 2582 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 14:52:31.157243 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:31.157224 2582 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 14:52:31.159725 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.159701 2582 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 14:52:31.169764 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.169749 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 14:52:31.171184 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.171169 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-101.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 14:52:31.171239 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.171198 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-101.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 14:52:31.171239 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.171210 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-101.ec2.internal" event="NodeHasSufficientPID"
Apr 16 14:52:31.171239 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.171237 2582 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-101.ec2.internal"
Apr 16 14:52:31.179628 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.179614 2582 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-101.ec2.internal"
Apr 16 14:52:31.179701 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:31.179633 2582 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-101.ec2.internal\": node \"ip-10-0-139-101.ec2.internal\" not found"
Apr 16 14:52:31.192034 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:31.192012 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-101.ec2.internal\" not found"
Apr 16 14:52:31.257533 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.257500 2582 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-101.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-101.ec2.internal"]
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-101.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-101.ec2.internal"] Apr 16 14:52:31.257582 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.257567 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:31.258326 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.258310 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-101.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:31.258384 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.258338 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-101.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:31.258384 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.258351 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-101.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:31.259603 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.259591 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:31.259767 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.259752 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-101.ec2.internal" Apr 16 14:52:31.259841 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.259786 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:31.260793 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.260777 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-101.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:31.260878 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.260797 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-101.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:31.260878 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.260811 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-101.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:31.260878 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.260847 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-101.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:31.260878 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.260870 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-101.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:31.260878 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.260880 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-101.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:31.262503 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.262489 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-101.ec2.internal" Apr 16 14:52:31.262579 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.262513 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:31.263165 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.263145 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-101.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:31.263253 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.263176 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-101.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:31.263253 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.263191 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-101.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:31.286488 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:31.286463 2582 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-101.ec2.internal\" not found" node="ip-10-0-139-101.ec2.internal" Apr 16 14:52:31.290704 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:31.290688 2582 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-101.ec2.internal\" not found" node="ip-10-0-139-101.ec2.internal" Apr 16 14:52:31.292659 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:31.292644 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-101.ec2.internal\" not found" Apr 16 14:52:31.316383 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.316361 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/43fc43a7ac83103e37932dc8a457fdd4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-101.ec2.internal\" (UID: \"43fc43a7ac83103e37932dc8a457fdd4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-101.ec2.internal" Apr 16 14:52:31.316455 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.316387 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/43fc43a7ac83103e37932dc8a457fdd4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-101.ec2.internal\" (UID: \"43fc43a7ac83103e37932dc8a457fdd4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-101.ec2.internal" Apr 16 14:52:31.316455 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.316406 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/57029cb5a3dfbbc5e89140d02a0a24ef-config\") pod \"kube-apiserver-proxy-ip-10-0-139-101.ec2.internal\" (UID: \"57029cb5a3dfbbc5e89140d02a0a24ef\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-101.ec2.internal" Apr 16 14:52:31.393532 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:31.393508 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-101.ec2.internal\" not found" Apr 16 14:52:31.417344 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.417296 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/43fc43a7ac83103e37932dc8a457fdd4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-101.ec2.internal\" (UID: \"43fc43a7ac83103e37932dc8a457fdd4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-101.ec2.internal" Apr 16 14:52:31.417344 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.417323 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/57029cb5a3dfbbc5e89140d02a0a24ef-config\") pod \"kube-apiserver-proxy-ip-10-0-139-101.ec2.internal\" (UID: \"57029cb5a3dfbbc5e89140d02a0a24ef\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-101.ec2.internal" Apr 16 14:52:31.417344 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.417340 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/43fc43a7ac83103e37932dc8a457fdd4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-101.ec2.internal\" (UID: \"43fc43a7ac83103e37932dc8a457fdd4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-101.ec2.internal" Apr 16 14:52:31.417507 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.417375 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/43fc43a7ac83103e37932dc8a457fdd4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-101.ec2.internal\" (UID: \"43fc43a7ac83103e37932dc8a457fdd4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-101.ec2.internal" Apr 16 14:52:31.417507 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.417410 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/57029cb5a3dfbbc5e89140d02a0a24ef-config\") pod \"kube-apiserver-proxy-ip-10-0-139-101.ec2.internal\" (UID: \"57029cb5a3dfbbc5e89140d02a0a24ef\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-101.ec2.internal" Apr 16 14:52:31.417507 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.417410 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/43fc43a7ac83103e37932dc8a457fdd4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-101.ec2.internal\" (UID: \"43fc43a7ac83103e37932dc8a457fdd4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-101.ec2.internal" Apr 16 14:52:31.493746 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:31.493712 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-101.ec2.internal\" not found" Apr 16 14:52:31.589198 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.589171 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-101.ec2.internal" Apr 16 14:52:31.592980 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.592964 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-101.ec2.internal" Apr 16 14:52:31.594041 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:31.594022 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-101.ec2.internal\" not found" Apr 16 14:52:31.694646 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:31.694592 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-101.ec2.internal\" not found" Apr 16 14:52:31.795116 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:31.795098 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-101.ec2.internal\" not found" Apr 16 14:52:31.895813 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:31.895781 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-101.ec2.internal\" not found" Apr 16 14:52:31.916078 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.916059 2582 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 14:52:31.916201 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.916183 2582 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 14:52:31.916240 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:31.916223 2582 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 14:52:31.996850 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:31.996810 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-101.ec2.internal\" not found" Apr 16 14:52:32.001970 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:32.001940 2582 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 14:47:30 +0000 UTC" deadline="2027-09-26 11:10:49.608719112 +0000 UTC" Apr 16 14:52:32.001970 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:32.001968 2582 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12668h18m17.606753701s" Apr 16 14:52:32.010396 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:32.010380 2582 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 14:52:32.017598 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:32.017579 2582 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:32.024814 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:32.024796 2582 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 14:52:32.055044 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:32.055015 2582 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-dvzvl" Apr 16 14:52:32.061256 ip-10-0-139-101 kubenswrapper[2582]: I0416 
14:52:32.061239 2582 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-dvzvl" Apr 16 14:52:32.081889 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:32.081853 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57029cb5a3dfbbc5e89140d02a0a24ef.slice/crio-5935998da93367032a8b776cd59fba413008b1ab581b427fddb59776c4c53557 WatchSource:0}: Error finding container 5935998da93367032a8b776cd59fba413008b1ab581b427fddb59776c4c53557: Status 404 returned error can't find the container with id 5935998da93367032a8b776cd59fba413008b1ab581b427fddb59776c4c53557 Apr 16 14:52:32.082840 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:32.082798 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43fc43a7ac83103e37932dc8a457fdd4.slice/crio-82b982db64f8d4e47c1477a2d4b8c9f262b482fe3c14992acde54b7949ba1ce4 WatchSource:0}: Error finding container 82b982db64f8d4e47c1477a2d4b8c9f262b482fe3c14992acde54b7949ba1ce4: Status 404 returned error can't find the container with id 82b982db64f8d4e47c1477a2d4b8c9f262b482fe3c14992acde54b7949ba1ce4 Apr 16 14:52:32.087272 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:32.087226 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:52:32.111371 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:32.111350 2582 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-101.ec2.internal" Apr 16 14:52:32.124009 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:32.123991 2582 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 14:52:32.125578 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:32.125565 2582 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-101.ec2.internal" Apr 16 14:52:32.134247 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:32.134232 2582 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 14:52:32.159980 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:32.159943 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-101.ec2.internal" event={"ID":"43fc43a7ac83103e37932dc8a457fdd4","Type":"ContainerStarted","Data":"82b982db64f8d4e47c1477a2d4b8c9f262b482fe3c14992acde54b7949ba1ce4"} Apr 16 14:52:32.160881 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:32.160859 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-101.ec2.internal" event={"ID":"57029cb5a3dfbbc5e89140d02a0a24ef","Type":"ContainerStarted","Data":"5935998da93367032a8b776cd59fba413008b1ab581b427fddb59776c4c53557"} Apr 16 14:52:32.415239 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:32.415168 2582 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:32.987494 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:32.987467 2582 apiserver.go:52] "Watching apiserver" Apr 16 14:52:32.998181 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:32.998158 2582 reflector.go:430] "Caches populated" 
type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 14:52:32.999115 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:32.999081 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fzdmh","kube-system/konnectivity-agent-xnxqc","kube-system/kube-apiserver-proxy-ip-10-0-139-101.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7jxtj","openshift-dns/node-resolver-z8k77","openshift-image-registry/node-ca-t4w4v","openshift-multus/multus-additional-cni-plugins-g2w5z","openshift-multus/network-metrics-daemon-6ltjv","openshift-network-diagnostics/network-check-target-zkgj2","openshift-cluster-node-tuning-operator/tuned-42wsp","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-101.ec2.internal","openshift-multus/multus-dm6rx","openshift-network-operator/iptables-alerter-6xl5n"] Apr 16 14:52:33.000706 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.000684 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g2w5z" Apr 16 14:52:33.002152 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.002131 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xnxqc" Apr 16 14:52:33.003436 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.003278 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zggwx\"" Apr 16 14:52:33.003436 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.003289 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 14:52:33.003436 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.003302 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 14:52:33.003436 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.003335 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 14:52:33.003436 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.003335 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 14:52:33.003436 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.003305 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 14:52:33.004438 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.004420 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 14:52:33.004512 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.004463 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-ptp4g\"" Apr 16 14:52:33.005070 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.005051 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 14:52:33.005350 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.005317 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7jxtj" Apr 16 14:52:33.005465 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.005395 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-z8k77" Apr 16 14:52:33.006932 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.006910 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-t4w4v" Apr 16 14:52:33.007587 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.007571 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 14:52:33.007676 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.007580 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 14:52:33.007740 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.007715 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 14:52:33.007740 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.007715 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 14:52:33.007740 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.007734 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-t52mz\"" Apr 16 14:52:33.007987 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.007813 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 14:52:33.007987 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.007856 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-6cbql\"" Apr 16 14:52:33.008553 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.008245 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.009222 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.008740 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 14:52:33.009222 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.009008 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 14:52:33.009343 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.009294 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 14:52:33.009535 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.009522 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-r2ffp\"" Apr 16 14:52:33.010135 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.010119 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:52:33.010228 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:33.010195 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6ltjv" podUID="3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e" Apr 16 14:52:33.010749 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.010733 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 14:52:33.011153 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.010917 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 14:52:33.011153 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.010929 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 14:52:33.011153 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.010946 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 14:52:33.011153 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.011018 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 14:52:33.011153 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.011160 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 14:52:33.011675 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.011659 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-tqtvc\"" Apr 16 14:52:33.012897 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.012879 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zkgj2" Apr 16 14:52:33.012973 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:33.012938 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zkgj2" podUID="8c2c36da-9263-4f56-8f34-dd26c0ce00c9" Apr 16 14:52:33.014426 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.014405 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.018140 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.018105 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-6795f\"" Apr 16 14:52:33.018299 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.018280 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 14:52:33.018364 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.018334 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:52:33.019962 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.019943 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-6xl5n" Apr 16 14:52:33.020044 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.019990 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.022238 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.022217 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 14:52:33.022329 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.022220 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:52:33.022329 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.022221 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-6s4dj\"" Apr 16 14:52:33.022529 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.022502 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 14:52:33.023191 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.023175 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-2h4df\"" Apr 16 14:52:33.023278 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.023207 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 14:52:33.026590 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.026571 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08f2c22e-77d0-4250-843e-95a65b09af16-ovn-node-metrics-cert\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.026692 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.026605 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/16a896b7-8655-40cd-b495-e93e72c07fb6-agent-certs\") pod \"konnectivity-agent-xnxqc\" (UID: \"16a896b7-8655-40cd-b495-e93e72c07fb6\") " pod="kube-system/konnectivity-agent-xnxqc" Apr 16 14:52:33.026692 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.026630 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7845j\" (UniqueName: \"kubernetes.io/projected/d26e8a35-54b1-4862-87ad-cab47e12e62d-kube-api-access-7845j\") pod \"node-resolver-z8k77\" (UID: \"d26e8a35-54b1-4862-87ad-cab47e12e62d\") " pod="openshift-dns/node-resolver-z8k77" Apr 16 14:52:33.026692 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.026653 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/240ab929-6399-4d7c-b583-aef03c5cc884-socket-dir\") pod \"aws-ebs-csi-driver-node-7jxtj\" (UID: \"240ab929-6399-4d7c-b583-aef03c5cc884\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7jxtj" Apr 16 14:52:33.026692 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.026677 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/d26e8a35-54b1-4862-87ad-cab47e12e62d-tmp-dir\") pod \"node-resolver-z8k77\" (UID: \"d26e8a35-54b1-4862-87ad-cab47e12e62d\") " pod="openshift-dns/node-resolver-z8k77" Apr 16 14:52:33.026913 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.026699 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pjt8\" (UniqueName: \"kubernetes.io/projected/6157833b-8b66-48ab-a248-7d79d51cec48-kube-api-access-9pjt8\") pod \"node-ca-t4w4v\" (UID: \"6157833b-8b66-48ab-a248-7d79d51cec48\") " pod="openshift-image-registry/node-ca-t4w4v" Apr 16 14:52:33.026913 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.026722 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/842ebc47-2cf2-4818-ab9a-d2f17f00f1da-os-release\") pod \"multus-additional-cni-plugins-g2w5z\" (UID: \"842ebc47-2cf2-4818-ab9a-d2f17f00f1da\") " pod="openshift-multus/multus-additional-cni-plugins-g2w5z" Apr 16 14:52:33.026913 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.026755 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/842ebc47-2cf2-4818-ab9a-d2f17f00f1da-cni-binary-copy\") pod \"multus-additional-cni-plugins-g2w5z\" (UID: \"842ebc47-2cf2-4818-ab9a-d2f17f00f1da\") " pod="openshift-multus/multus-additional-cni-plugins-g2w5z" Apr 16 14:52:33.026913 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.026780 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/240ab929-6399-4d7c-b583-aef03c5cc884-registration-dir\") pod \"aws-ebs-csi-driver-node-7jxtj\" (UID: \"240ab929-6399-4d7c-b583-aef03c5cc884\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7jxtj" Apr 16 14:52:33.026913 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.026803 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-host-kubelet\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.026913 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.026848 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-host-run-netns\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.026913 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.026880 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-node-log\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.026913 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.026903 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/08f2c22e-77d0-4250-843e-95a65b09af16-ovnkube-script-lib\") pod \"ovnkube-node-fzdmh\" (UID: 
\"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.027242 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.026927 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/842ebc47-2cf2-4818-ab9a-d2f17f00f1da-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g2w5z\" (UID: \"842ebc47-2cf2-4818-ab9a-d2f17f00f1da\") " pod="openshift-multus/multus-additional-cni-plugins-g2w5z" Apr 16 14:52:33.027242 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.026954 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/240ab929-6399-4d7c-b583-aef03c5cc884-device-dir\") pod \"aws-ebs-csi-driver-node-7jxtj\" (UID: \"240ab929-6399-4d7c-b583-aef03c5cc884\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7jxtj" Apr 16 14:52:33.027242 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.026977 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/240ab929-6399-4d7c-b583-aef03c5cc884-etc-selinux\") pod \"aws-ebs-csi-driver-node-7jxtj\" (UID: \"240ab929-6399-4d7c-b583-aef03c5cc884\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7jxtj" Apr 16 14:52:33.027242 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.027014 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt56z\" (UniqueName: \"kubernetes.io/projected/240ab929-6399-4d7c-b583-aef03c5cc884-kube-api-access-kt56z\") pod \"aws-ebs-csi-driver-node-7jxtj\" (UID: \"240ab929-6399-4d7c-b583-aef03c5cc884\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7jxtj" Apr 16 14:52:33.027242 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.027037 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-host-slash\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.027242 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.027059 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs\") pod \"network-metrics-daemon-6ltjv\" (UID: \"3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e\") " pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:52:33.027242 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.027088 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78qq5\" (UniqueName: \"kubernetes.io/projected/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-kube-api-access-78qq5\") pod \"network-metrics-daemon-6ltjv\" (UID: \"3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e\") " pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:52:33.027242 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.027118 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6157833b-8b66-48ab-a248-7d79d51cec48-serviceca\") pod \"node-ca-t4w4v\" (UID: 
\"6157833b-8b66-48ab-a248-7d79d51cec48\") " pod="openshift-image-registry/node-ca-t4w4v" Apr 16 14:52:33.027242 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.027150 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/240ab929-6399-4d7c-b583-aef03c5cc884-sys-fs\") pod \"aws-ebs-csi-driver-node-7jxtj\" (UID: \"240ab929-6399-4d7c-b583-aef03c5cc884\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7jxtj" Apr 16 14:52:33.027242 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.027176 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-var-lib-openvswitch\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.027242 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.027203 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42jct\" (UniqueName: \"kubernetes.io/projected/8c2c36da-9263-4f56-8f34-dd26c0ce00c9-kube-api-access-42jct\") pod \"network-check-target-zkgj2\" (UID: \"8c2c36da-9263-4f56-8f34-dd26c0ce00c9\") " pod="openshift-network-diagnostics/network-check-target-zkgj2" Apr 16 14:52:33.027242 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.027227 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/842ebc47-2cf2-4818-ab9a-d2f17f00f1da-cnibin\") pod \"multus-additional-cni-plugins-g2w5z\" (UID: \"842ebc47-2cf2-4818-ab9a-d2f17f00f1da\") " pod="openshift-multus/multus-additional-cni-plugins-g2w5z" Apr 16 14:52:33.027721 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.027254 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/842ebc47-2cf2-4818-ab9a-d2f17f00f1da-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g2w5z\" (UID: \"842ebc47-2cf2-4818-ab9a-d2f17f00f1da\") " pod="openshift-multus/multus-additional-cni-plugins-g2w5z" Apr 16 14:52:33.027721 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.027278 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/842ebc47-2cf2-4818-ab9a-d2f17f00f1da-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g2w5z\" (UID: \"842ebc47-2cf2-4818-ab9a-d2f17f00f1da\") " pod="openshift-multus/multus-additional-cni-plugins-g2w5z" Apr 16 14:52:33.027721 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.027312 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4mxg\" (UniqueName: \"kubernetes.io/projected/842ebc47-2cf2-4818-ab9a-d2f17f00f1da-kube-api-access-l4mxg\") pod \"multus-additional-cni-plugins-g2w5z\" (UID: \"842ebc47-2cf2-4818-ab9a-d2f17f00f1da\") " pod="openshift-multus/multus-additional-cni-plugins-g2w5z" Apr 16 14:52:33.027721 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.027343 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/240ab929-6399-4d7c-b583-aef03c5cc884-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7jxtj\" (UID: 
\"240ab929-6399-4d7c-b583-aef03c5cc884\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7jxtj" Apr 16 14:52:33.027721 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.027367 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-run-ovn\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.027721 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.027389 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-log-socket\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.027721 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.027412 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/842ebc47-2cf2-4818-ab9a-d2f17f00f1da-system-cni-dir\") pod \"multus-additional-cni-plugins-g2w5z\" (UID: \"842ebc47-2cf2-4818-ab9a-d2f17f00f1da\") " pod="openshift-multus/multus-additional-cni-plugins-g2w5z" Apr 16 14:52:33.027721 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.027438 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-etc-openvswitch\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.027721 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.027461 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-run-openvswitch\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.027721 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.027486 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-host-cni-netd\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.027721 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.027518 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08f2c22e-77d0-4250-843e-95a65b09af16-env-overrides\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.027721 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.027544 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8brj\" (UniqueName: \"kubernetes.io/projected/08f2c22e-77d0-4250-843e-95a65b09af16-kube-api-access-d8brj\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.027721 ip-10-0-139-101 
kubenswrapper[2582]: I0416 14:52:33.027567 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-host-run-ovn-kubernetes\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.027721 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.027595 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-host-cni-bin\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.027721 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.027629 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d26e8a35-54b1-4862-87ad-cab47e12e62d-hosts-file\") pod \"node-resolver-z8k77\" (UID: \"d26e8a35-54b1-4862-87ad-cab47e12e62d\") " pod="openshift-dns/node-resolver-z8k77" Apr 16 14:52:33.027721 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.027655 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-systemd-units\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.028446 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.027679 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-run-systemd\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.028446 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.027704 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/16a896b7-8655-40cd-b495-e93e72c07fb6-konnectivity-ca\") pod \"konnectivity-agent-xnxqc\" (UID: \"16a896b7-8655-40cd-b495-e93e72c07fb6\") " pod="kube-system/konnectivity-agent-xnxqc" Apr 16 14:52:33.028446 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.027729 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6157833b-8b66-48ab-a248-7d79d51cec48-host\") pod \"node-ca-t4w4v\" (UID: \"6157833b-8b66-48ab-a248-7d79d51cec48\") " pod="openshift-image-registry/node-ca-t4w4v" Apr 16 14:52:33.028446 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.027760 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.028446 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.027787 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/08f2c22e-77d0-4250-843e-95a65b09af16-ovnkube-config\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.061917 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.061887 2582 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 14:47:32 +0000 UTC" deadline="2028-02-01 19:39:24.811701132 +0000 UTC" Apr 16 14:52:33.062012 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.061917 2582 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15748h46m51.749788713s" Apr 16 14:52:33.112402 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.112377 2582 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 14:52:33.128789 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.128756 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-etc-openvswitch\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.128936 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.128794 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d26e8a35-54b1-4862-87ad-cab47e12e62d-hosts-file\") pod \"node-resolver-z8k77\" (UID: \"d26e8a35-54b1-4862-87ad-cab47e12e62d\") " pod="openshift-dns/node-resolver-z8k77" Apr 16 14:52:33.128936 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.128835 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-systemd-units\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.128936 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.128874 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-etc-openvswitch\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.128936 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.128884 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-systemd-units\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.128936 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.128909 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d26e8a35-54b1-4862-87ad-cab47e12e62d-hosts-file\") pod \"node-resolver-z8k77\" (UID: \"d26e8a35-54b1-4862-87ad-cab47e12e62d\") " pod="openshift-dns/node-resolver-z8k77" Apr 16 14:52:33.129187 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.128958 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-run-systemd\") pod 
\"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.129187 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.128996 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/16a896b7-8655-40cd-b495-e93e72c07fb6-konnectivity-ca\") pod \"konnectivity-agent-xnxqc\" (UID: \"16a896b7-8655-40cd-b495-e93e72c07fb6\") " pod="kube-system/konnectivity-agent-xnxqc" Apr 16 14:52:33.129187 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.128998 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-run-systemd\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.129187 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129022 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6157833b-8b66-48ab-a248-7d79d51cec48-host\") pod \"node-ca-t4w4v\" (UID: \"6157833b-8b66-48ab-a248-7d79d51cec48\") " pod="openshift-image-registry/node-ca-t4w4v" Apr 16 14:52:33.129187 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129052 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-cnibin\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.129187 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129068 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-host-run-multus-certs\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.129187 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129093 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.129187 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129112 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08f2c22e-77d0-4250-843e-95a65b09af16-ovn-node-metrics-cert\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.129187 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129137 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6157833b-8b66-48ab-a248-7d79d51cec48-host\") pod \"node-ca-t4w4v\" (UID: \"6157833b-8b66-48ab-a248-7d79d51cec48\") " pod="openshift-image-registry/node-ca-t4w4v" Apr 16 14:52:33.129187 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129155 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.129621 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129193 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/16a896b7-8655-40cd-b495-e93e72c07fb6-agent-certs\") pod \"konnectivity-agent-xnxqc\" (UID: \"16a896b7-8655-40cd-b495-e93e72c07fb6\") " pod="kube-system/konnectivity-agent-xnxqc" Apr 16 14:52:33.129621 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129222 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-etc-sysconfig\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.129621 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129249 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-multus-cni-dir\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.129621 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129282 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e4406ba0-59cd-4412-bdfe-3284d83e48a7-host-slash\") pod \"iptables-alerter-6xl5n\" (UID: \"e4406ba0-59cd-4412-bdfe-3284d83e48a7\") " pod="openshift-network-operator/iptables-alerter-6xl5n" Apr 16 14:52:33.129621 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129315 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9pjt8\" (UniqueName: \"kubernetes.io/projected/6157833b-8b66-48ab-a248-7d79d51cec48-kube-api-access-9pjt8\") pod \"node-ca-t4w4v\" (UID: \"6157833b-8b66-48ab-a248-7d79d51cec48\") " pod="openshift-image-registry/node-ca-t4w4v" Apr 16 14:52:33.129621 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129383 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/842ebc47-2cf2-4818-ab9a-d2f17f00f1da-cni-binary-copy\") pod \"multus-additional-cni-plugins-g2w5z\" (UID: \"842ebc47-2cf2-4818-ab9a-d2f17f00f1da\") " pod="openshift-multus/multus-additional-cni-plugins-g2w5z" Apr 16 14:52:33.129621 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129422 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-system-cni-dir\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.129621 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129450 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-host-run-netns\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.129621 ip-10-0-139-101 
kubenswrapper[2582]: I0416 14:52:33.129458 2582 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 14:52:33.129621 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129481 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-tmp\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.129621 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129519 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-host-run-netns\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.129621 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129528 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx4m8\" (UniqueName: \"kubernetes.io/projected/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-kube-api-access-mx4m8\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.129621 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129570 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e0e9dc65-caa5-437c-b911-96a930ff75fe-cni-binary-copy\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.129621 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129599 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5trjb\" (UniqueName: \"kubernetes.io/projected/e0e9dc65-caa5-437c-b911-96a930ff75fe-kube-api-access-5trjb\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.129621 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129628 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/240ab929-6399-4d7c-b583-aef03c5cc884-device-dir\") pod \"aws-ebs-csi-driver-node-7jxtj\" (UID: \"240ab929-6399-4d7c-b583-aef03c5cc884\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7jxtj" Apr 16 14:52:33.130330 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129649 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kt56z\" (UniqueName: \"kubernetes.io/projected/240ab929-6399-4d7c-b583-aef03c5cc884-kube-api-access-kt56z\") pod \"aws-ebs-csi-driver-node-7jxtj\" (UID: \"240ab929-6399-4d7c-b583-aef03c5cc884\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7jxtj" Apr 16 14:52:33.130330 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129668 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78qq5\" (UniqueName: \"kubernetes.io/projected/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-kube-api-access-78qq5\") pod \"network-metrics-daemon-6ltjv\" (UID: \"3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e\") " 
pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:52:33.130330 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129681 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/240ab929-6399-4d7c-b583-aef03c5cc884-device-dir\") pod \"aws-ebs-csi-driver-node-7jxtj\" (UID: \"240ab929-6399-4d7c-b583-aef03c5cc884\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7jxtj" Apr 16 14:52:33.130330 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129718 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6157833b-8b66-48ab-a248-7d79d51cec48-serviceca\") pod \"node-ca-t4w4v\" (UID: \"6157833b-8b66-48ab-a248-7d79d51cec48\") " pod="openshift-image-registry/node-ca-t4w4v" Apr 16 14:52:33.130330 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129778 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42jct\" (UniqueName: \"kubernetes.io/projected/8c2c36da-9263-4f56-8f34-dd26c0ce00c9-kube-api-access-42jct\") pod \"network-check-target-zkgj2\" (UID: \"8c2c36da-9263-4f56-8f34-dd26c0ce00c9\") " pod="openshift-network-diagnostics/network-check-target-zkgj2" Apr 16 14:52:33.130330 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129807 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-host-var-lib-cni-bin\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.130330 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129850 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-host-var-lib-cni-multus\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.130330 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129877 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7vmd\" (UniqueName: \"kubernetes.io/projected/e4406ba0-59cd-4412-bdfe-3284d83e48a7-kube-api-access-j7vmd\") pod \"iptables-alerter-6xl5n\" (UID: \"e4406ba0-59cd-4412-bdfe-3284d83e48a7\") " pod="openshift-network-operator/iptables-alerter-6xl5n" Apr 16 14:52:33.130330 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129902 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/842ebc47-2cf2-4818-ab9a-d2f17f00f1da-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g2w5z\" (UID: \"842ebc47-2cf2-4818-ab9a-d2f17f00f1da\") " pod="openshift-multus/multus-additional-cni-plugins-g2w5z" Apr 16 14:52:33.130330 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129926 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-etc-sysctl-conf\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.130330 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129947 2582 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-etc-systemd\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.130330 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129973 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/240ab929-6399-4d7c-b583-aef03c5cc884-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7jxtj\" (UID: \"240ab929-6399-4d7c-b583-aef03c5cc884\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7jxtj" Apr 16 14:52:33.130330 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.129997 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-run-ovn\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.130330 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130021 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/842ebc47-2cf2-4818-ab9a-d2f17f00f1da-system-cni-dir\") pod \"multus-additional-cni-plugins-g2w5z\" (UID: \"842ebc47-2cf2-4818-ab9a-d2f17f00f1da\") " pod="openshift-multus/multus-additional-cni-plugins-g2w5z" Apr 16 14:52:33.130330 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130047 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-host-var-lib-kubelet\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.130330 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130072 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-etc-kubernetes\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.131025 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130098 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e4406ba0-59cd-4412-bdfe-3284d83e48a7-iptables-alerter-script\") pod \"iptables-alerter-6xl5n\" (UID: \"e4406ba0-59cd-4412-bdfe-3284d83e48a7\") " pod="openshift-network-operator/iptables-alerter-6xl5n" Apr 16 14:52:33.131025 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130127 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-run-openvswitch\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.131025 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130152 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-host-cni-netd\") pod \"ovnkube-node-fzdmh\" (UID: 
\"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.131025 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130093 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/842ebc47-2cf2-4818-ab9a-d2f17f00f1da-cni-binary-copy\") pod \"multus-additional-cni-plugins-g2w5z\" (UID: \"842ebc47-2cf2-4818-ab9a-d2f17f00f1da\") " pod="openshift-multus/multus-additional-cni-plugins-g2w5z" Apr 16 14:52:33.131025 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130176 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/240ab929-6399-4d7c-b583-aef03c5cc884-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7jxtj\" (UID: \"240ab929-6399-4d7c-b583-aef03c5cc884\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7jxtj" Apr 16 14:52:33.131025 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130185 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/16a896b7-8655-40cd-b495-e93e72c07fb6-konnectivity-ca\") pod \"konnectivity-agent-xnxqc\" (UID: \"16a896b7-8655-40cd-b495-e93e72c07fb6\") " pod="kube-system/konnectivity-agent-xnxqc" Apr 16 14:52:33.131025 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130178 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08f2c22e-77d0-4250-843e-95a65b09af16-env-overrides\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.131025 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130150 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/842ebc47-2cf2-4818-ab9a-d2f17f00f1da-system-cni-dir\") pod \"multus-additional-cni-plugins-g2w5z\" (UID: \"842ebc47-2cf2-4818-ab9a-d2f17f00f1da\") " pod="openshift-multus/multus-additional-cni-plugins-g2w5z" Apr 16 14:52:33.131025 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130226 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6157833b-8b66-48ab-a248-7d79d51cec48-serviceca\") pod \"node-ca-t4w4v\" (UID: \"6157833b-8b66-48ab-a248-7d79d51cec48\") " pod="openshift-image-registry/node-ca-t4w4v" Apr 16 14:52:33.131025 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130265 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-host-cni-netd\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.131025 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130279 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-run-openvswitch\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.131025 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130233 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-run-ovn\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.131025 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130300 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d8brj\" (UniqueName: \"kubernetes.io/projected/08f2c22e-77d0-4250-843e-95a65b09af16-kube-api-access-d8brj\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.131025 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130333 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-etc-modprobe-d\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.131025 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130405 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-etc-sysctl-d\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.131025 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130431 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-sys\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.131025 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130460 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-os-release\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.131838 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130477 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-host-run-ovn-kubernetes\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.131838 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130499 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-host-run-ovn-kubernetes\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.131838 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130512 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-host-cni-bin\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.131838 ip-10-0-139-101 kubenswrapper[2582]: I0416 
14:52:33.130558 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-host-cni-bin\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.131838 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130593 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-host\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.131838 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130621 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-multus-socket-dir-parent\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.131838 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130643 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08f2c22e-77d0-4250-843e-95a65b09af16-env-overrides\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.131838 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130646 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e0e9dc65-caa5-437c-b911-96a930ff75fe-multus-daemon-config\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.131838 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130685 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08f2c22e-77d0-4250-843e-95a65b09af16-ovnkube-config\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.131838 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130687 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/842ebc47-2cf2-4818-ab9a-d2f17f00f1da-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g2w5z\" (UID: \"842ebc47-2cf2-4818-ab9a-d2f17f00f1da\") " pod="openshift-multus/multus-additional-cni-plugins-g2w5z" Apr 16 14:52:33.131838 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130711 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-host-run-k8s-cni-cncf-io\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.131838 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130730 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7845j\" (UniqueName: \"kubernetes.io/projected/d26e8a35-54b1-4862-87ad-cab47e12e62d-kube-api-access-7845j\") pod \"node-resolver-z8k77\" (UID: 
\"d26e8a35-54b1-4862-87ad-cab47e12e62d\") " pod="openshift-dns/node-resolver-z8k77" Apr 16 14:52:33.131838 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130745 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/240ab929-6399-4d7c-b583-aef03c5cc884-socket-dir\") pod \"aws-ebs-csi-driver-node-7jxtj\" (UID: \"240ab929-6399-4d7c-b583-aef03c5cc884\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7jxtj" Apr 16 14:52:33.131838 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130760 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d26e8a35-54b1-4862-87ad-cab47e12e62d-tmp-dir\") pod \"node-resolver-z8k77\" (UID: \"d26e8a35-54b1-4862-87ad-cab47e12e62d\") " pod="openshift-dns/node-resolver-z8k77" Apr 16 14:52:33.131838 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130781 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/842ebc47-2cf2-4818-ab9a-d2f17f00f1da-os-release\") pod \"multus-additional-cni-plugins-g2w5z\" (UID: \"842ebc47-2cf2-4818-ab9a-d2f17f00f1da\") " pod="openshift-multus/multus-additional-cni-plugins-g2w5z" Apr 16 14:52:33.131838 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130803 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/240ab929-6399-4d7c-b583-aef03c5cc884-registration-dir\") pod \"aws-ebs-csi-driver-node-7jxtj\" (UID: \"240ab929-6399-4d7c-b583-aef03c5cc884\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7jxtj" Apr 16 14:52:33.131838 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130844 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-host-kubelet\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.132456 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130867 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-node-log\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.132456 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130887 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/08f2c22e-77d0-4250-843e-95a65b09af16-ovnkube-script-lib\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.132456 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130904 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/842ebc47-2cf2-4818-ab9a-d2f17f00f1da-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g2w5z\" (UID: \"842ebc47-2cf2-4818-ab9a-d2f17f00f1da\") " pod="openshift-multus/multus-additional-cni-plugins-g2w5z" Apr 16 14:52:33.132456 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130921 2582 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-etc-kubernetes\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.132456 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130941 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/240ab929-6399-4d7c-b583-aef03c5cc884-etc-selinux\") pod \"aws-ebs-csi-driver-node-7jxtj\" (UID: \"240ab929-6399-4d7c-b583-aef03c5cc884\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7jxtj" Apr 16 14:52:33.132456 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130959 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-host-slash\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.132456 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.130980 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs\") pod \"network-metrics-daemon-6ltjv\" (UID: \"3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e\") " pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:52:33.132456 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.131007 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/842ebc47-2cf2-4818-ab9a-d2f17f00f1da-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g2w5z\" (UID: \"842ebc47-2cf2-4818-ab9a-d2f17f00f1da\") " pod="openshift-multus/multus-additional-cni-plugins-g2w5z" Apr 16 14:52:33.132456 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.131010 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-node-log\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.132456 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.131041 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-hostroot\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.132456 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.131072 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-multus-conf-dir\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.132456 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.131092 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/240ab929-6399-4d7c-b583-aef03c5cc884-sys-fs\") pod \"aws-ebs-csi-driver-node-7jxtj\" (UID: \"240ab929-6399-4d7c-b583-aef03c5cc884\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7jxtj" Apr 16 14:52:33.132456 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.131113 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-var-lib-openvswitch\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.132456 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.131132 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/842ebc47-2cf2-4818-ab9a-d2f17f00f1da-cnibin\") pod \"multus-additional-cni-plugins-g2w5z\" (UID: \"842ebc47-2cf2-4818-ab9a-d2f17f00f1da\") " pod="openshift-multus/multus-additional-cni-plugins-g2w5z" Apr 16 14:52:33.132456 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.131148 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4mxg\" (UniqueName: \"kubernetes.io/projected/842ebc47-2cf2-4818-ab9a-d2f17f00f1da-kube-api-access-l4mxg\") pod \"multus-additional-cni-plugins-g2w5z\" (UID: \"842ebc47-2cf2-4818-ab9a-d2f17f00f1da\") " pod="openshift-multus/multus-additional-cni-plugins-g2w5z" Apr 16 14:52:33.132456 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.131163 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-run\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.132456 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.131184 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-lib-modules\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.133229 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.131198 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-etc-tuned\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.133229 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.131212 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-host-run-netns\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.133229 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.131227 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-log-socket\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.133229 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.131243 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-var-lib-kubelet\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.133229 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.131320 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d26e8a35-54b1-4862-87ad-cab47e12e62d-tmp-dir\") pod \"node-resolver-z8k77\" (UID: \"d26e8a35-54b1-4862-87ad-cab47e12e62d\") " pod="openshift-dns/node-resolver-z8k77" Apr 16 14:52:33.133229 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.131388 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/240ab929-6399-4d7c-b583-aef03c5cc884-socket-dir\") pod \"aws-ebs-csi-driver-node-7jxtj\" (UID: \"240ab929-6399-4d7c-b583-aef03c5cc884\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7jxtj" Apr 16 14:52:33.133229 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.131412 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/842ebc47-2cf2-4818-ab9a-d2f17f00f1da-os-release\") pod \"multus-additional-cni-plugins-g2w5z\" (UID: \"842ebc47-2cf2-4818-ab9a-d2f17f00f1da\") " pod="openshift-multus/multus-additional-cni-plugins-g2w5z" Apr 16 14:52:33.133229 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.131578 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/240ab929-6399-4d7c-b583-aef03c5cc884-registration-dir\") pod \"aws-ebs-csi-driver-node-7jxtj\" (UID: \"240ab929-6399-4d7c-b583-aef03c5cc884\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7jxtj" Apr 16 14:52:33.133229 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.131599 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08f2c22e-77d0-4250-843e-95a65b09af16-ovnkube-config\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.133229 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:33.131694 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:33.133229 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.131719 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/240ab929-6399-4d7c-b583-aef03c5cc884-etc-selinux\") pod \"aws-ebs-csi-driver-node-7jxtj\" (UID: \"240ab929-6399-4d7c-b583-aef03c5cc884\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7jxtj" Apr 16 14:52:33.133229 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.131720 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-host-slash\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.133229 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.131744 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/08f2c22e-77d0-4250-843e-95a65b09af16-ovnkube-script-lib\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.133229 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:33.131776 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs podName:3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e nodeName:}" failed. No retries permitted until 2026-04-16 14:52:33.63174677 +0000 UTC m=+3.128074001 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs") pod "network-metrics-daemon-6ltjv" (UID: "3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:33.133229 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.131867 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/240ab929-6399-4d7c-b583-aef03c5cc884-sys-fs\") pod \"aws-ebs-csi-driver-node-7jxtj\" (UID: \"240ab929-6399-4d7c-b583-aef03c5cc884\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7jxtj" Apr 16 14:52:33.133229 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.131907 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-var-lib-openvswitch\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.133229 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.131950 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/842ebc47-2cf2-4818-ab9a-d2f17f00f1da-cnibin\") pod \"multus-additional-cni-plugins-g2w5z\" (UID: \"842ebc47-2cf2-4818-ab9a-d2f17f00f1da\") " pod="openshift-multus/multus-additional-cni-plugins-g2w5z" Apr 16 14:52:33.133981 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.131967 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/842ebc47-2cf2-4818-ab9a-d2f17f00f1da-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g2w5z\" (UID: \"842ebc47-2cf2-4818-ab9a-d2f17f00f1da\") " pod="openshift-multus/multus-additional-cni-plugins-g2w5z" Apr 16 14:52:33.133981 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.132015 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-host-kubelet\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.133981 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.132054 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/08f2c22e-77d0-4250-843e-95a65b09af16-log-socket\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.133981 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.132198 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/842ebc47-2cf2-4818-ab9a-d2f17f00f1da-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g2w5z\" (UID: \"842ebc47-2cf2-4818-ab9a-d2f17f00f1da\") " pod="openshift-multus/multus-additional-cni-plugins-g2w5z" Apr 16 14:52:33.133981 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.133502 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08f2c22e-77d0-4250-843e-95a65b09af16-ovn-node-metrics-cert\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.133981 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.133550 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/16a896b7-8655-40cd-b495-e93e72c07fb6-agent-certs\") pod \"konnectivity-agent-xnxqc\" (UID: \"16a896b7-8655-40cd-b495-e93e72c07fb6\") " pod="kube-system/konnectivity-agent-xnxqc" Apr 16 14:52:33.145121 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:33.145095 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:33.145396 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:33.145125 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:33.145396 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:33.145139 2582 projected.go:194] Error preparing data for projected volume kube-api-access-42jct for pod openshift-network-diagnostics/network-check-target-zkgj2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:33.145396 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:33.145208 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8c2c36da-9263-4f56-8f34-dd26c0ce00c9-kube-api-access-42jct podName:8c2c36da-9263-4f56-8f34-dd26c0ce00c9 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:33.645190617 +0000 UTC m=+3.141517826 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-42jct" (UniqueName: "kubernetes.io/projected/8c2c36da-9263-4f56-8f34-dd26c0ce00c9-kube-api-access-42jct") pod "network-check-target-zkgj2" (UID: "8c2c36da-9263-4f56-8f34-dd26c0ce00c9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:33.147495 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.147420 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pjt8\" (UniqueName: \"kubernetes.io/projected/6157833b-8b66-48ab-a248-7d79d51cec48-kube-api-access-9pjt8\") pod \"node-ca-t4w4v\" (UID: \"6157833b-8b66-48ab-a248-7d79d51cec48\") " pod="openshift-image-registry/node-ca-t4w4v" Apr 16 14:52:33.147495 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.147486 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4mxg\" (UniqueName: \"kubernetes.io/projected/842ebc47-2cf2-4818-ab9a-d2f17f00f1da-kube-api-access-l4mxg\") pod \"multus-additional-cni-plugins-g2w5z\" (UID: \"842ebc47-2cf2-4818-ab9a-d2f17f00f1da\") " pod="openshift-multus/multus-additional-cni-plugins-g2w5z" Apr 16 14:52:33.147675 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.147611 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8brj\" (UniqueName: \"kubernetes.io/projected/08f2c22e-77d0-4250-843e-95a65b09af16-kube-api-access-d8brj\") pod \"ovnkube-node-fzdmh\" (UID: \"08f2c22e-77d0-4250-843e-95a65b09af16\") " pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.148653 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.148583 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt56z\" (UniqueName: \"kubernetes.io/projected/240ab929-6399-4d7c-b583-aef03c5cc884-kube-api-access-kt56z\") pod \"aws-ebs-csi-driver-node-7jxtj\" (UID: \"240ab929-6399-4d7c-b583-aef03c5cc884\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7jxtj" Apr 16 14:52:33.148907 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.148885 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7845j\" (UniqueName: \"kubernetes.io/projected/d26e8a35-54b1-4862-87ad-cab47e12e62d-kube-api-access-7845j\") pod \"node-resolver-z8k77\" (UID: \"d26e8a35-54b1-4862-87ad-cab47e12e62d\") " pod="openshift-dns/node-resolver-z8k77" Apr 16 14:52:33.149230 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.149214 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78qq5\" (UniqueName: \"kubernetes.io/projected/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-kube-api-access-78qq5\") pod \"network-metrics-daemon-6ltjv\" (UID: \"3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e\") " pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:52:33.232187 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232164 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-etc-modprobe-d\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.232187 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232192 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-etc-sysctl-d\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.232397 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232215 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-sys\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.232397 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232253 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-os-release\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.232397 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232278 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-host\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.232397 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232302 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-multus-socket-dir-parent\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.232397 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232317 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-sys\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.232397 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232327 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e0e9dc65-caa5-437c-b911-96a930ff75fe-multus-daemon-config\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.232397 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232362 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-host-run-k8s-cni-cncf-io\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.232397 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232377 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-etc-sysctl-d\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.232397 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232395 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-etc-kubernetes\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.232796 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232378 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-etc-modprobe-d\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.232796 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232395 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-host\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.232796 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232398 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-multus-socket-dir-parent\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.232796 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232431 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-host-run-k8s-cni-cncf-io\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.232796 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232440 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-os-release\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.232796 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232450 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-etc-kubernetes\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.232796 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232537 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-hostroot\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.232796 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232563 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-multus-conf-dir\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.232796 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232596 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-run\") pod 
\"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.232796 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232619 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-lib-modules\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.232796 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232651 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-etc-tuned\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.232796 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232674 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-host-run-netns\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.232796 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232690 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-run\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.232796 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232699 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-var-lib-kubelet\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.232796 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232731 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-cnibin\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.232796 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232737 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-multus-conf-dir\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.232796 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232650 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-hostroot\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.232796 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232755 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-host-run-multus-certs\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " 
pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.233597 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232792 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-etc-sysconfig\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.233597 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232814 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-multus-cni-dir\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.233597 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232857 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e4406ba0-59cd-4412-bdfe-3284d83e48a7-host-slash\") pod \"iptables-alerter-6xl5n\" (UID: \"e4406ba0-59cd-4412-bdfe-3284d83e48a7\") " pod="openshift-network-operator/iptables-alerter-6xl5n" Apr 16 14:52:33.233597 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232860 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-lib-modules\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.233597 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232877 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e0e9dc65-caa5-437c-b911-96a930ff75fe-multus-daemon-config\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.233597 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232883 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-system-cni-dir\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.233597 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232908 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-tmp\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.233597 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232923 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-host-run-multus-certs\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.233597 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232932 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mx4m8\" (UniqueName: \"kubernetes.io/projected/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-kube-api-access-mx4m8\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " 
pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.233597 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232957 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-host-run-netns\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.233597 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.232960 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e0e9dc65-caa5-437c-b911-96a930ff75fe-cni-binary-copy\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.233597 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.233024 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5trjb\" (UniqueName: \"kubernetes.io/projected/e0e9dc65-caa5-437c-b911-96a930ff75fe-kube-api-access-5trjb\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.233597 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.233070 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-host-var-lib-cni-bin\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.233597 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.233084 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-var-lib-kubelet\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.233597 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.233095 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-host-var-lib-cni-multus\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.233597 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.233098 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-cnibin\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.233597 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.233117 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-multus-cni-dir\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.233597 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.233122 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7vmd\" (UniqueName: \"kubernetes.io/projected/e4406ba0-59cd-4412-bdfe-3284d83e48a7-kube-api-access-j7vmd\") pod \"iptables-alerter-6xl5n\" (UID: \"e4406ba0-59cd-4412-bdfe-3284d83e48a7\") " 
pod="openshift-network-operator/iptables-alerter-6xl5n" Apr 16 14:52:33.234252 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.233131 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e4406ba0-59cd-4412-bdfe-3284d83e48a7-host-slash\") pod \"iptables-alerter-6xl5n\" (UID: \"e4406ba0-59cd-4412-bdfe-3284d83e48a7\") " pod="openshift-network-operator/iptables-alerter-6xl5n" Apr 16 14:52:33.234252 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.233160 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-host-var-lib-cni-multus\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.234252 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.233097 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-etc-sysconfig\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.234252 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.233159 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-etc-sysctl-conf\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.234252 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.233205 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-host-var-lib-cni-bin\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.234252 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.233228 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-etc-systemd\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.234252 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.233274 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-etc-systemd\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.234252 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.233277 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-system-cni-dir\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.234252 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.233314 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-host-var-lib-kubelet\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " 
pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.234252 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.233353 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-host-var-lib-kubelet\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.234252 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.233382 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-etc-kubernetes\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.234252 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.233393 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-etc-sysctl-conf\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.234252 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.233407 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e4406ba0-59cd-4412-bdfe-3284d83e48a7-iptables-alerter-script\") pod \"iptables-alerter-6xl5n\" (UID: \"e4406ba0-59cd-4412-bdfe-3284d83e48a7\") " pod="openshift-network-operator/iptables-alerter-6xl5n" Apr 16 14:52:33.234252 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.233410 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e0e9dc65-caa5-437c-b911-96a930ff75fe-cni-binary-copy\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.234252 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.233432 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0e9dc65-caa5-437c-b911-96a930ff75fe-etc-kubernetes\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.234252 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.234230 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e4406ba0-59cd-4412-bdfe-3284d83e48a7-iptables-alerter-script\") pod \"iptables-alerter-6xl5n\" (UID: \"e4406ba0-59cd-4412-bdfe-3284d83e48a7\") " pod="openshift-network-operator/iptables-alerter-6xl5n" Apr 16 14:52:33.235349 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.235325 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-etc-tuned\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.235456 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.235367 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-tmp\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " 
pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.241102 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.241042 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx4m8\" (UniqueName: \"kubernetes.io/projected/e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea-kube-api-access-mx4m8\") pod \"tuned-42wsp\" (UID: \"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea\") " pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.241670 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.241653 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5trjb\" (UniqueName: \"kubernetes.io/projected/e0e9dc65-caa5-437c-b911-96a930ff75fe-kube-api-access-5trjb\") pod \"multus-dm6rx\" (UID: \"e0e9dc65-caa5-437c-b911-96a930ff75fe\") " pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.241752 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.241728 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7vmd\" (UniqueName: \"kubernetes.io/projected/e4406ba0-59cd-4412-bdfe-3284d83e48a7-kube-api-access-j7vmd\") pod \"iptables-alerter-6xl5n\" (UID: \"e4406ba0-59cd-4412-bdfe-3284d83e48a7\") " pod="openshift-network-operator/iptables-alerter-6xl5n" Apr 16 14:52:33.287803 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.287779 2582 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:33.312903 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.312879 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g2w5z" Apr 16 14:52:33.321496 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.321476 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xnxqc" Apr 16 14:52:33.331119 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.331100 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7jxtj" Apr 16 14:52:33.335680 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.335665 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-z8k77" Apr 16 14:52:33.341169 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.341150 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-t4w4v" Apr 16 14:52:33.347744 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.347727 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:33.353279 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.353261 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-42wsp" Apr 16 14:52:33.359850 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.359816 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dm6rx" Apr 16 14:52:33.365465 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.365443 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-6xl5n" Apr 16 14:52:33.462735 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.462698 2582 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:33.635877 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.635782 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs\") pod \"network-metrics-daemon-6ltjv\" (UID: \"3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e\") " pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:52:33.636031 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:33.635921 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:33.636031 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:33.635976 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs podName:3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e nodeName:}" failed. No retries permitted until 2026-04-16 14:52:34.63596157 +0000 UTC m=+4.132288778 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs") pod "network-metrics-daemon-6ltjv" (UID: "3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:33.684321 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:33.684238 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod240ab929_6399_4d7c_b583_aef03c5cc884.slice/crio-362f64c8876723fa991a166226cc838e536ec3d2f77784dcf5e8adff2993b19c WatchSource:0}: Error finding container 362f64c8876723fa991a166226cc838e536ec3d2f77784dcf5e8adff2993b19c: Status 404 returned error can't find the container with id 362f64c8876723fa991a166226cc838e536ec3d2f77784dcf5e8adff2993b19c Apr 16 14:52:33.686802 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:33.686779 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6157833b_8b66_48ab_a248_7d79d51cec48.slice/crio-88c0859c65f147d2aafcecbd287ce058763f16b96859a03c7b9332d82d3835ae WatchSource:0}: Error finding container 88c0859c65f147d2aafcecbd287ce058763f16b96859a03c7b9332d82d3835ae: Status 404 returned error can't find the container with id 88c0859c65f147d2aafcecbd287ce058763f16b96859a03c7b9332d82d3835ae Apr 16 14:52:33.688509 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:33.688442 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4406ba0_59cd_4412_bdfe_3284d83e48a7.slice/crio-c4be48fc462b50cfcc677cdecc3d5e58f3cf357402000923d33bad88e05c3c75 WatchSource:0}: Error finding container c4be48fc462b50cfcc677cdecc3d5e58f3cf357402000923d33bad88e05c3c75: Status 404 returned error can't find the container with id c4be48fc462b50cfcc677cdecc3d5e58f3cf357402000923d33bad88e05c3c75 Apr 16 14:52:33.689938 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:33.689744 2582 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0e9dc65_caa5_437c_b911_96a930ff75fe.slice/crio-c5adbeec1b6f39a6b5a247ee10c706eaae258455326af213980684dbcf87b6da WatchSource:0}: Error finding container c5adbeec1b6f39a6b5a247ee10c706eaae258455326af213980684dbcf87b6da: Status 404 returned error can't find the container with id c5adbeec1b6f39a6b5a247ee10c706eaae258455326af213980684dbcf87b6da Apr 16 14:52:33.690570 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:33.690547 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd26e8a35_54b1_4862_87ad_cab47e12e62d.slice/crio-0135fa12142bd4e9880d990ffbf54a263270f1428c51839e41c4621f532783f3 WatchSource:0}: Error finding container 0135fa12142bd4e9880d990ffbf54a263270f1428c51839e41c4621f532783f3: Status 404 returned error can't find the container with id 0135fa12142bd4e9880d990ffbf54a263270f1428c51839e41c4621f532783f3 Apr 16 14:52:33.691634 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:33.691430 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08f2c22e_77d0_4250_843e_95a65b09af16.slice/crio-3ebbd6e090b7f39c3a1ec3ea5260de1dd11dfe791b221f85214ef1a98ac99c94 WatchSource:0}: Error finding container 3ebbd6e090b7f39c3a1ec3ea5260de1dd11dfe791b221f85214ef1a98ac99c94: Status 404 returned error can't find the container with id 3ebbd6e090b7f39c3a1ec3ea5260de1dd11dfe791b221f85214ef1a98ac99c94 Apr 16 14:52:33.692339 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:33.692315 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0b17a8e_2ea7_490f_8ad4_c7ab6d453eea.slice/crio-f7466a55737f50d15f1e9e5262493105e82305117d3b9394406241bae7bfe4eb WatchSource:0}: Error finding container f7466a55737f50d15f1e9e5262493105e82305117d3b9394406241bae7bfe4eb: Status 404 returned error can't find the container with id f7466a55737f50d15f1e9e5262493105e82305117d3b9394406241bae7bfe4eb Apr 16 14:52:33.694367 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:33.694332 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16a896b7_8655_40cd_b495_e93e72c07fb6.slice/crio-5065da57cb2427c55ffd1d0aaa400f78abab9abaa83bf916868f69a34bc130a0 WatchSource:0}: Error finding container 5065da57cb2427c55ffd1d0aaa400f78abab9abaa83bf916868f69a34bc130a0: Status 404 returned error can't find the container with id 5065da57cb2427c55ffd1d0aaa400f78abab9abaa83bf916868f69a34bc130a0 Apr 16 14:52:33.695309 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:52:33.695271 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod842ebc47_2cf2_4818_ab9a_d2f17f00f1da.slice/crio-fe901b17177a4341c74b3f29c2c0589d96433fab42d0c81b13ceb70a63aa61f9 WatchSource:0}: Error finding container fe901b17177a4341c74b3f29c2c0589d96433fab42d0c81b13ceb70a63aa61f9: Status 404 returned error can't find the container with id fe901b17177a4341c74b3f29c2c0589d96433fab42d0c81b13ceb70a63aa61f9 Apr 16 14:52:33.736947 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:33.736923 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42jct\" (UniqueName: \"kubernetes.io/projected/8c2c36da-9263-4f56-8f34-dd26c0ce00c9-kube-api-access-42jct\") pod \"network-check-target-zkgj2\" (UID: 
\"8c2c36da-9263-4f56-8f34-dd26c0ce00c9\") " pod="openshift-network-diagnostics/network-check-target-zkgj2" Apr 16 14:52:33.737057 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:33.737044 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:33.737117 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:33.737062 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:33.737117 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:33.737073 2582 projected.go:194] Error preparing data for projected volume kube-api-access-42jct for pod openshift-network-diagnostics/network-check-target-zkgj2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:33.737211 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:33.737129 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8c2c36da-9263-4f56-8f34-dd26c0ce00c9-kube-api-access-42jct podName:8c2c36da-9263-4f56-8f34-dd26c0ce00c9 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:34.737110552 +0000 UTC m=+4.233437772 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-42jct" (UniqueName: "kubernetes.io/projected/8c2c36da-9263-4f56-8f34-dd26c0ce00c9-kube-api-access-42jct") pod "network-check-target-zkgj2" (UID: "8c2c36da-9263-4f56-8f34-dd26c0ce00c9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:34.062632 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:34.062440 2582 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 14:47:32 +0000 UTC" deadline="2027-09-29 03:09:34.729692689 +0000 UTC" Apr 16 14:52:34.062632 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:34.062555 2582 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12732h17m0.667142667s" Apr 16 14:52:34.180149 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:34.180110 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2w5z" event={"ID":"842ebc47-2cf2-4818-ab9a-d2f17f00f1da","Type":"ContainerStarted","Data":"fe901b17177a4341c74b3f29c2c0589d96433fab42d0c81b13ceb70a63aa61f9"} Apr 16 14:52:34.192170 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:34.192058 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xnxqc" event={"ID":"16a896b7-8655-40cd-b495-e93e72c07fb6","Type":"ContainerStarted","Data":"5065da57cb2427c55ffd1d0aaa400f78abab9abaa83bf916868f69a34bc130a0"} Apr 16 14:52:34.194863 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:34.194818 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-42wsp" event={"ID":"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea","Type":"ContainerStarted","Data":"f7466a55737f50d15f1e9e5262493105e82305117d3b9394406241bae7bfe4eb"} Apr 16 14:52:34.199109 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:34.199083 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dm6rx" 
event={"ID":"e0e9dc65-caa5-437c-b911-96a930ff75fe","Type":"ContainerStarted","Data":"c5adbeec1b6f39a6b5a247ee10c706eaae258455326af213980684dbcf87b6da"} Apr 16 14:52:34.211472 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:34.211443 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-6xl5n" event={"ID":"e4406ba0-59cd-4412-bdfe-3284d83e48a7","Type":"ContainerStarted","Data":"c4be48fc462b50cfcc677cdecc3d5e58f3cf357402000923d33bad88e05c3c75"} Apr 16 14:52:34.221366 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:34.221310 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7jxtj" event={"ID":"240ab929-6399-4d7c-b583-aef03c5cc884","Type":"ContainerStarted","Data":"362f64c8876723fa991a166226cc838e536ec3d2f77784dcf5e8adff2993b19c"} Apr 16 14:52:34.233340 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:34.232624 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-101.ec2.internal" event={"ID":"57029cb5a3dfbbc5e89140d02a0a24ef","Type":"ContainerStarted","Data":"3724f0fef103efc2638532f7c77e3d847dba95179029753665ae50572a50c2fe"} Apr 16 14:52:34.235458 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:34.235431 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" event={"ID":"08f2c22e-77d0-4250-843e-95a65b09af16","Type":"ContainerStarted","Data":"3ebbd6e090b7f39c3a1ec3ea5260de1dd11dfe791b221f85214ef1a98ac99c94"} Apr 16 14:52:34.247431 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:34.246852 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-101.ec2.internal" podStartSLOduration=2.246817834 podStartE2EDuration="2.246817834s" podCreationTimestamp="2026-04-16 14:52:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:52:34.246097238 +0000 UTC m=+3.742424469" watchObservedRunningTime="2026-04-16 14:52:34.246817834 +0000 UTC m=+3.743145064" Apr 16 14:52:34.250492 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:34.250468 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-z8k77" event={"ID":"d26e8a35-54b1-4862-87ad-cab47e12e62d","Type":"ContainerStarted","Data":"0135fa12142bd4e9880d990ffbf54a263270f1428c51839e41c4621f532783f3"} Apr 16 14:52:34.255903 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:34.255880 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-t4w4v" event={"ID":"6157833b-8b66-48ab-a248-7d79d51cec48","Type":"ContainerStarted","Data":"88c0859c65f147d2aafcecbd287ce058763f16b96859a03c7b9332d82d3835ae"} Apr 16 14:52:34.645020 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:34.644987 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs\") pod \"network-metrics-daemon-6ltjv\" (UID: \"3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e\") " pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:52:34.645246 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:34.645224 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:34.645316 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:34.645298 2582 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs podName:3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e nodeName:}" failed. No retries permitted until 2026-04-16 14:52:36.645278522 +0000 UTC m=+6.141605735 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs") pod "network-metrics-daemon-6ltjv" (UID: "3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:34.745605 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:34.745576 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42jct\" (UniqueName: \"kubernetes.io/projected/8c2c36da-9263-4f56-8f34-dd26c0ce00c9-kube-api-access-42jct\") pod \"network-check-target-zkgj2\" (UID: \"8c2c36da-9263-4f56-8f34-dd26c0ce00c9\") " pod="openshift-network-diagnostics/network-check-target-zkgj2" Apr 16 14:52:34.745727 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:34.745717 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:34.745788 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:34.745735 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:34.745788 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:34.745747 2582 projected.go:194] Error preparing data for projected volume kube-api-access-42jct for pod openshift-network-diagnostics/network-check-target-zkgj2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:34.746002 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:34.745799 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8c2c36da-9263-4f56-8f34-dd26c0ce00c9-kube-api-access-42jct podName:8c2c36da-9263-4f56-8f34-dd26c0ce00c9 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:36.745779822 +0000 UTC m=+6.242107043 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-42jct" (UniqueName: "kubernetes.io/projected/8c2c36da-9263-4f56-8f34-dd26c0ce00c9-kube-api-access-42jct") pod "network-check-target-zkgj2" (UID: "8c2c36da-9263-4f56-8f34-dd26c0ce00c9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:35.159000 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:35.158971 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zkgj2" Apr 16 14:52:35.159433 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:35.159098 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zkgj2" podUID="8c2c36da-9263-4f56-8f34-dd26c0ce00c9" Apr 16 14:52:35.159433 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:35.159146 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:52:35.159433 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:35.159276 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ltjv" podUID="3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e" Apr 16 14:52:35.293369 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:35.293335 2582 generic.go:358] "Generic (PLEG): container finished" podID="43fc43a7ac83103e37932dc8a457fdd4" containerID="ede77f37579b115f9514f4877e70ddcaf73c62a0b5c42fac0c26f64bb9a9896b" exitCode=0 Apr 16 14:52:35.294330 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:35.294276 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-101.ec2.internal" event={"ID":"43fc43a7ac83103e37932dc8a457fdd4","Type":"ContainerDied","Data":"ede77f37579b115f9514f4877e70ddcaf73c62a0b5c42fac0c26f64bb9a9896b"} Apr 16 14:52:36.300070 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:36.300031 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-101.ec2.internal" event={"ID":"43fc43a7ac83103e37932dc8a457fdd4","Type":"ContainerStarted","Data":"1fb520bfc48cfccf0581778f81a19c8895c6c8e57b03fd4c465eecf0cbe9fc4e"} Apr 16 14:52:36.660756 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:36.660212 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs\") pod \"network-metrics-daemon-6ltjv\" (UID: \"3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e\") " pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:52:36.660756 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:36.660334 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:36.660756 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:36.660386 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs podName:3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e nodeName:}" failed. No retries permitted until 2026-04-16 14:52:40.660369573 +0000 UTC m=+10.156696787 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs") pod "network-metrics-daemon-6ltjv" (UID: "3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:36.761100 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:36.760524 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42jct\" (UniqueName: \"kubernetes.io/projected/8c2c36da-9263-4f56-8f34-dd26c0ce00c9-kube-api-access-42jct\") pod \"network-check-target-zkgj2\" (UID: \"8c2c36da-9263-4f56-8f34-dd26c0ce00c9\") " pod="openshift-network-diagnostics/network-check-target-zkgj2" Apr 16 14:52:36.761100 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:36.760693 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:36.761100 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:36.760715 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:36.761100 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:36.760728 2582 projected.go:194] Error preparing data for projected volume kube-api-access-42jct for pod openshift-network-diagnostics/network-check-target-zkgj2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:36.761100 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:36.760786 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8c2c36da-9263-4f56-8f34-dd26c0ce00c9-kube-api-access-42jct podName:8c2c36da-9263-4f56-8f34-dd26c0ce00c9 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:40.760767593 +0000 UTC m=+10.257094809 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-42jct" (UniqueName: "kubernetes.io/projected/8c2c36da-9263-4f56-8f34-dd26c0ce00c9-kube-api-access-42jct") pod "network-check-target-zkgj2" (UID: "8c2c36da-9263-4f56-8f34-dd26c0ce00c9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:37.159040 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:37.158962 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zkgj2" Apr 16 14:52:37.159182 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:37.159094 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zkgj2" podUID="8c2c36da-9263-4f56-8f34-dd26c0ce00c9" Apr 16 14:52:37.159256 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:37.159188 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:52:37.159340 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:37.159309 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ltjv" podUID="3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e" Apr 16 14:52:39.158149 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:39.158098 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:52:39.158558 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:39.158248 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ltjv" podUID="3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e" Apr 16 14:52:39.158614 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:39.158098 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zkgj2" Apr 16 14:52:39.158681 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:39.158655 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zkgj2" podUID="8c2c36da-9263-4f56-8f34-dd26c0ce00c9" Apr 16 14:52:40.692740 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:40.692692 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs\") pod \"network-metrics-daemon-6ltjv\" (UID: \"3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e\") " pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:52:40.693149 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:40.692878 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:40.693149 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:40.692941 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs podName:3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e nodeName:}" failed. No retries permitted until 2026-04-16 14:52:48.692920941 +0000 UTC m=+18.189248158 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs") pod "network-metrics-daemon-6ltjv" (UID: "3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:40.793874 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:40.793814 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42jct\" (UniqueName: \"kubernetes.io/projected/8c2c36da-9263-4f56-8f34-dd26c0ce00c9-kube-api-access-42jct\") pod \"network-check-target-zkgj2\" (UID: \"8c2c36da-9263-4f56-8f34-dd26c0ce00c9\") " pod="openshift-network-diagnostics/network-check-target-zkgj2" Apr 16 14:52:40.794024 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:40.794009 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:40.794116 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:40.794031 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:40.794116 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:40.794043 2582 projected.go:194] Error preparing data for projected volume kube-api-access-42jct for pod openshift-network-diagnostics/network-check-target-zkgj2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:40.794116 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:40.794097 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8c2c36da-9263-4f56-8f34-dd26c0ce00c9-kube-api-access-42jct podName:8c2c36da-9263-4f56-8f34-dd26c0ce00c9 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:48.794078468 +0000 UTC m=+18.290405675 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-42jct" (UniqueName: "kubernetes.io/projected/8c2c36da-9263-4f56-8f34-dd26c0ce00c9-kube-api-access-42jct") pod "network-check-target-zkgj2" (UID: "8c2c36da-9263-4f56-8f34-dd26c0ce00c9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:41.159008 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:41.158938 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zkgj2" Apr 16 14:52:41.159913 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:41.159890 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:52:41.160023 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:41.160000 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zkgj2" podUID="8c2c36da-9263-4f56-8f34-dd26c0ce00c9" Apr 16 14:52:41.160123 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:41.160093 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ltjv" podUID="3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e" Apr 16 14:52:43.160815 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:43.160734 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:52:43.161259 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:43.160877 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ltjv" podUID="3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e" Apr 16 14:52:43.161327 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:43.161297 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zkgj2" Apr 16 14:52:43.161409 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:43.161389 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zkgj2" podUID="8c2c36da-9263-4f56-8f34-dd26c0ce00c9" Apr 16 14:52:45.158150 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:45.158115 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:52:45.158652 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:45.158162 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zkgj2" Apr 16 14:52:45.158652 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:45.158252 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ltjv" podUID="3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e" Apr 16 14:52:45.158652 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:45.158390 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zkgj2" podUID="8c2c36da-9263-4f56-8f34-dd26c0ce00c9" Apr 16 14:52:46.488345 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:46.488298 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-101.ec2.internal" podStartSLOduration=14.48828232 podStartE2EDuration="14.48828232s" podCreationTimestamp="2026-04-16 14:52:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:52:36.312723151 +0000 UTC m=+5.809050381" watchObservedRunningTime="2026-04-16 14:52:46.48828232 +0000 UTC m=+15.984609550" Apr 16 14:52:46.488753 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:46.488531 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-nz48r"] Apr 16 14:52:46.490710 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:46.490691 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nz48r" Apr 16 14:52:46.491880 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:46.491109 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nz48r" podUID="ada98a0b-3e12-4c80-9534-e61848c60c06" Apr 16 14:52:46.540720 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:46.540652 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ada98a0b-3e12-4c80-9534-e61848c60c06-kubelet-config\") pod \"global-pull-secret-syncer-nz48r\" (UID: \"ada98a0b-3e12-4c80-9534-e61848c60c06\") " pod="kube-system/global-pull-secret-syncer-nz48r" Apr 16 14:52:46.540850 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:46.540723 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ada98a0b-3e12-4c80-9534-e61848c60c06-original-pull-secret\") pod \"global-pull-secret-syncer-nz48r\" (UID: \"ada98a0b-3e12-4c80-9534-e61848c60c06\") " pod="kube-system/global-pull-secret-syncer-nz48r" Apr 16 14:52:46.540850 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:46.540795 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ada98a0b-3e12-4c80-9534-e61848c60c06-dbus\") pod \"global-pull-secret-syncer-nz48r\" (UID: \"ada98a0b-3e12-4c80-9534-e61848c60c06\") " pod="kube-system/global-pull-secret-syncer-nz48r" Apr 16 14:52:46.641936 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:46.641900 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ada98a0b-3e12-4c80-9534-e61848c60c06-dbus\") pod \"global-pull-secret-syncer-nz48r\" (UID: \"ada98a0b-3e12-4c80-9534-e61848c60c06\") " pod="kube-system/global-pull-secret-syncer-nz48r" Apr 16 14:52:46.642105 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:46.641983 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/ada98a0b-3e12-4c80-9534-e61848c60c06-kubelet-config\") pod \"global-pull-secret-syncer-nz48r\" (UID: \"ada98a0b-3e12-4c80-9534-e61848c60c06\") " pod="kube-system/global-pull-secret-syncer-nz48r" Apr 16 14:52:46.642105 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:46.642019 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ada98a0b-3e12-4c80-9534-e61848c60c06-original-pull-secret\") pod \"global-pull-secret-syncer-nz48r\" (UID: \"ada98a0b-3e12-4c80-9534-e61848c60c06\") " pod="kube-system/global-pull-secret-syncer-nz48r" Apr 16 14:52:46.642218 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:46.642096 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ada98a0b-3e12-4c80-9534-e61848c60c06-dbus\") pod \"global-pull-secret-syncer-nz48r\" (UID: \"ada98a0b-3e12-4c80-9534-e61848c60c06\") " pod="kube-system/global-pull-secret-syncer-nz48r" Apr 16 14:52:46.642218 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:46.642128 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:46.642218 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:46.642170 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ada98a0b-3e12-4c80-9534-e61848c60c06-kubelet-config\") pod \"global-pull-secret-syncer-nz48r\" (UID: \"ada98a0b-3e12-4c80-9534-e61848c60c06\") " pod="kube-system/global-pull-secret-syncer-nz48r" Apr 16 14:52:46.642218 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:46.642176 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ada98a0b-3e12-4c80-9534-e61848c60c06-original-pull-secret podName:ada98a0b-3e12-4c80-9534-e61848c60c06 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:47.142162055 +0000 UTC m=+16.638489262 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ada98a0b-3e12-4c80-9534-e61848c60c06-original-pull-secret") pod "global-pull-secret-syncer-nz48r" (UID: "ada98a0b-3e12-4c80-9534-e61848c60c06") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:47.145984 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:47.145952 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ada98a0b-3e12-4c80-9534-e61848c60c06-original-pull-secret\") pod \"global-pull-secret-syncer-nz48r\" (UID: \"ada98a0b-3e12-4c80-9534-e61848c60c06\") " pod="kube-system/global-pull-secret-syncer-nz48r" Apr 16 14:52:47.146150 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:47.146118 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:47.146209 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:47.146197 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ada98a0b-3e12-4c80-9534-e61848c60c06-original-pull-secret podName:ada98a0b-3e12-4c80-9534-e61848c60c06 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:48.146174466 +0000 UTC m=+17.642501726 (durationBeforeRetry 1s). 
Apr 16 14:52:47.158221 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:47.158192 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ltjv"
Apr 16 14:52:47.158354 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:47.158330 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ltjv" podUID="3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e"
Apr 16 14:52:47.158424 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:47.158390 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zkgj2"
Apr 16 14:52:47.158498 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:47.158478 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zkgj2" podUID="8c2c36da-9263-4f56-8f34-dd26c0ce00c9"
Apr 16 14:52:48.153797 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:48.153763 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ada98a0b-3e12-4c80-9534-e61848c60c06-original-pull-secret\") pod \"global-pull-secret-syncer-nz48r\" (UID: \"ada98a0b-3e12-4c80-9534-e61848c60c06\") " pod="kube-system/global-pull-secret-syncer-nz48r"
Apr 16 14:52:48.154234 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:48.153882 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:48.154234 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:48.153931 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ada98a0b-3e12-4c80-9534-e61848c60c06-original-pull-secret podName:ada98a0b-3e12-4c80-9534-e61848c60c06 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:50.153918029 +0000 UTC m=+19.650245237 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ada98a0b-3e12-4c80-9534-e61848c60c06-original-pull-secret") pod "global-pull-secret-syncer-nz48r" (UID: "ada98a0b-3e12-4c80-9534-e61848c60c06") : object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:48.157775 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:48.157750 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nz48r"
Apr 16 14:52:48.157901 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:48.157878 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nz48r" podUID="ada98a0b-3e12-4c80-9534-e61848c60c06"
Apr 16 14:52:48.758842 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:48.758794 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs\") pod \"network-metrics-daemon-6ltjv\" (UID: \"3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e\") " pod="openshift-multus/network-metrics-daemon-6ltjv"
Apr 16 14:52:48.759004 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:48.758954 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:48.759085 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:48.759022 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs podName:3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e nodeName:}" failed. No retries permitted until 2026-04-16 14:53:04.759004092 +0000 UTC m=+34.255331299 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs") pod "network-metrics-daemon-6ltjv" (UID: "3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:48.859302 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:48.859269 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42jct\" (UniqueName: \"kubernetes.io/projected/8c2c36da-9263-4f56-8f34-dd26c0ce00c9-kube-api-access-42jct\") pod \"network-check-target-zkgj2\" (UID: \"8c2c36da-9263-4f56-8f34-dd26c0ce00c9\") " pod="openshift-network-diagnostics/network-check-target-zkgj2"
Apr 16 14:52:48.859476 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:48.859455 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:52:48.859525 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:48.859483 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:52:48.859525 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:48.859498 2582 projected.go:194] Error preparing data for projected volume kube-api-access-42jct for pod openshift-network-diagnostics/network-check-target-zkgj2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:48.859612 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:48.859561 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8c2c36da-9263-4f56-8f34-dd26c0ce00c9-kube-api-access-42jct podName:8c2c36da-9263-4f56-8f34-dd26c0ce00c9 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:04.859541879 +0000 UTC m=+34.355869102 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-42jct" (UniqueName: "kubernetes.io/projected/8c2c36da-9263-4f56-8f34-dd26c0ce00c9-kube-api-access-42jct") pod "network-check-target-zkgj2" (UID: "8c2c36da-9263-4f56-8f34-dd26c0ce00c9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:49.158293 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:49.158207 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zkgj2"
Apr 16 14:52:49.158293 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:49.158241 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ltjv"
Apr 16 14:52:49.158778 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:49.158330 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zkgj2" podUID="8c2c36da-9263-4f56-8f34-dd26c0ce00c9"
Apr 16 14:52:49.158778 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:49.158415 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ltjv" podUID="3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e"
Apr 16 14:52:50.157552 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:50.157518 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nz48r"
Apr 16 14:52:50.157814 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:50.157630 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nz48r" podUID="ada98a0b-3e12-4c80-9534-e61848c60c06"
Apr 16 14:52:50.166525 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:50.166499 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ada98a0b-3e12-4c80-9534-e61848c60c06-original-pull-secret\") pod \"global-pull-secret-syncer-nz48r\" (UID: \"ada98a0b-3e12-4c80-9534-e61848c60c06\") " pod="kube-system/global-pull-secret-syncer-nz48r"
Apr 16 14:52:50.166912 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:50.166625 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:50.166912 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:50.166680 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ada98a0b-3e12-4c80-9534-e61848c60c06-original-pull-secret podName:ada98a0b-3e12-4c80-9534-e61848c60c06 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:54.166662977 +0000 UTC m=+23.662990185 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ada98a0b-3e12-4c80-9534-e61848c60c06-original-pull-secret") pod "global-pull-secret-syncer-nz48r" (UID: "ada98a0b-3e12-4c80-9534-e61848c60c06") : object "kube-system"/"original-pull-secret" not registered
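Note that every lookup so far fails with "not registered" rather than "not found". A plausible reading (sketched below with hypothetical types and names, not the kubelet's actual cache manager): secret and configmap volume sources are served from per-pod watch caches, and an object whose reflector has not yet been started cannot even be queried; only after registration can a lookup fail with an ordinary "not found". The "Caches populated" entries near 14:53:04 further down mark that transition, after which these same mounts either succeed or fail with plain "not found".

```go
package main

import "fmt"

// cachingSecretGetter sketches cache-gated lookups. Hypothetical type;
// the kubelet's real secret manager lives elsewhere and differs in detail.
type cachingSecretGetter struct {
	registered map[string]bool              // "ns/name" -> reflector started
	data       map[string]map[string][]byte // "ns/name" -> secret payload
}

func (g *cachingSecretGetter) GetSecret(ns, name string) (map[string][]byte, error) {
	key := ns + "/" + name
	if !g.registered[key] {
		// Mirrors the errors above: object "kube-system"/"original-pull-secret" not registered
		return nil, fmt.Errorf("object %q/%q not registered", ns, name)
	}
	payload, ok := g.data[key]
	if !ok {
		// Mirrors the later errors: secret "canary-serving-cert" not found
		return nil, fmt.Errorf("secret %q not found", name)
	}
	return payload, nil
}

func main() {
	g := &cachingSecretGetter{registered: map[string]bool{}, data: map[string]map[string][]byte{}}
	_, err := g.GetSecret("kube-system", "original-pull-secret")
	fmt.Println(err) // object "kube-system"/"original-pull-secret" not registered
}
```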
Apr 16 14:52:51.158095 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:51.158070 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ltjv"
Apr 16 14:52:51.158210 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:51.158086 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zkgj2"
Apr 16 14:52:51.158210 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:51.158183 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ltjv" podUID="3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e"
Apr 16 14:52:51.158311 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:51.158263 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zkgj2" podUID="8c2c36da-9263-4f56-8f34-dd26c0ce00c9"
Apr 16 14:52:51.332588 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:51.332375 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-42wsp" event={"ID":"e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea","Type":"ContainerStarted","Data":"e4b7bfa7eefe7e67b1a8c916ea7758ff650336309b9f426cdc339d200b77fa19"}
Apr 16 14:52:51.337513 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:51.337194 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7jxtj" event={"ID":"240ab929-6399-4d7c-b583-aef03c5cc884","Type":"ContainerStarted","Data":"1fad4b6a2a5939fd91b61619c0ba4f0f423ca50282c72bc0aed111e7cf776add"}
Apr 16 14:52:52.158138 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:52.157959 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nz48r"
Apr 16 14:52:52.158265 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:52.158221 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nz48r" podUID="ada98a0b-3e12-4c80-9534-e61848c60c06"
Apr 16 14:52:52.344521 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:52.344495 2582 generic.go:358] "Generic (PLEG): container finished" podID="842ebc47-2cf2-4818-ab9a-d2f17f00f1da" containerID="e87509bc288e1c88a8bd67e9446948d63a65925c5631bd33d6a58d49674ff8c4" exitCode=0
Apr 16 14:52:52.345171 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:52.344561 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2w5z" event={"ID":"842ebc47-2cf2-4818-ab9a-d2f17f00f1da","Type":"ContainerDied","Data":"e87509bc288e1c88a8bd67e9446948d63a65925c5631bd33d6a58d49674ff8c4"}
Apr 16 14:52:52.345871 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:52.345848 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xnxqc" event={"ID":"16a896b7-8655-40cd-b495-e93e72c07fb6","Type":"ContainerStarted","Data":"4e9f74cde35d86c08b197d9c62e7034aae9f608470f64bb011c6592a12022d5c"}
Apr 16 14:52:52.347129 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:52.347108 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dm6rx" event={"ID":"e0e9dc65-caa5-437c-b911-96a930ff75fe","Type":"ContainerStarted","Data":"8ae4b16a4360ad23d05276dd2f6b4d06020ec2c0768385af04c4e81e07b8bbf6"}
Apr 16 14:52:52.348331 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:52.348295 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-6xl5n" event={"ID":"e4406ba0-59cd-4412-bdfe-3284d83e48a7","Type":"ContainerStarted","Data":"51e315458b4e67ed3906f1fdf2862901fc0f8744d4abf8b3a84ecdb7ea03c12f"}
Apr 16 14:52:52.350509 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:52.350484 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log"
Apr 16 14:52:52.350758 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:52.350739 2582 generic.go:358] "Generic (PLEG): container finished" podID="08f2c22e-77d0-4250-843e-95a65b09af16" containerID="c8c6fe23ab1dca784cf5ff57429732645ee49feffcc484bcb57ac3bb47c7f427" exitCode=1
Apr 16 14:52:52.350808 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:52.350795 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" event={"ID":"08f2c22e-77d0-4250-843e-95a65b09af16","Type":"ContainerStarted","Data":"30c660a5ff3052da1e254a20f2451c8d0262aae626442daf5deba6e9b90b763c"}
Apr 16 14:52:52.350872 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:52.350814 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" event={"ID":"08f2c22e-77d0-4250-843e-95a65b09af16","Type":"ContainerStarted","Data":"40498b8d12a0bedfeb0869f2aaa961028d83da2557df000cbec6f2e7e0c7c26d"}
Apr 16 14:52:52.350872 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:52.350844 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" event={"ID":"08f2c22e-77d0-4250-843e-95a65b09af16","Type":"ContainerStarted","Data":"0204bf20be8e6cf8cd0008740a475fc3e0bcfb611ceeb48d634340f7b1f129db"}
Apr 16 14:52:52.350872 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:52.350854 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" event={"ID":"08f2c22e-77d0-4250-843e-95a65b09af16","Type":"ContainerStarted","Data":"8e8e5de7b7f19819111ea94472b73c1f799a79e02a0de9e53262731aeccac447"}
Apr 16 14:52:52.350872 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:52.350863 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" event={"ID":"08f2c22e-77d0-4250-843e-95a65b09af16","Type":"ContainerDied","Data":"c8c6fe23ab1dca784cf5ff57429732645ee49feffcc484bcb57ac3bb47c7f427"}
Apr 16 14:52:52.350872 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:52.350871 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" event={"ID":"08f2c22e-77d0-4250-843e-95a65b09af16","Type":"ContainerStarted","Data":"2b6bb5f11f8bbb37906aad345bd6e9b5c934b6462a6b2b1e4f9d3743a04cf2d0"}
Apr 16 14:52:52.351952 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:52.351936 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-z8k77" event={"ID":"d26e8a35-54b1-4862-87ad-cab47e12e62d","Type":"ContainerStarted","Data":"bb1c765cd0eb4d4ddca8632cc256a99b9fef77e44a908301fea23708872416da"}
Apr 16 14:52:52.353093 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:52.353074 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-t4w4v" event={"ID":"6157833b-8b66-48ab-a248-7d79d51cec48","Type":"ContainerStarted","Data":"5b0b1df96d5370d771724f7546cadf2adc708ff55d96540341ac344901bd61f9"}
Apr 16 14:52:52.376161 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:52.376123 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-42wsp" podStartSLOduration=3.974202664 podStartE2EDuration="21.376111305s" podCreationTimestamp="2026-04-16 14:52:31 +0000 UTC" firstStartedPulling="2026-04-16 14:52:33.700995814 +0000 UTC m=+3.197323021" lastFinishedPulling="2026-04-16 14:52:51.10290445 +0000 UTC m=+20.599231662" observedRunningTime="2026-04-16 14:52:52.376054945 +0000 UTC m=+21.872382165" watchObservedRunningTime="2026-04-16 14:52:52.376111305 +0000 UTC m=+21.872438530"
Apr 16 14:52:52.404769 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:52.404697 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-6xl5n" podStartSLOduration=4.201340362 podStartE2EDuration="21.404686281s" podCreationTimestamp="2026-04-16 14:52:31 +0000 UTC" firstStartedPulling="2026-04-16 14:52:33.690195423 +0000 UTC m=+3.186522630" lastFinishedPulling="2026-04-16 14:52:50.893541342 +0000 UTC m=+20.389868549" observedRunningTime="2026-04-16 14:52:52.391964733 +0000 UTC m=+21.888291955" watchObservedRunningTime="2026-04-16 14:52:52.404686281 +0000 UTC m=+21.901013510"
Apr 16 14:52:52.405445 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:52.405065 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-xnxqc" podStartSLOduration=4.212503951 podStartE2EDuration="21.405054269s" podCreationTimestamp="2026-04-16 14:52:31 +0000 UTC" firstStartedPulling="2026-04-16 14:52:33.700997022 +0000 UTC m=+3.197324232" lastFinishedPulling="2026-04-16 14:52:50.893547339 +0000 UTC m=+20.389874550" observedRunningTime="2026-04-16 14:52:52.405023708 +0000 UTC m=+21.901350936" watchObservedRunningTime="2026-04-16 14:52:52.405054269 +0000 UTC m=+21.901381499"
pod startup duration" pod="openshift-dns/node-resolver-z8k77" podStartSLOduration=4.015508918 podStartE2EDuration="21.426217575s" podCreationTimestamp="2026-04-16 14:52:31 +0000 UTC" firstStartedPulling="2026-04-16 14:52:33.694583606 +0000 UTC m=+3.190910830" lastFinishedPulling="2026-04-16 14:52:51.105292274 +0000 UTC m=+20.601619487" observedRunningTime="2026-04-16 14:52:52.426052606 +0000 UTC m=+21.922379834" watchObservedRunningTime="2026-04-16 14:52:52.426217575 +0000 UTC m=+21.922544803" Apr 16 14:52:52.442686 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:52.442657 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-t4w4v" podStartSLOduration=4.214911304 podStartE2EDuration="21.442650606s" podCreationTimestamp="2026-04-16 14:52:31 +0000 UTC" firstStartedPulling="2026-04-16 14:52:33.688528727 +0000 UTC m=+3.184855933" lastFinishedPulling="2026-04-16 14:52:50.916268025 +0000 UTC m=+20.412595235" observedRunningTime="2026-04-16 14:52:52.442646121 +0000 UTC m=+21.938973429" watchObservedRunningTime="2026-04-16 14:52:52.442650606 +0000 UTC m=+21.938977841" Apr 16 14:52:52.458137 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:52.458104 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-dm6rx" podStartSLOduration=4.04526918 podStartE2EDuration="21.458095638s" podCreationTimestamp="2026-04-16 14:52:31 +0000 UTC" firstStartedPulling="2026-04-16 14:52:33.691523246 +0000 UTC m=+3.187850460" lastFinishedPulling="2026-04-16 14:52:51.104349699 +0000 UTC m=+20.600676918" observedRunningTime="2026-04-16 14:52:52.458005508 +0000 UTC m=+21.954332737" watchObservedRunningTime="2026-04-16 14:52:52.458095638 +0000 UTC m=+21.954422860" Apr 16 14:52:52.654810 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:52.654750 2582 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 14:52:53.080717 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:53.080544 2582 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T14:52:52.654770817Z","UUID":"deb9cb8f-1195-4a9a-8188-0fec4b25fdd4","Handler":null,"Name":"","Endpoint":""} Apr 16 14:52:53.084148 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:53.084110 2582 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 14:52:53.084148 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:53.084144 2582 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 14:52:53.157664 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:53.157634 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zkgj2" Apr 16 14:52:53.157664 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:53.157664 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:52:53.157917 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:53.157754 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zkgj2" podUID="8c2c36da-9263-4f56-8f34-dd26c0ce00c9" Apr 16 14:52:53.157980 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:53.157931 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ltjv" podUID="3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e" Apr 16 14:52:53.357326 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:53.357239 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7jxtj" event={"ID":"240ab929-6399-4d7c-b583-aef03c5cc884","Type":"ContainerStarted","Data":"d088069e6bb31fe3a0122e78ee5d92b565c69fe665d6b7baba3c347ba37c8944"} Apr 16 14:52:54.138168 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:54.137818 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-xnxqc" Apr 16 14:52:54.138362 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:54.138278 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-xnxqc" Apr 16 14:52:54.158194 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:54.158165 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nz48r" Apr 16 14:52:54.158351 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:54.158271 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nz48r" podUID="ada98a0b-3e12-4c80-9534-e61848c60c06" Apr 16 14:52:54.199755 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:54.199685 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ada98a0b-3e12-4c80-9534-e61848c60c06-original-pull-secret\") pod \"global-pull-secret-syncer-nz48r\" (UID: \"ada98a0b-3e12-4c80-9534-e61848c60c06\") " pod="kube-system/global-pull-secret-syncer-nz48r" Apr 16 14:52:54.199900 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:54.199809 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:54.199900 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:54.199889 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ada98a0b-3e12-4c80-9534-e61848c60c06-original-pull-secret podName:ada98a0b-3e12-4c80-9534-e61848c60c06 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:02.199873776 +0000 UTC m=+31.696200984 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ada98a0b-3e12-4c80-9534-e61848c60c06-original-pull-secret") pod "global-pull-secret-syncer-nz48r" (UID: "ada98a0b-3e12-4c80-9534-e61848c60c06") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:54.361778 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:54.361752 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log" Apr 16 14:52:54.362210 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:54.362080 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" event={"ID":"08f2c22e-77d0-4250-843e-95a65b09af16","Type":"ContainerStarted","Data":"c63512d521c7af40ebdcdf7d4cfb21faa0b546559565da45c5a361cf69a33062"} Apr 16 14:52:54.364225 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:54.364202 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7jxtj" event={"ID":"240ab929-6399-4d7c-b583-aef03c5cc884","Type":"ContainerStarted","Data":"84d60de430d2c813f9d86a0f6ebb430b736940c43f82f6f1e23987023fa97764"} Apr 16 14:52:54.364623 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:54.364424 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-xnxqc" Apr 16 14:52:54.364908 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:54.364893 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-xnxqc" Apr 16 14:52:54.382670 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:54.382631 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7jxtj" podStartSLOduration=3.441735935 podStartE2EDuration="23.382614484s" podCreationTimestamp="2026-04-16 14:52:31 +0000 UTC" firstStartedPulling="2026-04-16 14:52:33.686175045 +0000 UTC m=+3.182502252" lastFinishedPulling="2026-04-16 14:52:53.627053589 +0000 UTC m=+23.123380801" observedRunningTime="2026-04-16 14:52:54.381849902 +0000 UTC m=+23.878177135" watchObservedRunningTime="2026-04-16 14:52:54.382614484 +0000 UTC m=+23.878941713" Apr 16 14:52:55.157581 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:55.157551 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:52:55.157772 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:55.157682 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ltjv" podUID="3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e" Apr 16 14:52:55.157772 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:55.157741 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zkgj2" Apr 16 14:52:55.157893 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:55.157872 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zkgj2" podUID="8c2c36da-9263-4f56-8f34-dd26c0ce00c9" Apr 16 14:52:56.157692 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:56.157651 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nz48r" Apr 16 14:52:56.158283 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:56.157776 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nz48r" podUID="ada98a0b-3e12-4c80-9534-e61848c60c06" Apr 16 14:52:57.158219 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:57.158031 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zkgj2" Apr 16 14:52:57.158858 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:57.158096 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:52:57.158858 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:57.158311 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zkgj2" podUID="8c2c36da-9263-4f56-8f34-dd26c0ce00c9" Apr 16 14:52:57.158858 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:57.158368 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6ltjv" podUID="3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e" Apr 16 14:52:57.371366 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:57.371342 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log" Apr 16 14:52:57.371654 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:57.371630 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" event={"ID":"08f2c22e-77d0-4250-843e-95a65b09af16","Type":"ContainerStarted","Data":"dc3924736d1457ebe1d096736b1d2ce3726f1a5027f41d41b27bdacb72d95b44"} Apr 16 14:52:57.371952 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:57.371931 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:57.372092 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:57.372075 2582 scope.go:117] "RemoveContainer" containerID="c8c6fe23ab1dca784cf5ff57429732645ee49feffcc484bcb57ac3bb47c7f427" Apr 16 14:52:57.373478 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:57.373454 2582 generic.go:358] "Generic (PLEG): container finished" podID="842ebc47-2cf2-4818-ab9a-d2f17f00f1da" containerID="91b1dbd3f6fd92c4636556c0011834c5c6971731d54a62aa0cd453389aad1e39" exitCode=0 Apr 16 14:52:57.373566 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:57.373496 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2w5z" event={"ID":"842ebc47-2cf2-4818-ab9a-d2f17f00f1da","Type":"ContainerDied","Data":"91b1dbd3f6fd92c4636556c0011834c5c6971731d54a62aa0cd453389aad1e39"} Apr 16 14:52:57.388093 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:57.388070 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:58.157861 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:58.157814 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nz48r" Apr 16 14:52:58.157986 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:58.157960 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-nz48r" podUID="ada98a0b-3e12-4c80-9534-e61848c60c06" Apr 16 14:52:58.377081 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:58.377052 2582 generic.go:358] "Generic (PLEG): container finished" podID="842ebc47-2cf2-4818-ab9a-d2f17f00f1da" containerID="c6fd6270d08c1ca80535cfd86f5d16de788e7f7172875753fd7a652457fd34ac" exitCode=0 Apr 16 14:52:58.377451 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:58.377141 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2w5z" event={"ID":"842ebc47-2cf2-4818-ab9a-d2f17f00f1da","Type":"ContainerDied","Data":"c6fd6270d08c1ca80535cfd86f5d16de788e7f7172875753fd7a652457fd34ac"} Apr 16 14:52:58.380597 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:58.380451 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log" Apr 16 14:52:58.380934 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:58.380911 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" event={"ID":"08f2c22e-77d0-4250-843e-95a65b09af16","Type":"ContainerStarted","Data":"ff0410e72cd45f3020a54cd6d8424d5425323bbae1322f1d7213c220c9ec732d"} Apr 16 14:52:58.381056 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:58.381044 2582 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 14:52:58.381247 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:58.381232 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:58.399640 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:58.399617 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:52:58.401274 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:58.401251 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-nz48r"] Apr 16 14:52:58.401400 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:58.401345 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nz48r" Apr 16 14:52:58.401474 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:58.401449 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nz48r" podUID="ada98a0b-3e12-4c80-9534-e61848c60c06" Apr 16 14:52:58.402098 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:58.402074 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zkgj2"] Apr 16 14:52:58.402198 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:58.402185 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zkgj2" Apr 16 14:52:58.402302 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:58.402282 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zkgj2" podUID="8c2c36da-9263-4f56-8f34-dd26c0ce00c9" Apr 16 14:52:58.402862 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:58.402843 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6ltjv"] Apr 16 14:52:58.402942 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:58.402931 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:52:58.403027 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:52:58.403012 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6ltjv" podUID="3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e" Apr 16 14:52:58.444036 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:58.443994 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" podStartSLOduration=9.984087584 podStartE2EDuration="27.44398066s" podCreationTimestamp="2026-04-16 14:52:31 +0000 UTC" firstStartedPulling="2026-04-16 14:52:33.694606715 +0000 UTC m=+3.190933926" lastFinishedPulling="2026-04-16 14:52:51.154499785 +0000 UTC m=+20.650827002" observedRunningTime="2026-04-16 14:52:58.442764197 +0000 UTC m=+27.939091426" watchObservedRunningTime="2026-04-16 14:52:58.44398066 +0000 UTC m=+27.940307889" Apr 16 14:52:59.384333 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:59.384303 2582 generic.go:358] "Generic (PLEG): container finished" podID="842ebc47-2cf2-4818-ab9a-d2f17f00f1da" containerID="975db5c474dcb179a68c435beec57dfcb6b185d9f96ff59c4ff61713ea223306" exitCode=0 Apr 16 14:52:59.384767 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:59.384391 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2w5z" event={"ID":"842ebc47-2cf2-4818-ab9a-d2f17f00f1da","Type":"ContainerDied","Data":"975db5c474dcb179a68c435beec57dfcb6b185d9f96ff59c4ff61713ea223306"} Apr 16 14:52:59.384767 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:52:59.384635 2582 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 14:53:00.158127 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:00.158079 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:53:00.158321 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:00.158080 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zkgj2" Apr 16 14:53:00.158321 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:00.158226 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6ltjv" podUID="3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e" Apr 16 14:53:00.158321 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:00.158292 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zkgj2" podUID="8c2c36da-9263-4f56-8f34-dd26c0ce00c9" Apr 16 14:53:00.158321 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:00.158090 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nz48r" Apr 16 14:53:00.158515 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:00.158379 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nz48r" podUID="ada98a0b-3e12-4c80-9534-e61848c60c06" Apr 16 14:53:00.386124 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:00.386039 2582 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 14:53:01.719003 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:01.718974 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:53:01.719587 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:01.719244 2582 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 14:53:01.731197 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:01.731149 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" podUID="08f2c22e-77d0-4250-843e-95a65b09af16" containerName="ovnkube-controller" probeResult="failure" output="" Apr 16 14:53:01.740428 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:01.740404 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" podUID="08f2c22e-77d0-4250-843e-95a65b09af16" containerName="ovnkube-controller" probeResult="failure" output="" Apr 16 14:53:02.157406 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:02.157365 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:53:02.157553 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:02.157365 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zkgj2" Apr 16 14:53:02.157553 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:02.157477 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6ltjv" podUID="3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e" Apr 16 14:53:02.157553 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:02.157544 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zkgj2" podUID="8c2c36da-9263-4f56-8f34-dd26c0ce00c9" Apr 16 14:53:02.157731 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:02.157381 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nz48r" Apr 16 14:53:02.157731 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:02.157670 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nz48r" podUID="ada98a0b-3e12-4c80-9534-e61848c60c06" Apr 16 14:53:02.265344 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:02.265314 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ada98a0b-3e12-4c80-9534-e61848c60c06-original-pull-secret\") pod \"global-pull-secret-syncer-nz48r\" (UID: \"ada98a0b-3e12-4c80-9534-e61848c60c06\") " pod="kube-system/global-pull-secret-syncer-nz48r" Apr 16 14:53:02.265583 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:02.265467 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:53:02.265583 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:02.265540 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ada98a0b-3e12-4c80-9534-e61848c60c06-original-pull-secret podName:ada98a0b-3e12-4c80-9534-e61848c60c06 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:18.265520369 +0000 UTC m=+47.761847579 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ada98a0b-3e12-4c80-9534-e61848c60c06-original-pull-secret") pod "global-pull-secret-syncer-nz48r" (UID: "ada98a0b-3e12-4c80-9534-e61848c60c06") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:53:03.785462 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:03.785431 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-101.ec2.internal" event="NodeReady" Apr 16 14:53:03.785868 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:03.785579 2582 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 14:53:03.841914 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:03.841884 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-8pv4t"] Apr 16 14:53:03.846520 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:03.846500 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-8pv4t" Apr 16 14:53:03.850140 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:03.850115 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-jh4sv"] Apr 16 14:53:03.850891 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:03.850870 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 14:53:03.851029 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:03.851006 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 14:53:03.851141 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:03.851006 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vzfs6\"" Apr 16 14:53:03.852835 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:03.852806 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jh4sv" Apr 16 14:53:03.855220 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:03.855201 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8pv4t"] Apr 16 14:53:03.856809 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:03.856791 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 14:53:03.856952 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:03.856840 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-8jkdg\"" Apr 16 14:53:03.856952 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:03.856873 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 14:53:03.856952 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:03.856792 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 14:53:03.866065 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:03.866045 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jh4sv"] Apr 16 14:53:03.980229 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:03.980158 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls\") pod \"dns-default-8pv4t\" (UID: \"4bcbf238-7f5c-47ac-b42a-0e299ed29df0\") " pod="openshift-dns/dns-default-8pv4t" Apr 16 14:53:03.980229 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:03.980196 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5spp\" (UniqueName: \"kubernetes.io/projected/87ff3a2a-3409-4acc-8192-f4db952ccdcf-kube-api-access-p5spp\") pod \"ingress-canary-jh4sv\" (UID: \"87ff3a2a-3409-4acc-8192-f4db952ccdcf\") " pod="openshift-ingress-canary/ingress-canary-jh4sv" Apr 16 14:53:03.980515 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:03.980244 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-tmp-dir\") pod \"dns-default-8pv4t\" (UID: \"4bcbf238-7f5c-47ac-b42a-0e299ed29df0\") " pod="openshift-dns/dns-default-8pv4t" Apr 16 
14:53:03.980515 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:03.980307 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-config-volume\") pod \"dns-default-8pv4t\" (UID: \"4bcbf238-7f5c-47ac-b42a-0e299ed29df0\") " pod="openshift-dns/dns-default-8pv4t" Apr 16 14:53:03.980515 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:03.980342 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnb5v\" (UniqueName: \"kubernetes.io/projected/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-kube-api-access-fnb5v\") pod \"dns-default-8pv4t\" (UID: \"4bcbf238-7f5c-47ac-b42a-0e299ed29df0\") " pod="openshift-dns/dns-default-8pv4t" Apr 16 14:53:03.980515 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:03.980408 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert\") pod \"ingress-canary-jh4sv\" (UID: \"87ff3a2a-3409-4acc-8192-f4db952ccdcf\") " pod="openshift-ingress-canary/ingress-canary-jh4sv" Apr 16 14:53:04.081666 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:04.081633 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-tmp-dir\") pod \"dns-default-8pv4t\" (UID: \"4bcbf238-7f5c-47ac-b42a-0e299ed29df0\") " pod="openshift-dns/dns-default-8pv4t" Apr 16 14:53:04.081878 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:04.081709 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-config-volume\") pod \"dns-default-8pv4t\" (UID: \"4bcbf238-7f5c-47ac-b42a-0e299ed29df0\") " pod="openshift-dns/dns-default-8pv4t" Apr 16 14:53:04.081878 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:04.081743 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fnb5v\" (UniqueName: \"kubernetes.io/projected/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-kube-api-access-fnb5v\") pod \"dns-default-8pv4t\" (UID: \"4bcbf238-7f5c-47ac-b42a-0e299ed29df0\") " pod="openshift-dns/dns-default-8pv4t" Apr 16 14:53:04.081878 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:04.081782 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert\") pod \"ingress-canary-jh4sv\" (UID: \"87ff3a2a-3409-4acc-8192-f4db952ccdcf\") " pod="openshift-ingress-canary/ingress-canary-jh4sv" Apr 16 14:53:04.081878 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:04.081844 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls\") pod \"dns-default-8pv4t\" (UID: \"4bcbf238-7f5c-47ac-b42a-0e299ed29df0\") " pod="openshift-dns/dns-default-8pv4t" Apr 16 14:53:04.081878 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:04.081873 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5spp\" (UniqueName: \"kubernetes.io/projected/87ff3a2a-3409-4acc-8192-f4db952ccdcf-kube-api-access-p5spp\") pod \"ingress-canary-jh4sv\" (UID: \"87ff3a2a-3409-4acc-8192-f4db952ccdcf\") " 
pod="openshift-ingress-canary/ingress-canary-jh4sv" Apr 16 14:53:04.082126 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:04.082066 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-tmp-dir\") pod \"dns-default-8pv4t\" (UID: \"4bcbf238-7f5c-47ac-b42a-0e299ed29df0\") " pod="openshift-dns/dns-default-8pv4t" Apr 16 14:53:04.082175 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:04.082162 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:04.082233 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:04.082219 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert podName:87ff3a2a-3409-4acc-8192-f4db952ccdcf nodeName:}" failed. No retries permitted until 2026-04-16 14:53:04.582198481 +0000 UTC m=+34.078525692 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert") pod "ingress-canary-jh4sv" (UID: "87ff3a2a-3409-4acc-8192-f4db952ccdcf") : secret "canary-serving-cert" not found Apr 16 14:53:04.082380 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:04.082308 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:04.082380 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:04.082330 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-config-volume\") pod \"dns-default-8pv4t\" (UID: \"4bcbf238-7f5c-47ac-b42a-0e299ed29df0\") " pod="openshift-dns/dns-default-8pv4t" Apr 16 14:53:04.082380 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:04.082359 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls podName:4bcbf238-7f5c-47ac-b42a-0e299ed29df0 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:04.5823447 +0000 UTC m=+34.078671908 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls") pod "dns-default-8pv4t" (UID: "4bcbf238-7f5c-47ac-b42a-0e299ed29df0") : secret "dns-default-metrics-tls" not found Apr 16 14:53:04.094116 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:04.094087 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnb5v\" (UniqueName: \"kubernetes.io/projected/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-kube-api-access-fnb5v\") pod \"dns-default-8pv4t\" (UID: \"4bcbf238-7f5c-47ac-b42a-0e299ed29df0\") " pod="openshift-dns/dns-default-8pv4t" Apr 16 14:53:04.094237 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:04.094146 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5spp\" (UniqueName: \"kubernetes.io/projected/87ff3a2a-3409-4acc-8192-f4db952ccdcf-kube-api-access-p5spp\") pod \"ingress-canary-jh4sv\" (UID: \"87ff3a2a-3409-4acc-8192-f4db952ccdcf\") " pod="openshift-ingress-canary/ingress-canary-jh4sv" Apr 16 14:53:04.157734 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:04.157701 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:53:04.157884 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:04.157773 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nz48r" Apr 16 14:53:04.157884 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:04.157802 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zkgj2" Apr 16 14:53:04.160607 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:04.160586 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 14:53:04.160733 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:04.160608 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kc8hc\"" Apr 16 14:53:04.160733 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:04.160618 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-j5fbx\"" Apr 16 14:53:04.160880 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:04.160864 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 14:53:04.160944 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:04.160899 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 14:53:04.161131 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:04.161116 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 14:53:04.586871 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:04.586819 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert\") pod \"ingress-canary-jh4sv\" (UID: \"87ff3a2a-3409-4acc-8192-f4db952ccdcf\") " pod="openshift-ingress-canary/ingress-canary-jh4sv" Apr 16 14:53:04.587074 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:04.586898 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls\") pod \"dns-default-8pv4t\" (UID: \"4bcbf238-7f5c-47ac-b42a-0e299ed29df0\") " pod="openshift-dns/dns-default-8pv4t" Apr 16 14:53:04.587074 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:04.586978 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:04.587074 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:04.587007 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:04.587074 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:04.587061 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert podName:87ff3a2a-3409-4acc-8192-f4db952ccdcf nodeName:}" failed. No retries permitted until 2026-04-16 14:53:05.587040905 +0000 UTC m=+35.083368126 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert") pod "ingress-canary-jh4sv" (UID: "87ff3a2a-3409-4acc-8192-f4db952ccdcf") : secret "canary-serving-cert" not found Apr 16 14:53:04.587234 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:04.587087 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls podName:4bcbf238-7f5c-47ac-b42a-0e299ed29df0 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:05.587069121 +0000 UTC m=+35.083396342 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls") pod "dns-default-8pv4t" (UID: "4bcbf238-7f5c-47ac-b42a-0e299ed29df0") : secret "dns-default-metrics-tls" not found Apr 16 14:53:04.788866 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:04.788804 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs\") pod \"network-metrics-daemon-6ltjv\" (UID: \"3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e\") " pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:53:04.789402 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:04.788965 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:53:04.789402 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:04.789033 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs podName:3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e nodeName:}" failed. No retries permitted until 2026-04-16 14:53:36.789014585 +0000 UTC m=+66.285341805 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs") pod "network-metrics-daemon-6ltjv" (UID: "3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e") : secret "metrics-daemon-secret" not found Apr 16 14:53:04.889473 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:04.889446 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42jct\" (UniqueName: \"kubernetes.io/projected/8c2c36da-9263-4f56-8f34-dd26c0ce00c9-kube-api-access-42jct\") pod \"network-check-target-zkgj2\" (UID: \"8c2c36da-9263-4f56-8f34-dd26c0ce00c9\") " pod="openshift-network-diagnostics/network-check-target-zkgj2" Apr 16 14:53:04.892441 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:04.892417 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-42jct\" (UniqueName: \"kubernetes.io/projected/8c2c36da-9263-4f56-8f34-dd26c0ce00c9-kube-api-access-42jct\") pod \"network-check-target-zkgj2\" (UID: \"8c2c36da-9263-4f56-8f34-dd26c0ce00c9\") " pod="openshift-network-diagnostics/network-check-target-zkgj2" Apr 16 14:53:05.079204 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:05.079162 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zkgj2" Apr 16 14:53:05.230959 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:05.230728 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zkgj2"] Apr 16 14:53:05.269940 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:53:05.269910 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c2c36da_9263_4f56_8f34_dd26c0ce00c9.slice/crio-46e96a32200cb4861a31e96608b0c038b70a510f07f5e64444e4bb56d835278e WatchSource:0}: Error finding container 46e96a32200cb4861a31e96608b0c038b70a510f07f5e64444e4bb56d835278e: Status 404 returned error can't find the container with id 46e96a32200cb4861a31e96608b0c038b70a510f07f5e64444e4bb56d835278e Apr 16 14:53:05.396112 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:05.396079 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zkgj2" event={"ID":"8c2c36da-9263-4f56-8f34-dd26c0ce00c9","Type":"ContainerStarted","Data":"46e96a32200cb4861a31e96608b0c038b70a510f07f5e64444e4bb56d835278e"} Apr 16 14:53:05.595126 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:05.595099 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert\") pod \"ingress-canary-jh4sv\" (UID: \"87ff3a2a-3409-4acc-8192-f4db952ccdcf\") " pod="openshift-ingress-canary/ingress-canary-jh4sv" Apr 16 14:53:05.595224 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:05.595146 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls\") pod \"dns-default-8pv4t\" (UID: \"4bcbf238-7f5c-47ac-b42a-0e299ed29df0\") " pod="openshift-dns/dns-default-8pv4t" Apr 16 14:53:05.595269 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:05.595234 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:05.595269 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:05.595238 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:05.595343 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:05.595289 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls podName:4bcbf238-7f5c-47ac-b42a-0e299ed29df0 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:07.595276252 +0000 UTC m=+37.091603458 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls") pod "dns-default-8pv4t" (UID: "4bcbf238-7f5c-47ac-b42a-0e299ed29df0") : secret "dns-default-metrics-tls" not found Apr 16 14:53:05.595343 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:05.595304 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert podName:87ff3a2a-3409-4acc-8192-f4db952ccdcf nodeName:}" failed. No retries permitted until 2026-04-16 14:53:07.59529763 +0000 UTC m=+37.091624838 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert") pod "ingress-canary-jh4sv" (UID: "87ff3a2a-3409-4acc-8192-f4db952ccdcf") : secret "canary-serving-cert" not found Apr 16 14:53:06.401372 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:06.401341 2582 generic.go:358] "Generic (PLEG): container finished" podID="842ebc47-2cf2-4818-ab9a-d2f17f00f1da" containerID="1049b16636b78f5aa2119c30242e9b943c21950969d3aa3a3678100287ec4c66" exitCode=0 Apr 16 14:53:06.401887 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:06.401405 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2w5z" event={"ID":"842ebc47-2cf2-4818-ab9a-d2f17f00f1da","Type":"ContainerDied","Data":"1049b16636b78f5aa2119c30242e9b943c21950969d3aa3a3678100287ec4c66"} Apr 16 14:53:07.406385 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:07.406354 2582 generic.go:358] "Generic (PLEG): container finished" podID="842ebc47-2cf2-4818-ab9a-d2f17f00f1da" containerID="6e567cde88220a390c245c9347c05fb0fb38b96fc006ab1544698c43e7a15502" exitCode=0 Apr 16 14:53:07.406768 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:07.406410 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2w5z" event={"ID":"842ebc47-2cf2-4818-ab9a-d2f17f00f1da","Type":"ContainerDied","Data":"6e567cde88220a390c245c9347c05fb0fb38b96fc006ab1544698c43e7a15502"} Apr 16 14:53:07.609131 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:07.609099 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert\") pod \"ingress-canary-jh4sv\" (UID: \"87ff3a2a-3409-4acc-8192-f4db952ccdcf\") " pod="openshift-ingress-canary/ingress-canary-jh4sv" Apr 16 14:53:07.609306 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:07.609150 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls\") pod \"dns-default-8pv4t\" (UID: \"4bcbf238-7f5c-47ac-b42a-0e299ed29df0\") " pod="openshift-dns/dns-default-8pv4t" Apr 16 14:53:07.609306 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:07.609245 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:07.609306 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:07.609251 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:07.609306 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:07.609307 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert podName:87ff3a2a-3409-4acc-8192-f4db952ccdcf nodeName:}" failed. No retries permitted until 2026-04-16 14:53:11.609291224 +0000 UTC m=+41.105618432 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert") pod "ingress-canary-jh4sv" (UID: "87ff3a2a-3409-4acc-8192-f4db952ccdcf") : secret "canary-serving-cert" not found Apr 16 14:53:07.609476 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:07.609320 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls podName:4bcbf238-7f5c-47ac-b42a-0e299ed29df0 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:11.609314386 +0000 UTC m=+41.105641593 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls") pod "dns-default-8pv4t" (UID: "4bcbf238-7f5c-47ac-b42a-0e299ed29df0") : secret "dns-default-metrics-tls" not found Apr 16 14:53:08.411727 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:08.411494 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2w5z" event={"ID":"842ebc47-2cf2-4818-ab9a-d2f17f00f1da","Type":"ContainerStarted","Data":"c313fad9faab7c1104dd3df6707db3367a345515a7994e3ed2c2f30bda699158"} Apr 16 14:53:08.434451 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:08.434409 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-g2w5z" podStartSLOduration=5.838560187 podStartE2EDuration="37.434394576s" podCreationTimestamp="2026-04-16 14:52:31 +0000 UTC" firstStartedPulling="2026-04-16 14:52:33.701298586 +0000 UTC m=+3.197625793" lastFinishedPulling="2026-04-16 14:53:05.297132971 +0000 UTC m=+34.793460182" observedRunningTime="2026-04-16 14:53:08.432381819 +0000 UTC m=+37.928709048" watchObservedRunningTime="2026-04-16 14:53:08.434394576 +0000 UTC m=+37.930721805" Apr 16 14:53:09.414747 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:09.414709 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zkgj2" event={"ID":"8c2c36da-9263-4f56-8f34-dd26c0ce00c9","Type":"ContainerStarted","Data":"23246baa456a86eb10551ae6f9e805f42ce912b35fd3b1e5d60b58d22e3a4769"} Apr 16 14:53:09.415279 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:09.414873 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-zkgj2" Apr 16 14:53:09.430492 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:09.430451 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-zkgj2" podStartSLOduration=35.295950925 podStartE2EDuration="38.430439943s" podCreationTimestamp="2026-04-16 14:52:31 +0000 UTC" firstStartedPulling="2026-04-16 14:53:05.275736977 +0000 UTC m=+34.772064184" lastFinishedPulling="2026-04-16 14:53:08.410225991 +0000 UTC m=+37.906553202" observedRunningTime="2026-04-16 14:53:09.429737584 +0000 UTC m=+38.926064815" watchObservedRunningTime="2026-04-16 14:53:09.430439943 +0000 UTC m=+38.926767166" Apr 16 14:53:11.634172 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:11.634131 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert\") pod \"ingress-canary-jh4sv\" (UID: \"87ff3a2a-3409-4acc-8192-f4db952ccdcf\") " pod="openshift-ingress-canary/ingress-canary-jh4sv" Apr 16 14:53:11.634686 ip-10-0-139-101 
kubenswrapper[2582]: I0416 14:53:11.634184 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls\") pod \"dns-default-8pv4t\" (UID: \"4bcbf238-7f5c-47ac-b42a-0e299ed29df0\") " pod="openshift-dns/dns-default-8pv4t" Apr 16 14:53:11.634686 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:11.634270 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:11.634686 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:11.634308 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:11.634686 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:11.634333 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert podName:87ff3a2a-3409-4acc-8192-f4db952ccdcf nodeName:}" failed. No retries permitted until 2026-04-16 14:53:19.634318674 +0000 UTC m=+49.130645881 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert") pod "ingress-canary-jh4sv" (UID: "87ff3a2a-3409-4acc-8192-f4db952ccdcf") : secret "canary-serving-cert" not found Apr 16 14:53:11.634686 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:11.634361 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls podName:4bcbf238-7f5c-47ac-b42a-0e299ed29df0 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:19.634344414 +0000 UTC m=+49.130671621 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls") pod "dns-default-8pv4t" (UID: "4bcbf238-7f5c-47ac-b42a-0e299ed29df0") : secret "dns-default-metrics-tls" not found Apr 16 14:53:18.272649 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:18.272611 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ada98a0b-3e12-4c80-9534-e61848c60c06-original-pull-secret\") pod \"global-pull-secret-syncer-nz48r\" (UID: \"ada98a0b-3e12-4c80-9534-e61848c60c06\") " pod="kube-system/global-pull-secret-syncer-nz48r" Apr 16 14:53:18.278773 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:18.278745 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ada98a0b-3e12-4c80-9534-e61848c60c06-original-pull-secret\") pod \"global-pull-secret-syncer-nz48r\" (UID: \"ada98a0b-3e12-4c80-9534-e61848c60c06\") " pod="kube-system/global-pull-secret-syncer-nz48r" Apr 16 14:53:18.571566 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:18.571496 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-nz48r" Apr 16 14:53:18.701755 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:18.701723 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-nz48r"] Apr 16 14:53:18.704556 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:53:18.704530 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podada98a0b_3e12_4c80_9534_e61848c60c06.slice/crio-ed8601af37aea60c7e4373ccf3f6678abf1c3668b04a5b258013ea5befaa359a WatchSource:0}: Error finding container ed8601af37aea60c7e4373ccf3f6678abf1c3668b04a5b258013ea5befaa359a: Status 404 returned error can't find the container with id ed8601af37aea60c7e4373ccf3f6678abf1c3668b04a5b258013ea5befaa359a Apr 16 14:53:19.434809 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:19.434764 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-nz48r" event={"ID":"ada98a0b-3e12-4c80-9534-e61848c60c06","Type":"ContainerStarted","Data":"ed8601af37aea60c7e4373ccf3f6678abf1c3668b04a5b258013ea5befaa359a"} Apr 16 14:53:19.682204 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:19.682166 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert\") pod \"ingress-canary-jh4sv\" (UID: \"87ff3a2a-3409-4acc-8192-f4db952ccdcf\") " pod="openshift-ingress-canary/ingress-canary-jh4sv" Apr 16 14:53:19.682373 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:19.682220 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls\") pod \"dns-default-8pv4t\" (UID: \"4bcbf238-7f5c-47ac-b42a-0e299ed29df0\") " pod="openshift-dns/dns-default-8pv4t" Apr 16 14:53:19.682373 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:19.682327 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:19.682373 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:19.682340 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:19.682525 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:19.682405 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert podName:87ff3a2a-3409-4acc-8192-f4db952ccdcf nodeName:}" failed. No retries permitted until 2026-04-16 14:53:35.682380548 +0000 UTC m=+65.178707758 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert") pod "ingress-canary-jh4sv" (UID: "87ff3a2a-3409-4acc-8192-f4db952ccdcf") : secret "canary-serving-cert" not found Apr 16 14:53:19.682525 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:19.682435 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls podName:4bcbf238-7f5c-47ac-b42a-0e299ed29df0 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:35.682416791 +0000 UTC m=+65.178743998 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls") pod "dns-default-8pv4t" (UID: "4bcbf238-7f5c-47ac-b42a-0e299ed29df0") : secret "dns-default-metrics-tls" not found Apr 16 14:53:23.443077 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:23.443039 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-nz48r" event={"ID":"ada98a0b-3e12-4c80-9534-e61848c60c06","Type":"ContainerStarted","Data":"0f44b5e98cbc98c438da2ad0c3d4bcb605273d32fb0cbcba7df1dcc8ae634211"} Apr 16 14:53:23.457808 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:23.457763 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-nz48r" podStartSLOduration=33.570891953 podStartE2EDuration="37.45775047s" podCreationTimestamp="2026-04-16 14:52:46 +0000 UTC" firstStartedPulling="2026-04-16 14:53:18.706363859 +0000 UTC m=+48.202691081" lastFinishedPulling="2026-04-16 14:53:22.593222391 +0000 UTC m=+52.089549598" observedRunningTime="2026-04-16 14:53:23.456920965 +0000 UTC m=+52.953248194" watchObservedRunningTime="2026-04-16 14:53:23.45775047 +0000 UTC m=+52.954077699" Apr 16 14:53:31.740533 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:31.740504 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fzdmh" Apr 16 14:53:35.684121 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:35.684084 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert\") pod \"ingress-canary-jh4sv\" (UID: \"87ff3a2a-3409-4acc-8192-f4db952ccdcf\") " pod="openshift-ingress-canary/ingress-canary-jh4sv" Apr 16 14:53:35.684121 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:35.684132 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls\") pod \"dns-default-8pv4t\" (UID: \"4bcbf238-7f5c-47ac-b42a-0e299ed29df0\") " pod="openshift-dns/dns-default-8pv4t" Apr 16 14:53:35.684602 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:35.684232 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:35.684602 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:35.684235 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:35.684602 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:35.684296 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert podName:87ff3a2a-3409-4acc-8192-f4db952ccdcf nodeName:}" failed. No retries permitted until 2026-04-16 14:54:07.684280155 +0000 UTC m=+97.180607363 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert") pod "ingress-canary-jh4sv" (UID: "87ff3a2a-3409-4acc-8192-f4db952ccdcf") : secret "canary-serving-cert" not found Apr 16 14:53:35.684602 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:35.684313 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls podName:4bcbf238-7f5c-47ac-b42a-0e299ed29df0 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:54:07.684305748 +0000 UTC m=+97.180632956 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls") pod "dns-default-8pv4t" (UID: "4bcbf238-7f5c-47ac-b42a-0e299ed29df0") : secret "dns-default-metrics-tls" not found Apr 16 14:53:36.791082 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:36.791039 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs\") pod \"network-metrics-daemon-6ltjv\" (UID: \"3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e\") " pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:53:36.791434 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:36.791178 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:53:36.791434 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:53:36.791254 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs podName:3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e nodeName:}" failed. No retries permitted until 2026-04-16 14:54:40.791238769 +0000 UTC m=+130.287565976 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs") pod "network-metrics-daemon-6ltjv" (UID: "3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e") : secret "metrics-daemon-secret" not found Apr 16 14:53:40.419082 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:53:40.419046 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-zkgj2" Apr 16 14:54:07.692306 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:54:07.692261 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert\") pod \"ingress-canary-jh4sv\" (UID: \"87ff3a2a-3409-4acc-8192-f4db952ccdcf\") " pod="openshift-ingress-canary/ingress-canary-jh4sv" Apr 16 14:54:07.692306 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:54:07.692319 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls\") pod \"dns-default-8pv4t\" (UID: \"4bcbf238-7f5c-47ac-b42a-0e299ed29df0\") " pod="openshift-dns/dns-default-8pv4t" Apr 16 14:54:07.692868 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:54:07.692411 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:54:07.692868 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:54:07.692480 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert podName:87ff3a2a-3409-4acc-8192-f4db952ccdcf nodeName:}" failed. No retries permitted until 2026-04-16 14:55:11.692464996 +0000 UTC m=+161.188792203 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert") pod "ingress-canary-jh4sv" (UID: "87ff3a2a-3409-4acc-8192-f4db952ccdcf") : secret "canary-serving-cert" not found Apr 16 14:54:07.692868 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:54:07.692411 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:54:07.692868 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:54:07.692560 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls podName:4bcbf238-7f5c-47ac-b42a-0e299ed29df0 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:11.692549387 +0000 UTC m=+161.188876598 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls") pod "dns-default-8pv4t" (UID: "4bcbf238-7f5c-47ac-b42a-0e299ed29df0") : secret "dns-default-metrics-tls" not found Apr 16 14:54:40.804007 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:54:40.803968 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs\") pod \"network-metrics-daemon-6ltjv\" (UID: \"3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e\") " pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:54:40.804487 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:54:40.804106 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:54:40.804487 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:54:40.804169 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs podName:3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e nodeName:}" failed. No retries permitted until 2026-04-16 14:56:42.804155109 +0000 UTC m=+252.300482316 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs") pod "network-metrics-daemon-6ltjv" (UID: "3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e") : secret "metrics-daemon-secret" not found Apr 16 14:55:00.528775 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.528746 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-n7xxg"] Apr 16 14:55:00.531580 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.531560 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-n7xxg" Apr 16 14:55:00.531698 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.531577 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-b9d79f856-c56q2"] Apr 16 14:55:00.534026 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.534005 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 14:55:00.534127 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.534075 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-lmnhc\"" Apr 16 14:55:00.534315 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.534297 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 14:55:00.534397 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.534297 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 14:55:00.534397 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.534348 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-b9d79f856-c56q2" Apr 16 14:55:00.537076 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.535859 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 14:55:00.537659 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.537583 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-46hmh\"" Apr 16 14:55:00.538602 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.538534 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 14:55:00.538602 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.538586 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 14:55:00.539442 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.539133 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 14:55:00.539442 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.539435 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 14:55:00.539727 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.539673 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 14:55:00.541156 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.540740 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 14:55:00.542427 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.542405 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-n7xxg"] Apr 16 14:55:00.543721 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.543698 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 14:55:00.546619 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.546602 2582 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-ingress/router-default-b9d79f856-c56q2"] Apr 16 14:55:00.635127 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.635101 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/605f32ed-53b6-48df-8568-937ded360dd9-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-n7xxg\" (UID: \"605f32ed-53b6-48df-8568-937ded360dd9\") " pod="openshift-insights/insights-operator-5785d4fcdd-n7xxg" Apr 16 14:55:00.635227 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.635132 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/605f32ed-53b6-48df-8568-937ded360dd9-serving-cert\") pod \"insights-operator-5785d4fcdd-n7xxg\" (UID: \"605f32ed-53b6-48df-8568-937ded360dd9\") " pod="openshift-insights/insights-operator-5785d4fcdd-n7xxg" Apr 16 14:55:00.635227 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.635149 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b6bb918-ac04-4256-9242-dd810c7e754e-metrics-certs\") pod \"router-default-b9d79f856-c56q2\" (UID: \"5b6bb918-ac04-4256-9242-dd810c7e754e\") " pod="openshift-ingress/router-default-b9d79f856-c56q2" Apr 16 14:55:00.635227 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.635172 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/605f32ed-53b6-48df-8568-937ded360dd9-tmp\") pod \"insights-operator-5785d4fcdd-n7xxg\" (UID: \"605f32ed-53b6-48df-8568-937ded360dd9\") " pod="openshift-insights/insights-operator-5785d4fcdd-n7xxg" Apr 16 14:55:00.635227 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.635210 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvnm5\" (UniqueName: \"kubernetes.io/projected/605f32ed-53b6-48df-8568-937ded360dd9-kube-api-access-tvnm5\") pod \"insights-operator-5785d4fcdd-n7xxg\" (UID: \"605f32ed-53b6-48df-8568-937ded360dd9\") " pod="openshift-insights/insights-operator-5785d4fcdd-n7xxg" Apr 16 14:55:00.635227 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.635226 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5b6bb918-ac04-4256-9242-dd810c7e754e-stats-auth\") pod \"router-default-b9d79f856-c56q2\" (UID: \"5b6bb918-ac04-4256-9242-dd810c7e754e\") " pod="openshift-ingress/router-default-b9d79f856-c56q2" Apr 16 14:55:00.635396 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.635247 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd4w9\" (UniqueName: \"kubernetes.io/projected/5b6bb918-ac04-4256-9242-dd810c7e754e-kube-api-access-hd4w9\") pod \"router-default-b9d79f856-c56q2\" (UID: \"5b6bb918-ac04-4256-9242-dd810c7e754e\") " pod="openshift-ingress/router-default-b9d79f856-c56q2" Apr 16 14:55:00.635396 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.635298 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/605f32ed-53b6-48df-8568-937ded360dd9-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-n7xxg\" (UID: 
\"605f32ed-53b6-48df-8568-937ded360dd9\") " pod="openshift-insights/insights-operator-5785d4fcdd-n7xxg" Apr 16 14:55:00.635396 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.635337 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5b6bb918-ac04-4256-9242-dd810c7e754e-default-certificate\") pod \"router-default-b9d79f856-c56q2\" (UID: \"5b6bb918-ac04-4256-9242-dd810c7e754e\") " pod="openshift-ingress/router-default-b9d79f856-c56q2" Apr 16 14:55:00.635396 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.635366 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/605f32ed-53b6-48df-8568-937ded360dd9-snapshots\") pod \"insights-operator-5785d4fcdd-n7xxg\" (UID: \"605f32ed-53b6-48df-8568-937ded360dd9\") " pod="openshift-insights/insights-operator-5785d4fcdd-n7xxg" Apr 16 14:55:00.635396 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.635394 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b6bb918-ac04-4256-9242-dd810c7e754e-service-ca-bundle\") pod \"router-default-b9d79f856-c56q2\" (UID: \"5b6bb918-ac04-4256-9242-dd810c7e754e\") " pod="openshift-ingress/router-default-b9d79f856-c56q2" Apr 16 14:55:00.736399 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.736374 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/605f32ed-53b6-48df-8568-937ded360dd9-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-n7xxg\" (UID: \"605f32ed-53b6-48df-8568-937ded360dd9\") " pod="openshift-insights/insights-operator-5785d4fcdd-n7xxg" Apr 16 14:55:00.736501 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.736402 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/605f32ed-53b6-48df-8568-937ded360dd9-serving-cert\") pod \"insights-operator-5785d4fcdd-n7xxg\" (UID: \"605f32ed-53b6-48df-8568-937ded360dd9\") " pod="openshift-insights/insights-operator-5785d4fcdd-n7xxg" Apr 16 14:55:00.736501 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.736417 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b6bb918-ac04-4256-9242-dd810c7e754e-metrics-certs\") pod \"router-default-b9d79f856-c56q2\" (UID: \"5b6bb918-ac04-4256-9242-dd810c7e754e\") " pod="openshift-ingress/router-default-b9d79f856-c56q2" Apr 16 14:55:00.736501 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.736433 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/605f32ed-53b6-48df-8568-937ded360dd9-tmp\") pod \"insights-operator-5785d4fcdd-n7xxg\" (UID: \"605f32ed-53b6-48df-8568-937ded360dd9\") " pod="openshift-insights/insights-operator-5785d4fcdd-n7xxg" Apr 16 14:55:00.736501 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.736485 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tvnm5\" (UniqueName: \"kubernetes.io/projected/605f32ed-53b6-48df-8568-937ded360dd9-kube-api-access-tvnm5\") pod \"insights-operator-5785d4fcdd-n7xxg\" (UID: \"605f32ed-53b6-48df-8568-937ded360dd9\") " pod="openshift-insights/insights-operator-5785d4fcdd-n7xxg" 
Apr 16 14:55:00.736706 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.736518 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5b6bb918-ac04-4256-9242-dd810c7e754e-stats-auth\") pod \"router-default-b9d79f856-c56q2\" (UID: \"5b6bb918-ac04-4256-9242-dd810c7e754e\") " pod="openshift-ingress/router-default-b9d79f856-c56q2" Apr 16 14:55:00.736706 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.736577 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hd4w9\" (UniqueName: \"kubernetes.io/projected/5b6bb918-ac04-4256-9242-dd810c7e754e-kube-api-access-hd4w9\") pod \"router-default-b9d79f856-c56q2\" (UID: \"5b6bb918-ac04-4256-9242-dd810c7e754e\") " pod="openshift-ingress/router-default-b9d79f856-c56q2" Apr 16 14:55:00.736706 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:00.736525 2582 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 14:55:00.736706 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.736609 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/605f32ed-53b6-48df-8568-937ded360dd9-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-n7xxg\" (UID: \"605f32ed-53b6-48df-8568-937ded360dd9\") " pod="openshift-insights/insights-operator-5785d4fcdd-n7xxg" Apr 16 14:55:00.736706 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.736642 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5b6bb918-ac04-4256-9242-dd810c7e754e-default-certificate\") pod \"router-default-b9d79f856-c56q2\" (UID: \"5b6bb918-ac04-4256-9242-dd810c7e754e\") " pod="openshift-ingress/router-default-b9d79f856-c56q2" Apr 16 14:55:00.736706 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:00.736677 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b6bb918-ac04-4256-9242-dd810c7e754e-metrics-certs podName:5b6bb918-ac04-4256-9242-dd810c7e754e nodeName:}" failed. No retries permitted until 2026-04-16 14:55:01.236656762 +0000 UTC m=+150.732983973 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b6bb918-ac04-4256-9242-dd810c7e754e-metrics-certs") pod "router-default-b9d79f856-c56q2" (UID: "5b6bb918-ac04-4256-9242-dd810c7e754e") : secret "router-metrics-certs-default" not found Apr 16 14:55:00.737043 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.736713 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/605f32ed-53b6-48df-8568-937ded360dd9-snapshots\") pod \"insights-operator-5785d4fcdd-n7xxg\" (UID: \"605f32ed-53b6-48df-8568-937ded360dd9\") " pod="openshift-insights/insights-operator-5785d4fcdd-n7xxg" Apr 16 14:55:00.737043 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.736737 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b6bb918-ac04-4256-9242-dd810c7e754e-service-ca-bundle\") pod \"router-default-b9d79f856-c56q2\" (UID: \"5b6bb918-ac04-4256-9242-dd810c7e754e\") " pod="openshift-ingress/router-default-b9d79f856-c56q2" Apr 16 14:55:00.737043 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.736806 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/605f32ed-53b6-48df-8568-937ded360dd9-tmp\") pod \"insights-operator-5785d4fcdd-n7xxg\" (UID: \"605f32ed-53b6-48df-8568-937ded360dd9\") " pod="openshift-insights/insights-operator-5785d4fcdd-n7xxg" Apr 16 14:55:00.737043 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:00.736912 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5b6bb918-ac04-4256-9242-dd810c7e754e-service-ca-bundle podName:5b6bb918-ac04-4256-9242-dd810c7e754e nodeName:}" failed. No retries permitted until 2026-04-16 14:55:01.236897676 +0000 UTC m=+150.733224888 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/5b6bb918-ac04-4256-9242-dd810c7e754e-service-ca-bundle") pod "router-default-b9d79f856-c56q2" (UID: "5b6bb918-ac04-4256-9242-dd810c7e754e") : configmap references non-existent config key: service-ca.crt Apr 16 14:55:00.738376 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.738354 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/605f32ed-53b6-48df-8568-937ded360dd9-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-n7xxg\" (UID: \"605f32ed-53b6-48df-8568-937ded360dd9\") " pod="openshift-insights/insights-operator-5785d4fcdd-n7xxg" Apr 16 14:55:00.739375 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.739345 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5b6bb918-ac04-4256-9242-dd810c7e754e-stats-auth\") pod \"router-default-b9d79f856-c56q2\" (UID: \"5b6bb918-ac04-4256-9242-dd810c7e754e\") " pod="openshift-ingress/router-default-b9d79f856-c56q2" Apr 16 14:55:00.739569 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.739554 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5b6bb918-ac04-4256-9242-dd810c7e754e-default-certificate\") pod \"router-default-b9d79f856-c56q2\" (UID: \"5b6bb918-ac04-4256-9242-dd810c7e754e\") " pod="openshift-ingress/router-default-b9d79f856-c56q2" Apr 16 14:55:00.741911 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.741890 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/605f32ed-53b6-48df-8568-937ded360dd9-serving-cert\") pod \"insights-operator-5785d4fcdd-n7xxg\" (UID: \"605f32ed-53b6-48df-8568-937ded360dd9\") " pod="openshift-insights/insights-operator-5785d4fcdd-n7xxg" Apr 16 14:55:00.741911 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.741904 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/605f32ed-53b6-48df-8568-937ded360dd9-snapshots\") pod \"insights-operator-5785d4fcdd-n7xxg\" (UID: \"605f32ed-53b6-48df-8568-937ded360dd9\") " pod="openshift-insights/insights-operator-5785d4fcdd-n7xxg" Apr 16 14:55:00.742025 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.741956 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/605f32ed-53b6-48df-8568-937ded360dd9-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-n7xxg\" (UID: \"605f32ed-53b6-48df-8568-937ded360dd9\") " pod="openshift-insights/insights-operator-5785d4fcdd-n7xxg" Apr 16 14:55:00.742128 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.742112 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-5qmss"] Apr 16 14:55:00.746038 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.746022 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-5qmss" Apr 16 14:55:00.751211 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.751192 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-lljr5\"" Apr 16 14:55:00.751310 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.751227 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 14:55:00.751310 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.751201 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 14:55:00.751420 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.751327 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:55:00.762296 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.762276 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-5qmss"] Apr 16 14:55:00.769596 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.769574 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvnm5\" (UniqueName: \"kubernetes.io/projected/605f32ed-53b6-48df-8568-937ded360dd9-kube-api-access-tvnm5\") pod \"insights-operator-5785d4fcdd-n7xxg\" (UID: \"605f32ed-53b6-48df-8568-937ded360dd9\") " pod="openshift-insights/insights-operator-5785d4fcdd-n7xxg" Apr 16 14:55:00.772938 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.772920 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd4w9\" (UniqueName: \"kubernetes.io/projected/5b6bb918-ac04-4256-9242-dd810c7e754e-kube-api-access-hd4w9\") pod \"router-default-b9d79f856-c56q2\" (UID: \"5b6bb918-ac04-4256-9242-dd810c7e754e\") " pod="openshift-ingress/router-default-b9d79f856-c56q2" Apr 16 14:55:00.830254 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.830191 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-647ss"] Apr 16 14:55:00.832967 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.832954 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-647ss" Apr 16 14:55:00.835267 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.835225 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 14:55:00.835359 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.835343 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 14:55:00.835422 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.835368 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-bjjgn\"" Apr 16 14:55:00.835603 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.835589 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:55:00.835694 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.835661 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 14:55:00.842328 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.842310 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-647ss"] Apr 16 14:55:00.844613 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.844595 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-n7xxg" Apr 16 14:55:00.937798 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.937763 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n55kt\" (UniqueName: \"kubernetes.io/projected/d385c18d-5b59-4b98-a975-22594471a5b7-kube-api-access-n55kt\") pod \"service-ca-operator-69965bb79d-647ss\" (UID: \"d385c18d-5b59-4b98-a975-22594471a5b7\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-647ss" Apr 16 14:55:00.937929 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.937907 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bcdd05bc-1066-4ec8-b940-2a95cd94c623-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-5qmss\" (UID: \"bcdd05bc-1066-4ec8-b940-2a95cd94c623\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-5qmss" Apr 16 14:55:00.937983 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.937934 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzjwm\" (UniqueName: \"kubernetes.io/projected/bcdd05bc-1066-4ec8-b940-2a95cd94c623-kube-api-access-zzjwm\") pod \"cluster-samples-operator-667775844f-5qmss\" (UID: \"bcdd05bc-1066-4ec8-b940-2a95cd94c623\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-5qmss" Apr 16 14:55:00.937983 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.937951 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d385c18d-5b59-4b98-a975-22594471a5b7-serving-cert\") pod \"service-ca-operator-69965bb79d-647ss\" (UID: \"d385c18d-5b59-4b98-a975-22594471a5b7\") " 
pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-647ss" Apr 16 14:55:00.938047 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.937988 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d385c18d-5b59-4b98-a975-22594471a5b7-config\") pod \"service-ca-operator-69965bb79d-647ss\" (UID: \"d385c18d-5b59-4b98-a975-22594471a5b7\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-647ss" Apr 16 14:55:00.959482 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:00.959455 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-n7xxg"] Apr 16 14:55:00.962218 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:55:00.962194 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod605f32ed_53b6_48df_8568_937ded360dd9.slice/crio-591aabeecb3a2f51a6b0364ec2e04cf5776423fed46ab4ec3838571d7ee7e01b WatchSource:0}: Error finding container 591aabeecb3a2f51a6b0364ec2e04cf5776423fed46ab4ec3838571d7ee7e01b: Status 404 returned error can't find the container with id 591aabeecb3a2f51a6b0364ec2e04cf5776423fed46ab4ec3838571d7ee7e01b Apr 16 14:55:01.038590 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:01.038564 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bcdd05bc-1066-4ec8-b940-2a95cd94c623-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-5qmss\" (UID: \"bcdd05bc-1066-4ec8-b940-2a95cd94c623\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-5qmss" Apr 16 14:55:01.038685 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:01.038590 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzjwm\" (UniqueName: \"kubernetes.io/projected/bcdd05bc-1066-4ec8-b940-2a95cd94c623-kube-api-access-zzjwm\") pod \"cluster-samples-operator-667775844f-5qmss\" (UID: \"bcdd05bc-1066-4ec8-b940-2a95cd94c623\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-5qmss" Apr 16 14:55:01.038685 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:01.038614 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d385c18d-5b59-4b98-a975-22594471a5b7-serving-cert\") pod \"service-ca-operator-69965bb79d-647ss\" (UID: \"d385c18d-5b59-4b98-a975-22594471a5b7\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-647ss" Apr 16 14:55:01.038685 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:01.038654 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d385c18d-5b59-4b98-a975-22594471a5b7-config\") pod \"service-ca-operator-69965bb79d-647ss\" (UID: \"d385c18d-5b59-4b98-a975-22594471a5b7\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-647ss" Apr 16 14:55:01.038814 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:01.038702 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n55kt\" (UniqueName: \"kubernetes.io/projected/d385c18d-5b59-4b98-a975-22594471a5b7-kube-api-access-n55kt\") pod \"service-ca-operator-69965bb79d-647ss\" (UID: \"d385c18d-5b59-4b98-a975-22594471a5b7\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-647ss" Apr 16 
14:55:01.038814 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:01.038710 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:55:01.038814 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:01.038771 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcdd05bc-1066-4ec8-b940-2a95cd94c623-samples-operator-tls podName:bcdd05bc-1066-4ec8-b940-2a95cd94c623 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:01.538753771 +0000 UTC m=+151.035080999 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/bcdd05bc-1066-4ec8-b940-2a95cd94c623-samples-operator-tls") pod "cluster-samples-operator-667775844f-5qmss" (UID: "bcdd05bc-1066-4ec8-b940-2a95cd94c623") : secret "samples-operator-tls" not found Apr 16 14:55:01.039213 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:01.039194 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d385c18d-5b59-4b98-a975-22594471a5b7-config\") pod \"service-ca-operator-69965bb79d-647ss\" (UID: \"d385c18d-5b59-4b98-a975-22594471a5b7\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-647ss" Apr 16 14:55:01.040846 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:01.040794 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d385c18d-5b59-4b98-a975-22594471a5b7-serving-cert\") pod \"service-ca-operator-69965bb79d-647ss\" (UID: \"d385c18d-5b59-4b98-a975-22594471a5b7\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-647ss" Apr 16 14:55:01.047939 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:01.047918 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n55kt\" (UniqueName: \"kubernetes.io/projected/d385c18d-5b59-4b98-a975-22594471a5b7-kube-api-access-n55kt\") pod \"service-ca-operator-69965bb79d-647ss\" (UID: \"d385c18d-5b59-4b98-a975-22594471a5b7\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-647ss" Apr 16 14:55:01.048445 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:01.048424 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzjwm\" (UniqueName: \"kubernetes.io/projected/bcdd05bc-1066-4ec8-b940-2a95cd94c623-kube-api-access-zzjwm\") pod \"cluster-samples-operator-667775844f-5qmss\" (UID: \"bcdd05bc-1066-4ec8-b940-2a95cd94c623\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-5qmss" Apr 16 14:55:01.141070 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:01.141053 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-647ss" Apr 16 14:55:01.239545 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:01.239518 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b6bb918-ac04-4256-9242-dd810c7e754e-service-ca-bundle\") pod \"router-default-b9d79f856-c56q2\" (UID: \"5b6bb918-ac04-4256-9242-dd810c7e754e\") " pod="openshift-ingress/router-default-b9d79f856-c56q2" Apr 16 14:55:01.239655 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:01.239582 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b6bb918-ac04-4256-9242-dd810c7e754e-metrics-certs\") pod \"router-default-b9d79f856-c56q2\" (UID: \"5b6bb918-ac04-4256-9242-dd810c7e754e\") " pod="openshift-ingress/router-default-b9d79f856-c56q2" Apr 16 14:55:01.239706 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:01.239694 2582 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 14:55:01.239741 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:01.239705 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5b6bb918-ac04-4256-9242-dd810c7e754e-service-ca-bundle podName:5b6bb918-ac04-4256-9242-dd810c7e754e nodeName:}" failed. No retries permitted until 2026-04-16 14:55:02.239670716 +0000 UTC m=+151.735997922 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/5b6bb918-ac04-4256-9242-dd810c7e754e-service-ca-bundle") pod "router-default-b9d79f856-c56q2" (UID: "5b6bb918-ac04-4256-9242-dd810c7e754e") : configmap references non-existent config key: service-ca.crt Apr 16 14:55:01.239786 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:01.239756 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b6bb918-ac04-4256-9242-dd810c7e754e-metrics-certs podName:5b6bb918-ac04-4256-9242-dd810c7e754e nodeName:}" failed. No retries permitted until 2026-04-16 14:55:02.239739727 +0000 UTC m=+151.736066934 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b6bb918-ac04-4256-9242-dd810c7e754e-metrics-certs") pod "router-default-b9d79f856-c56q2" (UID: "5b6bb918-ac04-4256-9242-dd810c7e754e") : secret "router-metrics-certs-default" not found Apr 16 14:55:01.253057 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:01.253033 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-647ss"] Apr 16 14:55:01.256694 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:55:01.256665 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd385c18d_5b59_4b98_a975_22594471a5b7.slice/crio-6c2d5d5fd6558e21ad070b0e97aa775973c05c52944977e1eaf548cac4258172 WatchSource:0}: Error finding container 6c2d5d5fd6558e21ad070b0e97aa775973c05c52944977e1eaf548cac4258172: Status 404 returned error can't find the container with id 6c2d5d5fd6558e21ad070b0e97aa775973c05c52944977e1eaf548cac4258172 Apr 16 14:55:01.542192 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:01.542119 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bcdd05bc-1066-4ec8-b940-2a95cd94c623-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-5qmss\" (UID: \"bcdd05bc-1066-4ec8-b940-2a95cd94c623\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-5qmss" Apr 16 14:55:01.542572 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:01.542277 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:55:01.542572 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:01.542339 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcdd05bc-1066-4ec8-b940-2a95cd94c623-samples-operator-tls podName:bcdd05bc-1066-4ec8-b940-2a95cd94c623 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:02.542323369 +0000 UTC m=+152.038650579 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/bcdd05bc-1066-4ec8-b940-2a95cd94c623-samples-operator-tls") pod "cluster-samples-operator-667775844f-5qmss" (UID: "bcdd05bc-1066-4ec8-b940-2a95cd94c623") : secret "samples-operator-tls" not found Apr 16 14:55:01.623810 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:01.623768 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-n7xxg" event={"ID":"605f32ed-53b6-48df-8568-937ded360dd9","Type":"ContainerStarted","Data":"591aabeecb3a2f51a6b0364ec2e04cf5776423fed46ab4ec3838571d7ee7e01b"} Apr 16 14:55:01.624891 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:01.624857 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-647ss" event={"ID":"d385c18d-5b59-4b98-a975-22594471a5b7","Type":"ContainerStarted","Data":"6c2d5d5fd6558e21ad070b0e97aa775973c05c52944977e1eaf548cac4258172"} Apr 16 14:55:02.247304 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:02.247266 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b6bb918-ac04-4256-9242-dd810c7e754e-service-ca-bundle\") pod \"router-default-b9d79f856-c56q2\" (UID: \"5b6bb918-ac04-4256-9242-dd810c7e754e\") " pod="openshift-ingress/router-default-b9d79f856-c56q2" Apr 16 14:55:02.247462 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:02.247363 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b6bb918-ac04-4256-9242-dd810c7e754e-metrics-certs\") pod \"router-default-b9d79f856-c56q2\" (UID: \"5b6bb918-ac04-4256-9242-dd810c7e754e\") " pod="openshift-ingress/router-default-b9d79f856-c56q2" Apr 16 14:55:02.247462 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:02.247451 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5b6bb918-ac04-4256-9242-dd810c7e754e-service-ca-bundle podName:5b6bb918-ac04-4256-9242-dd810c7e754e nodeName:}" failed. No retries permitted until 2026-04-16 14:55:04.247434071 +0000 UTC m=+153.743761283 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/5b6bb918-ac04-4256-9242-dd810c7e754e-service-ca-bundle") pod "router-default-b9d79f856-c56q2" (UID: "5b6bb918-ac04-4256-9242-dd810c7e754e") : configmap references non-existent config key: service-ca.crt Apr 16 14:55:02.247563 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:02.247504 2582 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 14:55:02.247563 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:02.247562 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b6bb918-ac04-4256-9242-dd810c7e754e-metrics-certs podName:5b6bb918-ac04-4256-9242-dd810c7e754e nodeName:}" failed. No retries permitted until 2026-04-16 14:55:04.247548895 +0000 UTC m=+153.743876105 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b6bb918-ac04-4256-9242-dd810c7e754e-metrics-certs") pod "router-default-b9d79f856-c56q2" (UID: "5b6bb918-ac04-4256-9242-dd810c7e754e") : secret "router-metrics-certs-default" not found Apr 16 14:55:02.549329 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:02.549239 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bcdd05bc-1066-4ec8-b940-2a95cd94c623-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-5qmss\" (UID: \"bcdd05bc-1066-4ec8-b940-2a95cd94c623\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-5qmss" Apr 16 14:55:02.549721 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:02.549378 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:55:02.549721 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:02.549443 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcdd05bc-1066-4ec8-b940-2a95cd94c623-samples-operator-tls podName:bcdd05bc-1066-4ec8-b940-2a95cd94c623 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:04.549426156 +0000 UTC m=+154.045753365 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/bcdd05bc-1066-4ec8-b940-2a95cd94c623-samples-operator-tls") pod "cluster-samples-operator-667775844f-5qmss" (UID: "bcdd05bc-1066-4ec8-b940-2a95cd94c623") : secret "samples-operator-tls" not found Apr 16 14:55:03.630429 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:03.630396 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-n7xxg" event={"ID":"605f32ed-53b6-48df-8568-937ded360dd9","Type":"ContainerStarted","Data":"ad8381ada1bc971c8a4580b61d84753d4041a56dbab2d9682ede92285305119a"} Apr 16 14:55:03.631746 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:03.631724 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-647ss" event={"ID":"d385c18d-5b59-4b98-a975-22594471a5b7","Type":"ContainerStarted","Data":"b67165c98d059c2de3250de59c75d6eaf0e2dfa3d923a34cb43a6cef93432b8d"} Apr 16 14:55:03.646699 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:03.646649 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-n7xxg" podStartSLOduration=1.327412132 podStartE2EDuration="3.646634878s" podCreationTimestamp="2026-04-16 14:55:00 +0000 UTC" firstStartedPulling="2026-04-16 14:55:00.963811599 +0000 UTC m=+150.460138806" lastFinishedPulling="2026-04-16 14:55:03.283034345 +0000 UTC m=+152.779361552" observedRunningTime="2026-04-16 14:55:03.645984093 +0000 UTC m=+153.142311320" watchObservedRunningTime="2026-04-16 14:55:03.646634878 +0000 UTC m=+153.142962104" Apr 16 14:55:03.660332 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:03.660293 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-647ss" podStartSLOduration=1.633788952 podStartE2EDuration="3.660281401s" podCreationTimestamp="2026-04-16 14:55:00 +0000 UTC" firstStartedPulling="2026-04-16 14:55:01.258875704 +0000 UTC m=+150.755202912" lastFinishedPulling="2026-04-16 14:55:03.285368151 +0000 UTC m=+152.781695361" 
observedRunningTime="2026-04-16 14:55:03.6596383 +0000 UTC m=+153.155965534" watchObservedRunningTime="2026-04-16 14:55:03.660281401 +0000 UTC m=+153.156608630" Apr 16 14:55:04.011413 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:04.011341 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-s7ggn"] Apr 16 14:55:04.014495 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:04.014481 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-s7ggn" Apr 16 14:55:04.016989 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:04.016960 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-5fng9\"" Apr 16 14:55:04.025049 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:04.025026 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-s7ggn"] Apr 16 14:55:04.162675 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:04.162641 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4md22\" (UniqueName: \"kubernetes.io/projected/7a22f03f-cc7f-4423-a17c-e15045b60dbe-kube-api-access-4md22\") pod \"network-check-source-7b678d77c7-s7ggn\" (UID: \"7a22f03f-cc7f-4423-a17c-e15045b60dbe\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-s7ggn" Apr 16 14:55:04.263650 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:04.263590 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b6bb918-ac04-4256-9242-dd810c7e754e-metrics-certs\") pod \"router-default-b9d79f856-c56q2\" (UID: \"5b6bb918-ac04-4256-9242-dd810c7e754e\") " pod="openshift-ingress/router-default-b9d79f856-c56q2" Apr 16 14:55:04.263650 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:04.263639 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4md22\" (UniqueName: \"kubernetes.io/projected/7a22f03f-cc7f-4423-a17c-e15045b60dbe-kube-api-access-4md22\") pod \"network-check-source-7b678d77c7-s7ggn\" (UID: \"7a22f03f-cc7f-4423-a17c-e15045b60dbe\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-s7ggn" Apr 16 14:55:04.263849 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:04.263666 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b6bb918-ac04-4256-9242-dd810c7e754e-service-ca-bundle\") pod \"router-default-b9d79f856-c56q2\" (UID: \"5b6bb918-ac04-4256-9242-dd810c7e754e\") " pod="openshift-ingress/router-default-b9d79f856-c56q2" Apr 16 14:55:04.263849 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:04.263747 2582 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 14:55:04.263849 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:04.263782 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5b6bb918-ac04-4256-9242-dd810c7e754e-service-ca-bundle podName:5b6bb918-ac04-4256-9242-dd810c7e754e nodeName:}" failed. No retries permitted until 2026-04-16 14:55:08.263768648 +0000 UTC m=+157.760095856 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/5b6bb918-ac04-4256-9242-dd810c7e754e-service-ca-bundle") pod "router-default-b9d79f856-c56q2" (UID: "5b6bb918-ac04-4256-9242-dd810c7e754e") : configmap references non-existent config key: service-ca.crt Apr 16 14:55:04.263849 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:04.263804 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b6bb918-ac04-4256-9242-dd810c7e754e-metrics-certs podName:5b6bb918-ac04-4256-9242-dd810c7e754e nodeName:}" failed. No retries permitted until 2026-04-16 14:55:08.263788382 +0000 UTC m=+157.760115595 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b6bb918-ac04-4256-9242-dd810c7e754e-metrics-certs") pod "router-default-b9d79f856-c56q2" (UID: "5b6bb918-ac04-4256-9242-dd810c7e754e") : secret "router-metrics-certs-default" not found Apr 16 14:55:04.272245 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:04.272223 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4md22\" (UniqueName: \"kubernetes.io/projected/7a22f03f-cc7f-4423-a17c-e15045b60dbe-kube-api-access-4md22\") pod \"network-check-source-7b678d77c7-s7ggn\" (UID: \"7a22f03f-cc7f-4423-a17c-e15045b60dbe\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-s7ggn" Apr 16 14:55:04.322643 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:04.322621 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-s7ggn" Apr 16 14:55:04.452018 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:55:04.451984 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a22f03f_cc7f_4423_a17c_e15045b60dbe.slice/crio-660ed6796665da6ebb4ba5b9b57c24cc1f8ae1664c33af0c8e2d4e339fc4f25e WatchSource:0}: Error finding container 660ed6796665da6ebb4ba5b9b57c24cc1f8ae1664c33af0c8e2d4e339fc4f25e: Status 404 returned error can't find the container with id 660ed6796665da6ebb4ba5b9b57c24cc1f8ae1664c33af0c8e2d4e339fc4f25e Apr 16 14:55:04.452467 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:04.452432 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-s7ggn"] Apr 16 14:55:04.566802 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:04.566714 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bcdd05bc-1066-4ec8-b940-2a95cd94c623-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-5qmss\" (UID: \"bcdd05bc-1066-4ec8-b940-2a95cd94c623\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-5qmss" Apr 16 14:55:04.566978 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:04.566875 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:55:04.566978 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:04.566943 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcdd05bc-1066-4ec8-b940-2a95cd94c623-samples-operator-tls podName:bcdd05bc-1066-4ec8-b940-2a95cd94c623 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:08.566921387 +0000 UTC m=+158.063248595 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/bcdd05bc-1066-4ec8-b940-2a95cd94c623-samples-operator-tls") pod "cluster-samples-operator-667775844f-5qmss" (UID: "bcdd05bc-1066-4ec8-b940-2a95cd94c623") : secret "samples-operator-tls" not found Apr 16 14:55:04.636190 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:04.636148 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-s7ggn" event={"ID":"7a22f03f-cc7f-4423-a17c-e15045b60dbe","Type":"ContainerStarted","Data":"b60d9d423549d846f8bb8a2f2f801f3d36e8cb07b9524f1d5919689a9b6cdd90"} Apr 16 14:55:04.636190 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:04.636193 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-s7ggn" event={"ID":"7a22f03f-cc7f-4423-a17c-e15045b60dbe","Type":"ContainerStarted","Data":"660ed6796665da6ebb4ba5b9b57c24cc1f8ae1664c33af0c8e2d4e339fc4f25e"} Apr 16 14:55:04.652729 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:04.652692 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-s7ggn" podStartSLOduration=1.652678696 podStartE2EDuration="1.652678696s" podCreationTimestamp="2026-04-16 14:55:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:55:04.652079678 +0000 UTC m=+154.148406906" watchObservedRunningTime="2026-04-16 14:55:04.652678696 +0000 UTC m=+154.149005924" Apr 16 14:55:04.828782 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:04.828704 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-nssxt"] Apr 16 14:55:04.831796 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:04.831775 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-nssxt" Apr 16 14:55:04.834288 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:04.834267 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 14:55:04.834404 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:04.834297 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 14:55:04.834404 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:04.834327 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-p4zw7\"" Apr 16 14:55:04.839902 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:04.839879 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-nssxt"] Apr 16 14:55:04.970413 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:04.970381 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxxqf\" (UniqueName: \"kubernetes.io/projected/c8cffb83-efd8-43b7-9c39-bf80135be6c8-kube-api-access-zxxqf\") pod \"migrator-64d4d94569-nssxt\" (UID: \"c8cffb83-efd8-43b7-9c39-bf80135be6c8\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-nssxt" Apr 16 14:55:05.071502 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:05.071473 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxxqf\" (UniqueName: \"kubernetes.io/projected/c8cffb83-efd8-43b7-9c39-bf80135be6c8-kube-api-access-zxxqf\") pod \"migrator-64d4d94569-nssxt\" (UID: \"c8cffb83-efd8-43b7-9c39-bf80135be6c8\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-nssxt" Apr 16 14:55:05.078982 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:05.078928 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxxqf\" (UniqueName: \"kubernetes.io/projected/c8cffb83-efd8-43b7-9c39-bf80135be6c8-kube-api-access-zxxqf\") pod \"migrator-64d4d94569-nssxt\" (UID: \"c8cffb83-efd8-43b7-9c39-bf80135be6c8\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-nssxt" Apr 16 14:55:05.141987 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:05.141962 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-nssxt" Apr 16 14:55:05.273504 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:05.273474 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-nssxt"] Apr 16 14:55:05.277017 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:55:05.276991 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8cffb83_efd8_43b7_9c39_bf80135be6c8.slice/crio-71c077ff94be30fe67c0d16937e8cafc62692efe166913a09dd0828aa3eca7d4 WatchSource:0}: Error finding container 71c077ff94be30fe67c0d16937e8cafc62692efe166913a09dd0828aa3eca7d4: Status 404 returned error can't find the container with id 71c077ff94be30fe67c0d16937e8cafc62692efe166913a09dd0828aa3eca7d4 Apr 16 14:55:05.639720 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:05.639692 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-nssxt" event={"ID":"c8cffb83-efd8-43b7-9c39-bf80135be6c8","Type":"ContainerStarted","Data":"71c077ff94be30fe67c0d16937e8cafc62692efe166913a09dd0828aa3eca7d4"} Apr 16 14:55:06.643269 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:06.643234 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-nssxt" event={"ID":"c8cffb83-efd8-43b7-9c39-bf80135be6c8","Type":"ContainerStarted","Data":"56378586692b51b2436ea9103e6389245c59857b9903d1e74fde90986fd353a1"} Apr 16 14:55:06.643694 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:06.643275 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-nssxt" event={"ID":"c8cffb83-efd8-43b7-9c39-bf80135be6c8","Type":"ContainerStarted","Data":"5e6bfd1381a071b69f394c0d3bbe5e5f5ad1c0f26a064e591d9e37d623d560f0"} Apr 16 14:55:06.677597 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:06.677556 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-nssxt" podStartSLOduration=1.597423305 podStartE2EDuration="2.677542267s" podCreationTimestamp="2026-04-16 14:55:04 +0000 UTC" firstStartedPulling="2026-04-16 14:55:05.278769608 +0000 UTC m=+154.775096816" lastFinishedPulling="2026-04-16 14:55:06.358888566 +0000 UTC m=+155.855215778" observedRunningTime="2026-04-16 14:55:06.675975963 +0000 UTC m=+156.172303194" watchObservedRunningTime="2026-04-16 14:55:06.677542267 +0000 UTC m=+156.173869493" Apr 16 14:55:06.780312 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:06.780285 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-z8k77_d26e8a35-54b1-4862-87ad-cab47e12e62d/dns-node-resolver/0.log" Apr 16 14:55:06.861905 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:06.861870 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-8pv4t" podUID="4bcbf238-7f5c-47ac-b42a-0e299ed29df0" Apr 16 14:55:06.867986 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:06.867961 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-jh4sv" podUID="87ff3a2a-3409-4acc-8192-f4db952ccdcf" Apr 16 
14:55:07.183587 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:07.183553 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-6ltjv" podUID="3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e" Apr 16 14:55:07.645609 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:07.645586 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jh4sv" Apr 16 14:55:07.645973 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:07.645609 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8pv4t" Apr 16 14:55:07.780010 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:07.779987 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-t4w4v_6157833b-8b66-48ab-a248-7d79d51cec48/node-ca/0.log" Apr 16 14:55:08.294302 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:08.294227 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b6bb918-ac04-4256-9242-dd810c7e754e-metrics-certs\") pod \"router-default-b9d79f856-c56q2\" (UID: \"5b6bb918-ac04-4256-9242-dd810c7e754e\") " pod="openshift-ingress/router-default-b9d79f856-c56q2" Apr 16 14:55:08.294302 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:08.294288 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b6bb918-ac04-4256-9242-dd810c7e754e-service-ca-bundle\") pod \"router-default-b9d79f856-c56q2\" (UID: \"5b6bb918-ac04-4256-9242-dd810c7e754e\") " pod="openshift-ingress/router-default-b9d79f856-c56q2" Apr 16 14:55:08.294467 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:08.294369 2582 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 14:55:08.294467 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:08.294388 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5b6bb918-ac04-4256-9242-dd810c7e754e-service-ca-bundle podName:5b6bb918-ac04-4256-9242-dd810c7e754e nodeName:}" failed. No retries permitted until 2026-04-16 14:55:16.294372164 +0000 UTC m=+165.790699376 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/5b6bb918-ac04-4256-9242-dd810c7e754e-service-ca-bundle") pod "router-default-b9d79f856-c56q2" (UID: "5b6bb918-ac04-4256-9242-dd810c7e754e") : configmap references non-existent config key: service-ca.crt Apr 16 14:55:08.294467 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:08.294419 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b6bb918-ac04-4256-9242-dd810c7e754e-metrics-certs podName:5b6bb918-ac04-4256-9242-dd810c7e754e nodeName:}" failed. No retries permitted until 2026-04-16 14:55:16.294405535 +0000 UTC m=+165.790732746 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b6bb918-ac04-4256-9242-dd810c7e754e-metrics-certs") pod "router-default-b9d79f856-c56q2" (UID: "5b6bb918-ac04-4256-9242-dd810c7e754e") : secret "router-metrics-certs-default" not found Apr 16 14:55:08.596896 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:08.596810 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bcdd05bc-1066-4ec8-b940-2a95cd94c623-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-5qmss\" (UID: \"bcdd05bc-1066-4ec8-b940-2a95cd94c623\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-5qmss" Apr 16 14:55:08.597027 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:08.596950 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:55:08.597027 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:08.597011 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcdd05bc-1066-4ec8-b940-2a95cd94c623-samples-operator-tls podName:bcdd05bc-1066-4ec8-b940-2a95cd94c623 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:16.596996806 +0000 UTC m=+166.093324014 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/bcdd05bc-1066-4ec8-b940-2a95cd94c623-samples-operator-tls") pod "cluster-samples-operator-667775844f-5qmss" (UID: "bcdd05bc-1066-4ec8-b940-2a95cd94c623") : secret "samples-operator-tls" not found Apr 16 14:55:11.718793 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:11.718757 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert\") pod \"ingress-canary-jh4sv\" (UID: \"87ff3a2a-3409-4acc-8192-f4db952ccdcf\") " pod="openshift-ingress-canary/ingress-canary-jh4sv" Apr 16 14:55:11.719244 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:11.718814 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls\") pod \"dns-default-8pv4t\" (UID: \"4bcbf238-7f5c-47ac-b42a-0e299ed29df0\") " pod="openshift-dns/dns-default-8pv4t" Apr 16 14:55:11.719244 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:11.718929 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:55:11.719244 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:11.718983 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert podName:87ff3a2a-3409-4acc-8192-f4db952ccdcf nodeName:}" failed. No retries permitted until 2026-04-16 14:57:13.718967705 +0000 UTC m=+283.215294911 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert") pod "ingress-canary-jh4sv" (UID: "87ff3a2a-3409-4acc-8192-f4db952ccdcf") : secret "canary-serving-cert" not found Apr 16 14:55:11.719244 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:11.719008 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:55:11.719244 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:11.719081 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls podName:4bcbf238-7f5c-47ac-b42a-0e299ed29df0 nodeName:}" failed. No retries permitted until 2026-04-16 14:57:13.71906019 +0000 UTC m=+283.215387398 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls") pod "dns-default-8pv4t" (UID: "4bcbf238-7f5c-47ac-b42a-0e299ed29df0") : secret "dns-default-metrics-tls" not found Apr 16 14:55:16.352498 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:16.352466 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b6bb918-ac04-4256-9242-dd810c7e754e-service-ca-bundle\") pod \"router-default-b9d79f856-c56q2\" (UID: \"5b6bb918-ac04-4256-9242-dd810c7e754e\") " pod="openshift-ingress/router-default-b9d79f856-c56q2" Apr 16 14:55:16.352896 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:16.352549 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b6bb918-ac04-4256-9242-dd810c7e754e-metrics-certs\") pod \"router-default-b9d79f856-c56q2\" (UID: \"5b6bb918-ac04-4256-9242-dd810c7e754e\") " pod="openshift-ingress/router-default-b9d79f856-c56q2" Apr 16 14:55:16.352896 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:16.352620 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5b6bb918-ac04-4256-9242-dd810c7e754e-service-ca-bundle podName:5b6bb918-ac04-4256-9242-dd810c7e754e nodeName:}" failed. No retries permitted until 2026-04-16 14:55:32.352602008 +0000 UTC m=+181.848929215 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/5b6bb918-ac04-4256-9242-dd810c7e754e-service-ca-bundle") pod "router-default-b9d79f856-c56q2" (UID: "5b6bb918-ac04-4256-9242-dd810c7e754e") : configmap references non-existent config key: service-ca.crt Apr 16 14:55:16.355059 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:16.355038 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b6bb918-ac04-4256-9242-dd810c7e754e-metrics-certs\") pod \"router-default-b9d79f856-c56q2\" (UID: \"5b6bb918-ac04-4256-9242-dd810c7e754e\") " pod="openshift-ingress/router-default-b9d79f856-c56q2" Apr 16 14:55:16.654508 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:16.654481 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bcdd05bc-1066-4ec8-b940-2a95cd94c623-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-5qmss\" (UID: \"bcdd05bc-1066-4ec8-b940-2a95cd94c623\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-5qmss" Apr 16 14:55:16.656849 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:16.656813 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bcdd05bc-1066-4ec8-b940-2a95cd94c623-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-5qmss\" (UID: \"bcdd05bc-1066-4ec8-b940-2a95cd94c623\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-5qmss" Apr 16 14:55:16.955760 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:16.955688 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-5qmss" Apr 16 14:55:17.070192 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:17.070166 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-5qmss"] Apr 16 14:55:17.673536 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:17.673488 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-5qmss" event={"ID":"bcdd05bc-1066-4ec8-b940-2a95cd94c623","Type":"ContainerStarted","Data":"9af2b37bee1f758902869726eaf186af58e32b82d6b23ced2bcbdc30de78e73b"} Apr 16 14:55:18.678691 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:18.678658 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-5qmss" event={"ID":"bcdd05bc-1066-4ec8-b940-2a95cd94c623","Type":"ContainerStarted","Data":"d6f512bfcbd3788c7fe0630b6bbf7075c33f64257a3bcf5d8f36b4b54a88c002"} Apr 16 14:55:18.679200 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:18.678701 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-5qmss" event={"ID":"bcdd05bc-1066-4ec8-b940-2a95cd94c623","Type":"ContainerStarted","Data":"d09b303ef428f8d89326689f146110d6dfb5dfc97c8818d955522dadaf8216fb"} Apr 16 14:55:18.695335 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:18.695292 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-5qmss" podStartSLOduration=17.261780371 podStartE2EDuration="18.695275426s" 
podCreationTimestamp="2026-04-16 14:55:00 +0000 UTC" firstStartedPulling="2026-04-16 14:55:17.112538298 +0000 UTC m=+166.608865504" lastFinishedPulling="2026-04-16 14:55:18.546033156 +0000 UTC m=+168.042360559" observedRunningTime="2026-04-16 14:55:18.694724536 +0000 UTC m=+168.191051763" watchObservedRunningTime="2026-04-16 14:55:18.695275426 +0000 UTC m=+168.191602655" Apr 16 14:55:20.157960 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:20.157928 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:55:26.884035 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:26.884001 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-mh42f"] Apr 16 14:55:26.889153 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:26.889130 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-mh42f" Apr 16 14:55:26.891436 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:26.891418 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 14:55:26.892449 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:26.892433 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 14:55:26.892540 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:26.892491 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-8pzvz\"" Apr 16 14:55:26.902430 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:26.902409 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-mh42f"] Apr 16 14:55:26.929118 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:26.929081 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/141f077c-7b80-4110-8cea-be4a0d2339c7-crio-socket\") pod \"insights-runtime-extractor-mh42f\" (UID: \"141f077c-7b80-4110-8cea-be4a0d2339c7\") " pod="openshift-insights/insights-runtime-extractor-mh42f" Apr 16 14:55:26.929244 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:26.929125 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvgth\" (UniqueName: \"kubernetes.io/projected/141f077c-7b80-4110-8cea-be4a0d2339c7-kube-api-access-kvgth\") pod \"insights-runtime-extractor-mh42f\" (UID: \"141f077c-7b80-4110-8cea-be4a0d2339c7\") " pod="openshift-insights/insights-runtime-extractor-mh42f" Apr 16 14:55:26.929244 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:26.929208 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/141f077c-7b80-4110-8cea-be4a0d2339c7-data-volume\") pod \"insights-runtime-extractor-mh42f\" (UID: \"141f077c-7b80-4110-8cea-be4a0d2339c7\") " pod="openshift-insights/insights-runtime-extractor-mh42f" Apr 16 14:55:26.929416 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:26.929239 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/141f077c-7b80-4110-8cea-be4a0d2339c7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mh42f\" (UID: 
\"141f077c-7b80-4110-8cea-be4a0d2339c7\") " pod="openshift-insights/insights-runtime-extractor-mh42f" Apr 16 14:55:26.929416 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:26.929286 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/141f077c-7b80-4110-8cea-be4a0d2339c7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mh42f\" (UID: \"141f077c-7b80-4110-8cea-be4a0d2339c7\") " pod="openshift-insights/insights-runtime-extractor-mh42f" Apr 16 14:55:26.939559 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:26.939535 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-2brqv"] Apr 16 14:55:26.942706 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:26.942689 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-2brqv" Apr 16 14:55:26.945216 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:26.945194 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 14:55:26.945312 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:26.945271 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 14:55:26.945414 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:26.945391 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-2dqsx\"" Apr 16 14:55:26.954911 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:26.954892 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-2brqv"] Apr 16 14:55:27.030036 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:27.030010 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/141f077c-7b80-4110-8cea-be4a0d2339c7-crio-socket\") pod \"insights-runtime-extractor-mh42f\" (UID: \"141f077c-7b80-4110-8cea-be4a0d2339c7\") " pod="openshift-insights/insights-runtime-extractor-mh42f" Apr 16 14:55:27.030165 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:27.030044 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvgth\" (UniqueName: \"kubernetes.io/projected/141f077c-7b80-4110-8cea-be4a0d2339c7-kube-api-access-kvgth\") pod \"insights-runtime-extractor-mh42f\" (UID: \"141f077c-7b80-4110-8cea-be4a0d2339c7\") " pod="openshift-insights/insights-runtime-extractor-mh42f" Apr 16 14:55:27.030165 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:27.030116 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5pt6\" (UniqueName: \"kubernetes.io/projected/20a533bc-cfb1-4fba-afea-c9f0e370d07c-kube-api-access-s5pt6\") pod \"downloads-586b57c7b4-2brqv\" (UID: \"20a533bc-cfb1-4fba-afea-c9f0e370d07c\") " pod="openshift-console/downloads-586b57c7b4-2brqv" Apr 16 14:55:27.030165 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:27.030117 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/141f077c-7b80-4110-8cea-be4a0d2339c7-crio-socket\") pod \"insights-runtime-extractor-mh42f\" (UID: \"141f077c-7b80-4110-8cea-be4a0d2339c7\") " pod="openshift-insights/insights-runtime-extractor-mh42f" Apr 16 14:55:27.030291 ip-10-0-139-101 
kubenswrapper[2582]: I0416 14:55:27.030143 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/141f077c-7b80-4110-8cea-be4a0d2339c7-data-volume\") pod \"insights-runtime-extractor-mh42f\" (UID: \"141f077c-7b80-4110-8cea-be4a0d2339c7\") " pod="openshift-insights/insights-runtime-extractor-mh42f" Apr 16 14:55:27.030291 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:27.030261 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/141f077c-7b80-4110-8cea-be4a0d2339c7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mh42f\" (UID: \"141f077c-7b80-4110-8cea-be4a0d2339c7\") " pod="openshift-insights/insights-runtime-extractor-mh42f" Apr 16 14:55:27.030389 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:27.030307 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/141f077c-7b80-4110-8cea-be4a0d2339c7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mh42f\" (UID: \"141f077c-7b80-4110-8cea-be4a0d2339c7\") " pod="openshift-insights/insights-runtime-extractor-mh42f" Apr 16 14:55:27.030526 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:27.030435 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/141f077c-7b80-4110-8cea-be4a0d2339c7-data-volume\") pod \"insights-runtime-extractor-mh42f\" (UID: \"141f077c-7b80-4110-8cea-be4a0d2339c7\") " pod="openshift-insights/insights-runtime-extractor-mh42f" Apr 16 14:55:27.030765 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:27.030745 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/141f077c-7b80-4110-8cea-be4a0d2339c7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mh42f\" (UID: \"141f077c-7b80-4110-8cea-be4a0d2339c7\") " pod="openshift-insights/insights-runtime-extractor-mh42f" Apr 16 14:55:27.032749 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:27.032728 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/141f077c-7b80-4110-8cea-be4a0d2339c7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mh42f\" (UID: \"141f077c-7b80-4110-8cea-be4a0d2339c7\") " pod="openshift-insights/insights-runtime-extractor-mh42f" Apr 16 14:55:27.039667 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:27.039650 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvgth\" (UniqueName: \"kubernetes.io/projected/141f077c-7b80-4110-8cea-be4a0d2339c7-kube-api-access-kvgth\") pod \"insights-runtime-extractor-mh42f\" (UID: \"141f077c-7b80-4110-8cea-be4a0d2339c7\") " pod="openshift-insights/insights-runtime-extractor-mh42f" Apr 16 14:55:27.130756 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:27.130734 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5pt6\" (UniqueName: \"kubernetes.io/projected/20a533bc-cfb1-4fba-afea-c9f0e370d07c-kube-api-access-s5pt6\") pod \"downloads-586b57c7b4-2brqv\" (UID: \"20a533bc-cfb1-4fba-afea-c9f0e370d07c\") " pod="openshift-console/downloads-586b57c7b4-2brqv" Apr 16 14:55:27.138294 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:27.138228 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-s5pt6\" (UniqueName: \"kubernetes.io/projected/20a533bc-cfb1-4fba-afea-c9f0e370d07c-kube-api-access-s5pt6\") pod \"downloads-586b57c7b4-2brqv\" (UID: \"20a533bc-cfb1-4fba-afea-c9f0e370d07c\") " pod="openshift-console/downloads-586b57c7b4-2brqv" Apr 16 14:55:27.197869 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:27.197851 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-mh42f" Apr 16 14:55:27.250966 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:27.250940 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-2brqv" Apr 16 14:55:27.316995 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:27.316966 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-mh42f"] Apr 16 14:55:27.321321 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:55:27.321289 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod141f077c_7b80_4110_8cea_be4a0d2339c7.slice/crio-2e32c3ed8af915230d310600034399186ca7c6e6e422e07c50a666143bb7530d WatchSource:0}: Error finding container 2e32c3ed8af915230d310600034399186ca7c6e6e422e07c50a666143bb7530d: Status 404 returned error can't find the container with id 2e32c3ed8af915230d310600034399186ca7c6e6e422e07c50a666143bb7530d Apr 16 14:55:27.377408 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:27.377384 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-2brqv"] Apr 16 14:55:27.380207 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:55:27.380181 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20a533bc_cfb1_4fba_afea_c9f0e370d07c.slice/crio-2dc6ba1115f13d72aec2457a31d76b016e1a7e2a3a225c2feff6003b7efcfb99 WatchSource:0}: Error finding container 2dc6ba1115f13d72aec2457a31d76b016e1a7e2a3a225c2feff6003b7efcfb99: Status 404 returned error can't find the container with id 2dc6ba1115f13d72aec2457a31d76b016e1a7e2a3a225c2feff6003b7efcfb99 Apr 16 14:55:27.701948 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:27.701915 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-2brqv" event={"ID":"20a533bc-cfb1-4fba-afea-c9f0e370d07c","Type":"ContainerStarted","Data":"2dc6ba1115f13d72aec2457a31d76b016e1a7e2a3a225c2feff6003b7efcfb99"} Apr 16 14:55:27.703433 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:27.703406 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mh42f" event={"ID":"141f077c-7b80-4110-8cea-be4a0d2339c7","Type":"ContainerStarted","Data":"987cf903cbb8ee9ad33dd26f88ce830af292c1d17d5d310a07af888b1c2a8cf5"} Apr 16 14:55:27.703433 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:27.703436 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mh42f" event={"ID":"141f077c-7b80-4110-8cea-be4a0d2339c7","Type":"ContainerStarted","Data":"2e32c3ed8af915230d310600034399186ca7c6e6e422e07c50a666143bb7530d"} Apr 16 14:55:28.709403 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:28.709359 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mh42f" 
event={"ID":"141f077c-7b80-4110-8cea-be4a0d2339c7","Type":"ContainerStarted","Data":"129217cb912e0ca9727f25b12b8e60da348e3f5478d4f79b4fa6ca67fa7d895c"} Apr 16 14:55:29.714844 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:29.714794 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mh42f" event={"ID":"141f077c-7b80-4110-8cea-be4a0d2339c7","Type":"ContainerStarted","Data":"bb7d7366a8f1324fa2264eb2af07a8b45c12982f842e4a0103fe5f7e8c4fa3d3"} Apr 16 14:55:29.733498 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:29.733443 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-mh42f" podStartSLOduration=1.625645054 podStartE2EDuration="3.733424523s" podCreationTimestamp="2026-04-16 14:55:26 +0000 UTC" firstStartedPulling="2026-04-16 14:55:27.379985358 +0000 UTC m=+176.876312572" lastFinishedPulling="2026-04-16 14:55:29.487764828 +0000 UTC m=+178.984092041" observedRunningTime="2026-04-16 14:55:29.732925719 +0000 UTC m=+179.229252952" watchObservedRunningTime="2026-04-16 14:55:29.733424523 +0000 UTC m=+179.229751753" Apr 16 14:55:32.370985 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:32.370943 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b6bb918-ac04-4256-9242-dd810c7e754e-service-ca-bundle\") pod \"router-default-b9d79f856-c56q2\" (UID: \"5b6bb918-ac04-4256-9242-dd810c7e754e\") " pod="openshift-ingress/router-default-b9d79f856-c56q2" Apr 16 14:55:32.371627 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:32.371603 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b6bb918-ac04-4256-9242-dd810c7e754e-service-ca-bundle\") pod \"router-default-b9d79f856-c56q2\" (UID: \"5b6bb918-ac04-4256-9242-dd810c7e754e\") " pod="openshift-ingress/router-default-b9d79f856-c56q2" Apr 16 14:55:32.653661 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:32.653631 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-46hmh\"" Apr 16 14:55:32.661568 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:32.661539 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-b9d79f856-c56q2" Apr 16 14:55:32.788884 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:32.788856 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-b9d79f856-c56q2"] Apr 16 14:55:32.792151 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:55:32.792113 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b6bb918_ac04_4256_9242_dd810c7e754e.slice/crio-a5a7dce273348ddde5d83abdb437b91cb256cc40df55bc502c1041e85694aac8 WatchSource:0}: Error finding container a5a7dce273348ddde5d83abdb437b91cb256cc40df55bc502c1041e85694aac8: Status 404 returned error can't find the container with id a5a7dce273348ddde5d83abdb437b91cb256cc40df55bc502c1041e85694aac8 Apr 16 14:55:33.259249 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.259219 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-74cb84876-5gw67"] Apr 16 14:55:33.262474 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.262450 2582 util.go:30] "No sandbox for pod can be found. 
Apr 16 14:55:33.268683 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.268485 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-8vlfx\""
Apr 16 14:55:33.268683 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.268539 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 14:55:33.269562 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.269536 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 14:55:33.269680 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.269539 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 14:55:33.269900 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.269736 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 14:55:33.269900 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.269752 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 14:55:33.275160 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.275139 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74cb84876-5gw67"]
Apr 16 14:55:33.380801 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.380768 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-console-config\") pod \"console-74cb84876-5gw67\" (UID: \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\") " pod="openshift-console/console-74cb84876-5gw67"
Apr 16 14:55:33.381210 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.380816 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-console-serving-cert\") pod \"console-74cb84876-5gw67\" (UID: \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\") " pod="openshift-console/console-74cb84876-5gw67"
Apr 16 14:55:33.381210 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.380950 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpwdt\" (UniqueName: \"kubernetes.io/projected/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-kube-api-access-fpwdt\") pod \"console-74cb84876-5gw67\" (UID: \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\") " pod="openshift-console/console-74cb84876-5gw67"
Apr 16 14:55:33.381210 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.381024 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-oauth-serving-cert\") pod \"console-74cb84876-5gw67\" (UID: \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\") " pod="openshift-console/console-74cb84876-5gw67"
Apr 16 14:55:33.381210 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.381057 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-console-oauth-config\") pod \"console-74cb84876-5gw67\" (UID: \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\") " pod="openshift-console/console-74cb84876-5gw67"
Apr 16 14:55:33.381210 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.381078 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-service-ca\") pod \"console-74cb84876-5gw67\" (UID: \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\") " pod="openshift-console/console-74cb84876-5gw67"
Apr 16 14:55:33.481672 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.481634 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-console-oauth-config\") pod \"console-74cb84876-5gw67\" (UID: \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\") " pod="openshift-console/console-74cb84876-5gw67"
Apr 16 14:55:33.481889 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.481681 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-service-ca\") pod \"console-74cb84876-5gw67\" (UID: \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\") " pod="openshift-console/console-74cb84876-5gw67"
Apr 16 14:55:33.481889 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.481715 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-console-config\") pod \"console-74cb84876-5gw67\" (UID: \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\") " pod="openshift-console/console-74cb84876-5gw67"
Apr 16 14:55:33.481889 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.481745 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-console-serving-cert\") pod \"console-74cb84876-5gw67\" (UID: \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\") " pod="openshift-console/console-74cb84876-5gw67"
Apr 16 14:55:33.481889 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.481870 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpwdt\" (UniqueName: \"kubernetes.io/projected/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-kube-api-access-fpwdt\") pod \"console-74cb84876-5gw67\" (UID: \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\") " pod="openshift-console/console-74cb84876-5gw67"
Apr 16 14:55:33.482110 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.481958 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-oauth-serving-cert\") pod \"console-74cb84876-5gw67\" (UID: \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\") " pod="openshift-console/console-74cb84876-5gw67"
Apr 16 14:55:33.482722 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.482696 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-service-ca\") pod \"console-74cb84876-5gw67\" (UID: \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\") " pod="openshift-console/console-74cb84876-5gw67"
Apr 16 14:55:33.482868 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.482851 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-oauth-serving-cert\") pod \"console-74cb84876-5gw67\" (UID: \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\") " pod="openshift-console/console-74cb84876-5gw67"
Apr 16 14:55:33.482982 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.482961 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-console-config\") pod \"console-74cb84876-5gw67\" (UID: \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\") " pod="openshift-console/console-74cb84876-5gw67"
Apr 16 14:55:33.484601 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.484580 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-console-serving-cert\") pod \"console-74cb84876-5gw67\" (UID: \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\") " pod="openshift-console/console-74cb84876-5gw67"
Apr 16 14:55:33.484692 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.484653 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-console-oauth-config\") pod \"console-74cb84876-5gw67\" (UID: \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\") " pod="openshift-console/console-74cb84876-5gw67"
Apr 16 14:55:33.504957 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.504937 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpwdt\" (UniqueName: \"kubernetes.io/projected/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-kube-api-access-fpwdt\") pod \"console-74cb84876-5gw67\" (UID: \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\") " pod="openshift-console/console-74cb84876-5gw67"
Apr 16 14:55:33.573681 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.573606 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74cb84876-5gw67"
Apr 16 14:55:33.713868 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.713838 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74cb84876-5gw67"]
Apr 16 14:55:33.717497 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:55:33.717466 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f72564b_5bc1_45cd_a160_ee6db0d5d01e.slice/crio-0089cd630f89b3e5a957567af768edfe8b4978c1f1a5a3ae8fe93656fc86b7ae WatchSource:0}: Error finding container 0089cd630f89b3e5a957567af768edfe8b4978c1f1a5a3ae8fe93656fc86b7ae: Status 404 returned error can't find the container with id 0089cd630f89b3e5a957567af768edfe8b4978c1f1a5a3ae8fe93656fc86b7ae
Apr 16 14:55:33.726964 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.726932 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74cb84876-5gw67" event={"ID":"5f72564b-5bc1-45cd-a160-ee6db0d5d01e","Type":"ContainerStarted","Data":"0089cd630f89b3e5a957567af768edfe8b4978c1f1a5a3ae8fe93656fc86b7ae"}
Apr 16 14:55:33.728328 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.728293 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-b9d79f856-c56q2" event={"ID":"5b6bb918-ac04-4256-9242-dd810c7e754e","Type":"ContainerStarted","Data":"2df59ee1ee4ee2cd9e202c1ceef5daa337b9b5b03df4800a9b1509455a5d1549"}
Apr 16 14:55:33.728417 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.728330 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-b9d79f856-c56q2" event={"ID":"5b6bb918-ac04-4256-9242-dd810c7e754e","Type":"ContainerStarted","Data":"a5a7dce273348ddde5d83abdb437b91cb256cc40df55bc502c1041e85694aac8"}
Apr 16 14:55:33.751878 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:33.751808 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-b9d79f856-c56q2" podStartSLOduration=33.751795005 podStartE2EDuration="33.751795005s" podCreationTimestamp="2026-04-16 14:55:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:55:33.75045451 +0000 UTC m=+183.246781748" watchObservedRunningTime="2026-04-16 14:55:33.751795005 +0000 UTC m=+183.248122234"
Apr 16 14:55:34.661917 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:34.661877 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-b9d79f856-c56q2"
Apr 16 14:55:34.664883 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:34.664858 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-b9d79f856-c56q2"
Apr 16 14:55:34.731503 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:34.731474 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-b9d79f856-c56q2"
Apr 16 14:55:34.732947 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:34.732925 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-b9d79f856-c56q2"
Apr 16 14:55:34.811688 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:34.810719 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-6j5d5"]
Apr 16 14:55:34.815789 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:34.815766 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-6j5d5"
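The router's probe records above flip from startup/unhealthy to startup/started, and then from readiness/not ready to readiness/ready, all within about a second: readiness results only begin to matter once the startup probe has passed. A minimal sketch of that two-stage gating against a hypothetical endpoint (the URL, retry cap, and interval are assumptions, not values from this node's pod specs):

// Sketch of startup-then-readiness probing; not kubelet's prober code.
// The endpoint below is an assumed stand-in for a router health check.
package main

import (
	"fmt"
	"net/http"
	"time"
)

func probe(url string) bool {
	resp, err := http.Get(url)
	if err != nil {
		return false
	}
	defer resp.Body.Close()
	return resp.StatusCode == http.StatusOK
}

func main() {
	const url = "http://localhost:1936/healthz/ready" // assumed endpoint

	// Startup probe: retry until the container reports started (bounded here).
	started := false
	for i := 0; i < 30; i++ {
		if probe(url) {
			started = true
			break
		}
		fmt.Println(`probe="startup" status="unhealthy"`) // as logged above
		time.Sleep(time.Second)
	}
	if !started {
		fmt.Println("startup failure threshold reached; container would be restarted")
		return
	}
	fmt.Println(`probe="startup" status="started"`)

	// Only now does readiness begin to gate Service endpoints.
	if probe(url) {
		fmt.Println(`probe="readiness" status="ready"`)
	} else {
		fmt.Println(`probe="readiness" status="not ready"`)
	}
}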
Apr 16 14:55:34.818526 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:34.818505 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 16 14:55:34.818622 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:34.818594 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-8k4jb\""
Apr 16 14:55:34.823355 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:34.823330 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-6j5d5"]
Apr 16 14:55:34.893477 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:34.893449 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9945fe1b-a2b6-466f-87ce-1506f00c87fe-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-6j5d5\" (UID: \"9945fe1b-a2b6-466f-87ce-1506f00c87fe\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-6j5d5"
Apr 16 14:55:34.993982 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:34.993903 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9945fe1b-a2b6-466f-87ce-1506f00c87fe-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-6j5d5\" (UID: \"9945fe1b-a2b6-466f-87ce-1506f00c87fe\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-6j5d5"
Apr 16 14:55:34.994100 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:34.994026 2582 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 16 14:55:34.994161 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:34.994110 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9945fe1b-a2b6-466f-87ce-1506f00c87fe-tls-certificates podName:9945fe1b-a2b6-466f-87ce-1506f00c87fe nodeName:}" failed. No retries permitted until 2026-04-16 14:55:35.49408838 +0000 UTC m=+184.990415596 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/9945fe1b-a2b6-466f-87ce-1506f00c87fe-tls-certificates") pod "prometheus-operator-admission-webhook-9cb97cd87-6j5d5" (UID: "9945fe1b-a2b6-466f-87ce-1506f00c87fe") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 16 14:55:35.498246 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:35.498209 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9945fe1b-a2b6-466f-87ce-1506f00c87fe-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-6j5d5\" (UID: \"9945fe1b-a2b6-466f-87ce-1506f00c87fe\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-6j5d5"
Apr 16 14:55:35.501370 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:35.501339 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9945fe1b-a2b6-466f-87ce-1506f00c87fe-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-6j5d5\" (UID: \"9945fe1b-a2b6-466f-87ce-1506f00c87fe\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-6j5d5"
Apr 16 14:55:35.727615 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:35.727581 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-6j5d5"
Apr 16 14:55:39.233543 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:39.233506 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-85f85cf4b6-7vnjp"]
Apr 16 14:55:39.237902 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:39.237875 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85f85cf4b6-7vnjp"
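The secret.go / nestedpendingoperations.go failures above show the kubelet racing the controller that creates prometheus-operator-admission-webhook-tls: SetUp fails while the secret is absent, the operation is parked with "No retries permitted until ... (durationBeforeRetry 500ms)", and the retry half a second later succeeds. Only the 500ms base delay is visible in this log; on repeated failures the delay grows. A minimal sketch of the pattern (the doubling factor and cap are assumptions, not kubelet's exact constants):

// Sketch of retry-with-backoff for a mount that fails because its secret
// is missing. The 500ms base matches "durationBeforeRetry 500ms" above;
// factor and cap here are assumptions.
package main

import (
	"errors"
	"fmt"
	"time"
)

var errNotFound = errors.New(`secret "prometheus-operator-admission-webhook-tls" not found`)

// setUp stands in for MountVolume.SetUp; it succeeds once the secret exists.
func setUp(secretExists bool) error {
	if !secretExists {
		return errNotFound
	}
	return nil
}

func main() {
	delay := 500 * time.Millisecond // base delay, as logged
	const maxDelay = 2 * time.Minute

	for attempt := 1; ; attempt++ {
		// Pretend the secret's controller has created it by attempt 3.
		if err := setUp(attempt >= 3); err != nil {
			fmt.Printf("attempt %d failed: %v; no retries permitted for %v\n",
				attempt, err, delay)
			time.Sleep(delay)
			if delay *= 2; delay > maxDelay {
				delay = maxDelay
			}
			continue
		}
		fmt.Printf("attempt %d: MountVolume.SetUp succeeded\n", attempt)
		return
	}
}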
Need to start a new one" pod="openshift-console/console-85f85cf4b6-7vnjp" Apr 16 14:55:39.248123 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:39.248101 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 14:55:39.248394 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:39.248375 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85f85cf4b6-7vnjp"] Apr 16 14:55:39.329769 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:39.329733 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-console-serving-cert\") pod \"console-85f85cf4b6-7vnjp\" (UID: \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\") " pod="openshift-console/console-85f85cf4b6-7vnjp" Apr 16 14:55:39.329769 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:39.329771 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-oauth-serving-cert\") pod \"console-85f85cf4b6-7vnjp\" (UID: \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\") " pod="openshift-console/console-85f85cf4b6-7vnjp" Apr 16 14:55:39.330014 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:39.329881 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-trusted-ca-bundle\") pod \"console-85f85cf4b6-7vnjp\" (UID: \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\") " pod="openshift-console/console-85f85cf4b6-7vnjp" Apr 16 14:55:39.330014 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:39.329909 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-console-config\") pod \"console-85f85cf4b6-7vnjp\" (UID: \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\") " pod="openshift-console/console-85f85cf4b6-7vnjp" Apr 16 14:55:39.330014 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:39.329937 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-console-oauth-config\") pod \"console-85f85cf4b6-7vnjp\" (UID: \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\") " pod="openshift-console/console-85f85cf4b6-7vnjp" Apr 16 14:55:39.330143 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:39.330047 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-service-ca\") pod \"console-85f85cf4b6-7vnjp\" (UID: \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\") " pod="openshift-console/console-85f85cf4b6-7vnjp" Apr 16 14:55:39.330143 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:39.330089 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qm6d\" (UniqueName: \"kubernetes.io/projected/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-kube-api-access-6qm6d\") pod \"console-85f85cf4b6-7vnjp\" (UID: \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\") " pod="openshift-console/console-85f85cf4b6-7vnjp" Apr 16 14:55:39.430646 ip-10-0-139-101 
kubenswrapper[2582]: I0416 14:55:39.430613 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-console-serving-cert\") pod \"console-85f85cf4b6-7vnjp\" (UID: \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\") " pod="openshift-console/console-85f85cf4b6-7vnjp" Apr 16 14:55:39.430815 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:39.430651 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-oauth-serving-cert\") pod \"console-85f85cf4b6-7vnjp\" (UID: \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\") " pod="openshift-console/console-85f85cf4b6-7vnjp" Apr 16 14:55:39.430906 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:39.430809 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-trusted-ca-bundle\") pod \"console-85f85cf4b6-7vnjp\" (UID: \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\") " pod="openshift-console/console-85f85cf4b6-7vnjp" Apr 16 14:55:39.430906 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:39.430897 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-console-config\") pod \"console-85f85cf4b6-7vnjp\" (UID: \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\") " pod="openshift-console/console-85f85cf4b6-7vnjp" Apr 16 14:55:39.431016 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:39.430940 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-console-oauth-config\") pod \"console-85f85cf4b6-7vnjp\" (UID: \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\") " pod="openshift-console/console-85f85cf4b6-7vnjp" Apr 16 14:55:39.431016 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:39.430988 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-service-ca\") pod \"console-85f85cf4b6-7vnjp\" (UID: \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\") " pod="openshift-console/console-85f85cf4b6-7vnjp" Apr 16 14:55:39.431091 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:39.431016 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6qm6d\" (UniqueName: \"kubernetes.io/projected/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-kube-api-access-6qm6d\") pod \"console-85f85cf4b6-7vnjp\" (UID: \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\") " pod="openshift-console/console-85f85cf4b6-7vnjp" Apr 16 14:55:39.431473 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:39.431417 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-oauth-serving-cert\") pod \"console-85f85cf4b6-7vnjp\" (UID: \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\") " pod="openshift-console/console-85f85cf4b6-7vnjp" Apr 16 14:55:39.431645 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:39.431621 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-trusted-ca-bundle\") pod 
\"console-85f85cf4b6-7vnjp\" (UID: \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\") " pod="openshift-console/console-85f85cf4b6-7vnjp" Apr 16 14:55:39.431731 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:39.431626 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-console-config\") pod \"console-85f85cf4b6-7vnjp\" (UID: \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\") " pod="openshift-console/console-85f85cf4b6-7vnjp" Apr 16 14:55:39.431984 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:39.431759 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-service-ca\") pod \"console-85f85cf4b6-7vnjp\" (UID: \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\") " pod="openshift-console/console-85f85cf4b6-7vnjp" Apr 16 14:55:39.433791 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:39.433759 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-console-oauth-config\") pod \"console-85f85cf4b6-7vnjp\" (UID: \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\") " pod="openshift-console/console-85f85cf4b6-7vnjp" Apr 16 14:55:39.434137 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:39.434115 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-console-serving-cert\") pod \"console-85f85cf4b6-7vnjp\" (UID: \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\") " pod="openshift-console/console-85f85cf4b6-7vnjp" Apr 16 14:55:39.439433 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:39.439392 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qm6d\" (UniqueName: \"kubernetes.io/projected/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-kube-api-access-6qm6d\") pod \"console-85f85cf4b6-7vnjp\" (UID: \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\") " pod="openshift-console/console-85f85cf4b6-7vnjp" Apr 16 14:55:39.550920 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:39.550844 2582 util.go:30] "No sandbox for pod can be found. 
Apr 16 14:55:43.749321 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:43.749050 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-6j5d5"]
Apr 16 14:55:43.753837 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:55:43.753789 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9945fe1b_a2b6_466f_87ce_1506f00c87fe.slice/crio-5e2c430069da92f91bdf227bf61b386e33e86909d7d5a2701ebc35a78287903d WatchSource:0}: Error finding container 5e2c430069da92f91bdf227bf61b386e33e86909d7d5a2701ebc35a78287903d: Status 404 returned error can't find the container with id 5e2c430069da92f91bdf227bf61b386e33e86909d7d5a2701ebc35a78287903d
Apr 16 14:55:43.759269 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:43.759244 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-6j5d5" event={"ID":"9945fe1b-a2b6-466f-87ce-1506f00c87fe","Type":"ContainerStarted","Data":"5e2c430069da92f91bdf227bf61b386e33e86909d7d5a2701ebc35a78287903d"}
Apr 16 14:55:43.760761 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:43.760738 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74cb84876-5gw67" event={"ID":"5f72564b-5bc1-45cd-a160-ee6db0d5d01e","Type":"ContainerStarted","Data":"d96a30e8018c16b909049ce3fa7bac9a71d2e85b9faf678975ae73405cad4907"}
Apr 16 14:55:43.772980 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:43.772959 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85f85cf4b6-7vnjp"]
Apr 16 14:55:43.776131 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:55:43.776112 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9a1ac58_5e77_4d51_aa92_a72d754ecbc0.slice/crio-e452b1414bcdae817815b91b3c9e98dfe2d385abe440ea0b38bbe7c85b9f5657 WatchSource:0}: Error finding container e452b1414bcdae817815b91b3c9e98dfe2d385abe440ea0b38bbe7c85b9f5657: Status 404 returned error can't find the container with id e452b1414bcdae817815b91b3c9e98dfe2d385abe440ea0b38bbe7c85b9f5657
Apr 16 14:55:43.782725 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:43.782673 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-74cb84876-5gw67" podStartSLOduration=0.881086837 podStartE2EDuration="10.782656121s" podCreationTimestamp="2026-04-16 14:55:33 +0000 UTC" firstStartedPulling="2026-04-16 14:55:33.719165447 +0000 UTC m=+183.215492660" lastFinishedPulling="2026-04-16 14:55:43.620734722 +0000 UTC m=+193.117061944" observedRunningTime="2026-04-16 14:55:43.781801496 +0000 UTC m=+193.278128728" watchObservedRunningTime="2026-04-16 14:55:43.782656121 +0000 UTC m=+193.278983352"
Apr 16 14:55:44.765598 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:44.765564 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85f85cf4b6-7vnjp" event={"ID":"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0","Type":"ContainerStarted","Data":"f003c384619a94ed334a68f4b2f685e2ef9a3e718dabbb072cf45450b955a9dc"}
Apr 16 14:55:44.765957 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:44.765605 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85f85cf4b6-7vnjp" event={"ID":"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0","Type":"ContainerStarted","Data":"e452b1414bcdae817815b91b3c9e98dfe2d385abe440ea0b38bbe7c85b9f5657"}
event={"ID":"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0","Type":"ContainerStarted","Data":"e452b1414bcdae817815b91b3c9e98dfe2d385abe440ea0b38bbe7c85b9f5657"} Apr 16 14:55:44.767070 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:44.767040 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-2brqv" event={"ID":"20a533bc-cfb1-4fba-afea-c9f0e370d07c","Type":"ContainerStarted","Data":"8baefb44b7f4680ffed8caf4e6113e2ae33232781bcdd6bafa78382469ff3fb3"} Apr 16 14:55:44.782261 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:44.782211 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-85f85cf4b6-7vnjp" podStartSLOduration=5.782196024 podStartE2EDuration="5.782196024s" podCreationTimestamp="2026-04-16 14:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:55:44.781719983 +0000 UTC m=+194.278047213" watchObservedRunningTime="2026-04-16 14:55:44.782196024 +0000 UTC m=+194.278523253" Apr 16 14:55:44.800433 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:44.800379 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-2brqv" podStartSLOduration=2.445193755 podStartE2EDuration="18.800365481s" podCreationTimestamp="2026-04-16 14:55:26 +0000 UTC" firstStartedPulling="2026-04-16 14:55:27.382381013 +0000 UTC m=+176.878708221" lastFinishedPulling="2026-04-16 14:55:43.737552723 +0000 UTC m=+193.233879947" observedRunningTime="2026-04-16 14:55:44.798781615 +0000 UTC m=+194.295108856" watchObservedRunningTime="2026-04-16 14:55:44.800365481 +0000 UTC m=+194.296692710" Apr 16 14:55:45.771773 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:45.771723 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-6j5d5" event={"ID":"9945fe1b-a2b6-466f-87ce-1506f00c87fe","Type":"ContainerStarted","Data":"a7d4cbb2d5438ecfebcb3d0ba580c62c5f7ca2a9122644a3832818d13b315c44"} Apr 16 14:55:45.772604 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:45.772573 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-2brqv" Apr 16 14:55:45.789473 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:45.789417 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-6j5d5" podStartSLOduration=10.82798663 podStartE2EDuration="11.789398999s" podCreationTimestamp="2026-04-16 14:55:34 +0000 UTC" firstStartedPulling="2026-04-16 14:55:43.756116222 +0000 UTC m=+193.252443429" lastFinishedPulling="2026-04-16 14:55:44.717528587 +0000 UTC m=+194.213855798" observedRunningTime="2026-04-16 14:55:45.787861011 +0000 UTC m=+195.284188239" watchObservedRunningTime="2026-04-16 14:55:45.789398999 +0000 UTC m=+195.285726233" Apr 16 14:55:45.797364 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:45.797341 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-2brqv" Apr 16 14:55:46.774730 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:46.774693 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-6j5d5" Apr 16 14:55:46.780014 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:46.779985 2582 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-6j5d5" Apr 16 14:55:47.895581 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:47.895540 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-4jngz"] Apr 16 14:55:47.912819 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:47.912791 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-4jngz"] Apr 16 14:55:47.912985 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:47.912926 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-4jngz" Apr 16 14:55:47.915493 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:47.915456 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 14:55:47.915493 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:47.915465 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 14:55:47.915677 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:47.915598 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 14:55:47.916713 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:47.916522 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-qcks9\"" Apr 16 14:55:47.916799 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:47.916719 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 14:55:47.917031 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:47.917015 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 14:55:48.005089 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:48.005059 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/0950cb6b-a1b8-4cad-bc60-0314e46475f0-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-4jngz\" (UID: \"0950cb6b-a1b8-4cad-bc60-0314e46475f0\") " pod="openshift-monitoring/prometheus-operator-78f957474d-4jngz" Apr 16 14:55:48.005205 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:48.005109 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf9lm\" (UniqueName: \"kubernetes.io/projected/0950cb6b-a1b8-4cad-bc60-0314e46475f0-kube-api-access-cf9lm\") pod \"prometheus-operator-78f957474d-4jngz\" (UID: \"0950cb6b-a1b8-4cad-bc60-0314e46475f0\") " pod="openshift-monitoring/prometheus-operator-78f957474d-4jngz" Apr 16 14:55:48.005205 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:48.005141 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0950cb6b-a1b8-4cad-bc60-0314e46475f0-metrics-client-ca\") pod \"prometheus-operator-78f957474d-4jngz\" (UID: \"0950cb6b-a1b8-4cad-bc60-0314e46475f0\") " pod="openshift-monitoring/prometheus-operator-78f957474d-4jngz" Apr 16 14:55:48.005205 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:48.005182 2582 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0950cb6b-a1b8-4cad-bc60-0314e46475f0-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-4jngz\" (UID: \"0950cb6b-a1b8-4cad-bc60-0314e46475f0\") " pod="openshift-monitoring/prometheus-operator-78f957474d-4jngz" Apr 16 14:55:48.106192 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:48.106166 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/0950cb6b-a1b8-4cad-bc60-0314e46475f0-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-4jngz\" (UID: \"0950cb6b-a1b8-4cad-bc60-0314e46475f0\") " pod="openshift-monitoring/prometheus-operator-78f957474d-4jngz" Apr 16 14:55:48.106322 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:48.106213 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cf9lm\" (UniqueName: \"kubernetes.io/projected/0950cb6b-a1b8-4cad-bc60-0314e46475f0-kube-api-access-cf9lm\") pod \"prometheus-operator-78f957474d-4jngz\" (UID: \"0950cb6b-a1b8-4cad-bc60-0314e46475f0\") " pod="openshift-monitoring/prometheus-operator-78f957474d-4jngz" Apr 16 14:55:48.106322 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:48.106237 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0950cb6b-a1b8-4cad-bc60-0314e46475f0-metrics-client-ca\") pod \"prometheus-operator-78f957474d-4jngz\" (UID: \"0950cb6b-a1b8-4cad-bc60-0314e46475f0\") " pod="openshift-monitoring/prometheus-operator-78f957474d-4jngz" Apr 16 14:55:48.106413 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:48.106316 2582 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 16 14:55:48.106413 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:48.106389 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0950cb6b-a1b8-4cad-bc60-0314e46475f0-prometheus-operator-tls podName:0950cb6b-a1b8-4cad-bc60-0314e46475f0 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:48.606368153 +0000 UTC m=+198.102695373 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/0950cb6b-a1b8-4cad-bc60-0314e46475f0-prometheus-operator-tls") pod "prometheus-operator-78f957474d-4jngz" (UID: "0950cb6b-a1b8-4cad-bc60-0314e46475f0") : secret "prometheus-operator-tls" not found Apr 16 14:55:48.106480 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:48.106436 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0950cb6b-a1b8-4cad-bc60-0314e46475f0-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-4jngz\" (UID: \"0950cb6b-a1b8-4cad-bc60-0314e46475f0\") " pod="openshift-monitoring/prometheus-operator-78f957474d-4jngz" Apr 16 14:55:48.112872 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:48.112848 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0950cb6b-a1b8-4cad-bc60-0314e46475f0-metrics-client-ca\") pod \"prometheus-operator-78f957474d-4jngz\" (UID: \"0950cb6b-a1b8-4cad-bc60-0314e46475f0\") " pod="openshift-monitoring/prometheus-operator-78f957474d-4jngz" Apr 16 14:55:48.115028 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:48.115004 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0950cb6b-a1b8-4cad-bc60-0314e46475f0-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-4jngz\" (UID: \"0950cb6b-a1b8-4cad-bc60-0314e46475f0\") " pod="openshift-monitoring/prometheus-operator-78f957474d-4jngz" Apr 16 14:55:48.115414 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:48.115392 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf9lm\" (UniqueName: \"kubernetes.io/projected/0950cb6b-a1b8-4cad-bc60-0314e46475f0-kube-api-access-cf9lm\") pod \"prometheus-operator-78f957474d-4jngz\" (UID: \"0950cb6b-a1b8-4cad-bc60-0314e46475f0\") " pod="openshift-monitoring/prometheus-operator-78f957474d-4jngz" Apr 16 14:55:48.610749 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:48.610705 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/0950cb6b-a1b8-4cad-bc60-0314e46475f0-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-4jngz\" (UID: \"0950cb6b-a1b8-4cad-bc60-0314e46475f0\") " pod="openshift-monitoring/prometheus-operator-78f957474d-4jngz" Apr 16 14:55:48.613537 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:48.613510 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/0950cb6b-a1b8-4cad-bc60-0314e46475f0-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-4jngz\" (UID: \"0950cb6b-a1b8-4cad-bc60-0314e46475f0\") " pod="openshift-monitoring/prometheus-operator-78f957474d-4jngz" Apr 16 14:55:48.823972 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:48.823936 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-4jngz" Apr 16 14:55:48.956033 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:48.956003 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-4jngz"] Apr 16 14:55:48.960597 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:55:48.960558 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0950cb6b_a1b8_4cad_bc60_0314e46475f0.slice/crio-b7e12e396da45dab82373112ffd692067a3afe28d04908b46510a23545af1a3d WatchSource:0}: Error finding container b7e12e396da45dab82373112ffd692067a3afe28d04908b46510a23545af1a3d: Status 404 returned error can't find the container with id b7e12e396da45dab82373112ffd692067a3afe28d04908b46510a23545af1a3d Apr 16 14:55:49.551338 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:49.551299 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-85f85cf4b6-7vnjp" Apr 16 14:55:49.551666 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:49.551621 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-85f85cf4b6-7vnjp" Apr 16 14:55:49.557591 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:49.557568 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-85f85cf4b6-7vnjp" Apr 16 14:55:49.786636 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:49.786596 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-4jngz" event={"ID":"0950cb6b-a1b8-4cad-bc60-0314e46475f0","Type":"ContainerStarted","Data":"b7e12e396da45dab82373112ffd692067a3afe28d04908b46510a23545af1a3d"} Apr 16 14:55:49.791642 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:49.791615 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-85f85cf4b6-7vnjp" Apr 16 14:55:49.837085 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:49.836865 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74cb84876-5gw67"] Apr 16 14:55:51.793093 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:51.793057 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-4jngz" event={"ID":"0950cb6b-a1b8-4cad-bc60-0314e46475f0","Type":"ContainerStarted","Data":"49844a3674d993376ec46df2c57bc656c29aa5e08a3010c4f4344d73469fe76d"} Apr 16 14:55:51.793093 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:51.793098 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-4jngz" event={"ID":"0950cb6b-a1b8-4cad-bc60-0314e46475f0","Type":"ContainerStarted","Data":"6018ff943be1cfc93517d9e913df248c324bd1d9b28db81cd9a5e69390efca38"} Apr 16 14:55:51.814774 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:51.814735 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-4jngz" podStartSLOduration=3.076800024 podStartE2EDuration="4.814721772s" podCreationTimestamp="2026-04-16 14:55:47 +0000 UTC" firstStartedPulling="2026-04-16 14:55:48.962817654 +0000 UTC m=+198.459144864" lastFinishedPulling="2026-04-16 14:55:50.700739398 +0000 UTC m=+200.197066612" observedRunningTime="2026-04-16 14:55:51.813988295 +0000 UTC m=+201.310315718" watchObservedRunningTime="2026-04-16 14:55:51.814721772 
+0000 UTC m=+201.311048998" Apr 16 14:55:53.299651 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.299617 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-685x7"] Apr 16 14:55:53.348931 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.348902 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-685x7" Apr 16 14:55:53.353493 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.353471 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 14:55:53.354011 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.353989 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mkbd4\"" Apr 16 14:55:53.354249 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.354017 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 14:55:53.354462 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.354446 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 14:55:53.454736 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.454710 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/22bd709d-b5a6-4831-ab98-87a2c0622bbe-node-exporter-wtmp\") pod \"node-exporter-685x7\" (UID: \"22bd709d-b5a6-4831-ab98-87a2c0622bbe\") " pod="openshift-monitoring/node-exporter-685x7" Apr 16 14:55:53.454916 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.454742 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/22bd709d-b5a6-4831-ab98-87a2c0622bbe-node-exporter-textfile\") pod \"node-exporter-685x7\" (UID: \"22bd709d-b5a6-4831-ab98-87a2c0622bbe\") " pod="openshift-monitoring/node-exporter-685x7" Apr 16 14:55:53.454916 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.454760 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/22bd709d-b5a6-4831-ab98-87a2c0622bbe-root\") pod \"node-exporter-685x7\" (UID: \"22bd709d-b5a6-4831-ab98-87a2c0622bbe\") " pod="openshift-monitoring/node-exporter-685x7" Apr 16 14:55:53.454916 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.454846 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95wpb\" (UniqueName: \"kubernetes.io/projected/22bd709d-b5a6-4831-ab98-87a2c0622bbe-kube-api-access-95wpb\") pod \"node-exporter-685x7\" (UID: \"22bd709d-b5a6-4831-ab98-87a2c0622bbe\") " pod="openshift-monitoring/node-exporter-685x7" Apr 16 14:55:53.455089 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.454940 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/22bd709d-b5a6-4831-ab98-87a2c0622bbe-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-685x7\" (UID: \"22bd709d-b5a6-4831-ab98-87a2c0622bbe\") " pod="openshift-monitoring/node-exporter-685x7" Apr 16 14:55:53.455089 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.455004 2582 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/22bd709d-b5a6-4831-ab98-87a2c0622bbe-sys\") pod \"node-exporter-685x7\" (UID: \"22bd709d-b5a6-4831-ab98-87a2c0622bbe\") " pod="openshift-monitoring/node-exporter-685x7" Apr 16 14:55:53.455089 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.455031 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/22bd709d-b5a6-4831-ab98-87a2c0622bbe-node-exporter-tls\") pod \"node-exporter-685x7\" (UID: \"22bd709d-b5a6-4831-ab98-87a2c0622bbe\") " pod="openshift-monitoring/node-exporter-685x7" Apr 16 14:55:53.455089 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.455053 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/22bd709d-b5a6-4831-ab98-87a2c0622bbe-metrics-client-ca\") pod \"node-exporter-685x7\" (UID: \"22bd709d-b5a6-4831-ab98-87a2c0622bbe\") " pod="openshift-monitoring/node-exporter-685x7" Apr 16 14:55:53.455089 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.455084 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/22bd709d-b5a6-4831-ab98-87a2c0622bbe-node-exporter-accelerators-collector-config\") pod \"node-exporter-685x7\" (UID: \"22bd709d-b5a6-4831-ab98-87a2c0622bbe\") " pod="openshift-monitoring/node-exporter-685x7" Apr 16 14:55:53.556393 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.556327 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/22bd709d-b5a6-4831-ab98-87a2c0622bbe-node-exporter-wtmp\") pod \"node-exporter-685x7\" (UID: \"22bd709d-b5a6-4831-ab98-87a2c0622bbe\") " pod="openshift-monitoring/node-exporter-685x7" Apr 16 14:55:53.556393 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.556377 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/22bd709d-b5a6-4831-ab98-87a2c0622bbe-node-exporter-textfile\") pod \"node-exporter-685x7\" (UID: \"22bd709d-b5a6-4831-ab98-87a2c0622bbe\") " pod="openshift-monitoring/node-exporter-685x7" Apr 16 14:55:53.556592 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.556508 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/22bd709d-b5a6-4831-ab98-87a2c0622bbe-root\") pod \"node-exporter-685x7\" (UID: \"22bd709d-b5a6-4831-ab98-87a2c0622bbe\") " pod="openshift-monitoring/node-exporter-685x7" Apr 16 14:55:53.556592 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.556542 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/22bd709d-b5a6-4831-ab98-87a2c0622bbe-node-exporter-wtmp\") pod \"node-exporter-685x7\" (UID: \"22bd709d-b5a6-4831-ab98-87a2c0622bbe\") " pod="openshift-monitoring/node-exporter-685x7" Apr 16 14:55:53.556592 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.556580 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95wpb\" (UniqueName: \"kubernetes.io/projected/22bd709d-b5a6-4831-ab98-87a2c0622bbe-kube-api-access-95wpb\") pod 
\"node-exporter-685x7\" (UID: \"22bd709d-b5a6-4831-ab98-87a2c0622bbe\") " pod="openshift-monitoring/node-exporter-685x7" Apr 16 14:55:53.556741 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.556618 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/22bd709d-b5a6-4831-ab98-87a2c0622bbe-root\") pod \"node-exporter-685x7\" (UID: \"22bd709d-b5a6-4831-ab98-87a2c0622bbe\") " pod="openshift-monitoring/node-exporter-685x7" Apr 16 14:55:53.556741 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.556653 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/22bd709d-b5a6-4831-ab98-87a2c0622bbe-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-685x7\" (UID: \"22bd709d-b5a6-4831-ab98-87a2c0622bbe\") " pod="openshift-monitoring/node-exporter-685x7" Apr 16 14:55:53.556741 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.556718 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/22bd709d-b5a6-4831-ab98-87a2c0622bbe-sys\") pod \"node-exporter-685x7\" (UID: \"22bd709d-b5a6-4831-ab98-87a2c0622bbe\") " pod="openshift-monitoring/node-exporter-685x7" Apr 16 14:55:53.556741 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.556721 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/22bd709d-b5a6-4831-ab98-87a2c0622bbe-node-exporter-textfile\") pod \"node-exporter-685x7\" (UID: \"22bd709d-b5a6-4831-ab98-87a2c0622bbe\") " pod="openshift-monitoring/node-exporter-685x7" Apr 16 14:55:53.556941 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.556743 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/22bd709d-b5a6-4831-ab98-87a2c0622bbe-node-exporter-tls\") pod \"node-exporter-685x7\" (UID: \"22bd709d-b5a6-4831-ab98-87a2c0622bbe\") " pod="openshift-monitoring/node-exporter-685x7" Apr 16 14:55:53.556941 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.556767 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/22bd709d-b5a6-4831-ab98-87a2c0622bbe-metrics-client-ca\") pod \"node-exporter-685x7\" (UID: \"22bd709d-b5a6-4831-ab98-87a2c0622bbe\") " pod="openshift-monitoring/node-exporter-685x7" Apr 16 14:55:53.556941 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.556774 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/22bd709d-b5a6-4831-ab98-87a2c0622bbe-sys\") pod \"node-exporter-685x7\" (UID: \"22bd709d-b5a6-4831-ab98-87a2c0622bbe\") " pod="openshift-monitoring/node-exporter-685x7" Apr 16 14:55:53.556941 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.556793 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/22bd709d-b5a6-4831-ab98-87a2c0622bbe-node-exporter-accelerators-collector-config\") pod \"node-exporter-685x7\" (UID: \"22bd709d-b5a6-4831-ab98-87a2c0622bbe\") " pod="openshift-monitoring/node-exporter-685x7" Apr 16 14:55:53.556941 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:53.556886 2582 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" 
not found Apr 16 14:55:53.557182 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:55:53.556949 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22bd709d-b5a6-4831-ab98-87a2c0622bbe-node-exporter-tls podName:22bd709d-b5a6-4831-ab98-87a2c0622bbe nodeName:}" failed. No retries permitted until 2026-04-16 14:55:54.056927389 +0000 UTC m=+203.553254599 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/22bd709d-b5a6-4831-ab98-87a2c0622bbe-node-exporter-tls") pod "node-exporter-685x7" (UID: "22bd709d-b5a6-4831-ab98-87a2c0622bbe") : secret "node-exporter-tls" not found Apr 16 14:55:53.557359 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.557340 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/22bd709d-b5a6-4831-ab98-87a2c0622bbe-metrics-client-ca\") pod \"node-exporter-685x7\" (UID: \"22bd709d-b5a6-4831-ab98-87a2c0622bbe\") " pod="openshift-monitoring/node-exporter-685x7" Apr 16 14:55:53.557437 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.557382 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/22bd709d-b5a6-4831-ab98-87a2c0622bbe-node-exporter-accelerators-collector-config\") pod \"node-exporter-685x7\" (UID: \"22bd709d-b5a6-4831-ab98-87a2c0622bbe\") " pod="openshift-monitoring/node-exporter-685x7" Apr 16 14:55:53.559713 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.559692 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/22bd709d-b5a6-4831-ab98-87a2c0622bbe-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-685x7\" (UID: \"22bd709d-b5a6-4831-ab98-87a2c0622bbe\") " pod="openshift-monitoring/node-exporter-685x7" Apr 16 14:55:53.567994 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.567968 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-95wpb\" (UniqueName: \"kubernetes.io/projected/22bd709d-b5a6-4831-ab98-87a2c0622bbe-kube-api-access-95wpb\") pod \"node-exporter-685x7\" (UID: \"22bd709d-b5a6-4831-ab98-87a2c0622bbe\") " pod="openshift-monitoring/node-exporter-685x7" Apr 16 14:55:53.573802 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:53.573787 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-74cb84876-5gw67" Apr 16 14:55:54.060049 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:54.060020 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/22bd709d-b5a6-4831-ab98-87a2c0622bbe-node-exporter-tls\") pod \"node-exporter-685x7\" (UID: \"22bd709d-b5a6-4831-ab98-87a2c0622bbe\") " pod="openshift-monitoring/node-exporter-685x7" Apr 16 14:55:54.062385 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:54.062368 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/22bd709d-b5a6-4831-ab98-87a2c0622bbe-node-exporter-tls\") pod \"node-exporter-685x7\" (UID: \"22bd709d-b5a6-4831-ab98-87a2c0622bbe\") " pod="openshift-monitoring/node-exporter-685x7" Apr 16 14:55:54.259695 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:54.259663 2582 util.go:30] "No sandbox for pod can be found. 
Apr 16 14:55:54.259695 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:54.259663 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-685x7"
Apr 16 14:55:54.269173 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:55:54.269150 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22bd709d_b5a6_4831_ab98_87a2c0622bbe.slice/crio-86582087ac875bc8534264b26f846f247f4f9272e59a294270def199b1e1bb82 WatchSource:0}: Error finding container 86582087ac875bc8534264b26f846f247f4f9272e59a294270def199b1e1bb82: Status 404 returned error can't find the container with id 86582087ac875bc8534264b26f846f247f4f9272e59a294270def199b1e1bb82
Apr 16 14:55:54.803687 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:54.803650 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-685x7" event={"ID":"22bd709d-b5a6-4831-ab98-87a2c0622bbe","Type":"ContainerStarted","Data":"86582087ac875bc8534264b26f846f247f4f9272e59a294270def199b1e1bb82"}
Apr 16 14:55:55.808170 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:55.808139 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-685x7" event={"ID":"22bd709d-b5a6-4831-ab98-87a2c0622bbe","Type":"ContainerStarted","Data":"5e63c095e726cc1bdb41b4c3ee1f273fd8fdf5274e6cc96d368d56444646262b"}
Apr 16 14:55:56.811935 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:56.811899 2582 generic.go:358] "Generic (PLEG): container finished" podID="22bd709d-b5a6-4831-ab98-87a2c0622bbe" containerID="5e63c095e726cc1bdb41b4c3ee1f273fd8fdf5274e6cc96d368d56444646262b" exitCode=0
Apr 16 14:55:56.812272 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:56.811948 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-685x7" event={"ID":"22bd709d-b5a6-4831-ab98-87a2c0622bbe","Type":"ContainerDied","Data":"5e63c095e726cc1bdb41b4c3ee1f273fd8fdf5274e6cc96d368d56444646262b"}
Apr 16 14:55:57.817256 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:57.817221 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-685x7" event={"ID":"22bd709d-b5a6-4831-ab98-87a2c0622bbe","Type":"ContainerStarted","Data":"12879ed1ca20c7f8f137a3cbbbc0fedcb91170d138e3f07380d1885002f78c56"}
Apr 16 14:55:57.817683 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:57.817260 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-685x7" event={"ID":"22bd709d-b5a6-4831-ab98-87a2c0622bbe","Type":"ContainerStarted","Data":"e995e08984875d01a48448762a79407c80922cff6161d6caa8d288edaf041990"}
Apr 16 14:55:57.838886 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:57.838846 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-685x7" podStartSLOduration=3.517867689 podStartE2EDuration="4.83880915s" podCreationTimestamp="2026-04-16 14:55:53 +0000 UTC" firstStartedPulling="2026-04-16 14:55:54.270920266 +0000 UTC m=+203.767247473" lastFinishedPulling="2026-04-16 14:55:55.591861722 +0000 UTC m=+205.088188934" observedRunningTime="2026-04-16 14:55:57.837599761 +0000 UTC m=+207.333926989" watchObservedRunningTime="2026-04-16 14:55:57.83880915 +0000 UTC m=+207.335136378"
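
The pod_startup_latency_tracker record above reports two figures whose relationship can be checked directly against the record itself: podStartSLOduration is podStartE2EDuration minus the time spent pulling images (lastFinishedPulling − firstStartedPulling). A short check in Go with the node-exporter values copied from the line above; the prometheus-k8s-0 record later in this log obeys the same relation up to float rounding:

    package main

    import (
        "fmt"
        "time"
    )

    // mustParse reads timestamps in the format the kubelet prints above.
    func mustParse(v string) time.Time {
        t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", v)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        // Values copied from the "Observed pod startup duration" record above.
        e2e, err := time.ParseDuration("4.83880915s") // podStartE2EDuration
        if err != nil {
            panic(err)
        }
        pullStart := mustParse("2026-04-16 14:55:54.270920266 +0000 UTC") // firstStartedPulling
        pullEnd := mustParse("2026-04-16 14:55:55.591861722 +0000 UTC")   // lastFinishedPulling

        slo := e2e - pullEnd.Sub(pullStart) // startup time excluding image pulls
        fmt.Println(slo)                    // 3.517867694s, matching podStartSLOduration=3.517867689 up to float rounding
    }
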
Apr 16 14:55:59.475750 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.475711 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:55:59.496353 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.496318 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:55:59.496785 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.496766 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:59.499549 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.499522 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 14:55:59.499662 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.499558 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 14:55:59.499865 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.499845 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 14:55:59.500087 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.500070 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 14:55:59.500231 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.500214 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 14:55:59.500301 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.500246 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 14:55:59.501015 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.500815 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 14:55:59.501015 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.500846 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 14:55:59.501187 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.501061 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 14:55:59.501187 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.501064 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 14:55:59.501295 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.501202 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-kmppm\""
Apr 16 14:55:59.501745 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.501724 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-4r50kji7j1hd5\""
Apr 16 14:55:59.501864 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.501840 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 14:55:59.506100 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.505807 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 14:55:59.507696 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.507675 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
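
With every object the pod mounts now cached by the reflectors above, the records that follow are kubelet's volume manager reconciling the pod's desired volumes against the node's actual state: one VerifyControllerAttachedVolume / MountVolume started / MountVolume.SetUp succeeded triple per volume. A toy model of that loop in Go (invented names, not kubelet's real types) that reproduces the shape of the records below:

    package main

    import "fmt"

    // mounted tracks the actual state of the world: volume name -> mounted.
    type mounted map[string]bool

    // reconcile walks the desired volumes and emits the same three-step
    // sequence per missing volume that the surrounding records show; on a
    // second pass everything is already in the actual state and it is silent.
    func reconcile(desired []string, actual mounted) {
        for _, vol := range desired {
            if actual[vol] {
                continue // already mounted: steady state, nothing logged
            }
            fmt.Printf("operationExecutor.VerifyControllerAttachedVolume started for volume %q\n", vol)
            fmt.Printf("operationExecutor.MountVolume started for volume %q\n", vol)
            // The real executor mounts asynchronously and can fail and back
            // off, as the node-exporter-tls secret did earlier in this log.
            fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", vol)
            actual[vol] = true
        }
    }

    func main() {
        desired := []string{"config", "secret-kube-rbac-proxy", "web-config"}
        actual := mounted{}
        reconcile(desired, actual) // first pass: three records per volume
        reconcile(desired, actual) // second pass: no output
    }
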
Apr 16 14:55:59.606597 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.606567 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c980a658-7fcd-4639-b0a5-28908f804d8a-config\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:59.606597 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.606601 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c980a658-7fcd-4639-b0a5-28908f804d8a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:59.606816 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.606661 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c980a658-7fcd-4639-b0a5-28908f804d8a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:59.606816 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.606681 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c980a658-7fcd-4639-b0a5-28908f804d8a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:59.606816 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.606701 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c980a658-7fcd-4639-b0a5-28908f804d8a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:59.606816 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.606716 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c980a658-7fcd-4639-b0a5-28908f804d8a-web-config\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:59.606816 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.606734 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c980a658-7fcd-4639-b0a5-28908f804d8a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:59.607031 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.606849 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c980a658-7fcd-4639-b0a5-28908f804d8a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:59.607031 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.606871 2582 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c980a658-7fcd-4639-b0a5-28908f804d8a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.607031 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.606889 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c980a658-7fcd-4639-b0a5-28908f804d8a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.607031 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.606904 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c980a658-7fcd-4639-b0a5-28908f804d8a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.607031 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.606997 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c980a658-7fcd-4639-b0a5-28908f804d8a-config-out\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.607211 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.607053 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c980a658-7fcd-4639-b0a5-28908f804d8a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.607211 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.607082 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c980a658-7fcd-4639-b0a5-28908f804d8a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.607211 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.607106 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdc6h\" (UniqueName: \"kubernetes.io/projected/c980a658-7fcd-4639-b0a5-28908f804d8a-kube-api-access-xdc6h\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.607211 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.607140 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c980a658-7fcd-4639-b0a5-28908f804d8a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.607211 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.607202 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/c980a658-7fcd-4639-b0a5-28908f804d8a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.607368 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.607252 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c980a658-7fcd-4639-b0a5-28908f804d8a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.708158 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.708122 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c980a658-7fcd-4639-b0a5-28908f804d8a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.708158 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.708160 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c980a658-7fcd-4639-b0a5-28908f804d8a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.708397 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.708189 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c980a658-7fcd-4639-b0a5-28908f804d8a-config-out\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.708397 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.708321 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c980a658-7fcd-4639-b0a5-28908f804d8a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.708397 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.708382 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c980a658-7fcd-4639-b0a5-28908f804d8a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.708588 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.708424 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xdc6h\" (UniqueName: \"kubernetes.io/projected/c980a658-7fcd-4639-b0a5-28908f804d8a-kube-api-access-xdc6h\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.708588 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.708459 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c980a658-7fcd-4639-b0a5-28908f804d8a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
14:55:59.708588 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.708504 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c980a658-7fcd-4639-b0a5-28908f804d8a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.708588 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.708545 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c980a658-7fcd-4639-b0a5-28908f804d8a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.708588 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.708574 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c980a658-7fcd-4639-b0a5-28908f804d8a-config\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.708866 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.708603 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c980a658-7fcd-4639-b0a5-28908f804d8a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.708866 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.708619 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c980a658-7fcd-4639-b0a5-28908f804d8a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.708866 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.708696 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c980a658-7fcd-4639-b0a5-28908f804d8a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.708866 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.708730 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c980a658-7fcd-4639-b0a5-28908f804d8a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.708866 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.708755 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c980a658-7fcd-4639-b0a5-28908f804d8a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.708866 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.708784 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/c980a658-7fcd-4639-b0a5-28908f804d8a-web-config\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.708866 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.708819 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c980a658-7fcd-4639-b0a5-28908f804d8a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.709384 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.708902 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c980a658-7fcd-4639-b0a5-28908f804d8a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.709384 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.708929 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c980a658-7fcd-4639-b0a5-28908f804d8a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.709794 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.709766 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c980a658-7fcd-4639-b0a5-28908f804d8a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.710015 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.709872 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c980a658-7fcd-4639-b0a5-28908f804d8a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.710891 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.710868 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c980a658-7fcd-4639-b0a5-28908f804d8a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.711441 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.711337 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c980a658-7fcd-4639-b0a5-28908f804d8a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.712304 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.711917 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c980a658-7fcd-4639-b0a5-28908f804d8a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.712304 ip-10-0-139-101 kubenswrapper[2582]: I0416 
14:55:59.712193 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c980a658-7fcd-4639-b0a5-28908f804d8a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.712937 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.712583 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c980a658-7fcd-4639-b0a5-28908f804d8a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.713376 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.713352 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c980a658-7fcd-4639-b0a5-28908f804d8a-config\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.714085 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.714039 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c980a658-7fcd-4639-b0a5-28908f804d8a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.714449 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.714401 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c980a658-7fcd-4639-b0a5-28908f804d8a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.715722 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.715694 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c980a658-7fcd-4639-b0a5-28908f804d8a-web-config\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.716418 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.716389 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c980a658-7fcd-4639-b0a5-28908f804d8a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.717151 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.717128 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c980a658-7fcd-4639-b0a5-28908f804d8a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.717254 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.717190 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdc6h\" (UniqueName: \"kubernetes.io/projected/c980a658-7fcd-4639-b0a5-28908f804d8a-kube-api-access-xdc6h\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.717369 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.717353 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c980a658-7fcd-4639-b0a5-28908f804d8a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.717470 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.717450 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c980a658-7fcd-4639-b0a5-28908f804d8a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.726028 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.725979 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c980a658-7fcd-4639-b0a5-28908f804d8a-config-out\") pod \"prometheus-k8s-0\" (UID: \"c980a658-7fcd-4639-b0a5-28908f804d8a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.808103 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.808068 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:59.955703 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:55:59.955669 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:55:59.959157 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:55:59.959126 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc980a658_7fcd_4639_b0a5_28908f804d8a.slice/crio-795af4db736c3a965a4d8cafca2f544b333d317a628ed4f0cbc279c9b02a5b13 WatchSource:0}: Error finding container 795af4db736c3a965a4d8cafca2f544b333d317a628ed4f0cbc279c9b02a5b13: Status 404 returned error can't find the container with id 795af4db736c3a965a4d8cafca2f544b333d317a628ed4f0cbc279c9b02a5b13 Apr 16 14:56:00.828404 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:00.828360 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c980a658-7fcd-4639-b0a5-28908f804d8a","Type":"ContainerStarted","Data":"795af4db736c3a965a4d8cafca2f544b333d317a628ed4f0cbc279c9b02a5b13"} Apr 16 14:56:01.832304 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:01.832239 2582 generic.go:358] "Generic (PLEG): container finished" podID="c980a658-7fcd-4639-b0a5-28908f804d8a" containerID="1f14f1c46dcef1914441a5764617c1551fbf7069474443f0b2ecb072a2242fce" exitCode=0 Apr 16 14:56:01.832596 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:01.832330 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c980a658-7fcd-4639-b0a5-28908f804d8a","Type":"ContainerDied","Data":"1f14f1c46dcef1914441a5764617c1551fbf7069474443f0b2ecb072a2242fce"} Apr 16 14:56:05.844976 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:05.844945 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c980a658-7fcd-4639-b0a5-28908f804d8a","Type":"ContainerStarted","Data":"2890efe1d3d0bf848cd560c8085d9b97b2764fff094db9813c5a35c1ba8836a3"} Apr 16 14:56:05.845472 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:05.844983 2582 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c980a658-7fcd-4639-b0a5-28908f804d8a","Type":"ContainerStarted","Data":"e584df00f07d95b88a4d826c5605a874f3ff88a6f3089553a16be95b684bd5f3"} Apr 16 14:56:06.441622 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:06.441585 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5668486f94-lktk2"] Apr 16 14:56:06.466551 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:06.466526 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5668486f94-lktk2"] Apr 16 14:56:06.466733 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:06.466657 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5668486f94-lktk2" Apr 16 14:56:06.566707 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:06.566668 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/acbdb581-cce6-4f09-a211-90e985f568f8-console-serving-cert\") pod \"console-5668486f94-lktk2\" (UID: \"acbdb581-cce6-4f09-a211-90e985f568f8\") " pod="openshift-console/console-5668486f94-lktk2" Apr 16 14:56:06.566896 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:06.566717 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/acbdb581-cce6-4f09-a211-90e985f568f8-console-config\") pod \"console-5668486f94-lktk2\" (UID: \"acbdb581-cce6-4f09-a211-90e985f568f8\") " pod="openshift-console/console-5668486f94-lktk2" Apr 16 14:56:06.566896 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:06.566772 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acbdb581-cce6-4f09-a211-90e985f568f8-trusted-ca-bundle\") pod \"console-5668486f94-lktk2\" (UID: \"acbdb581-cce6-4f09-a211-90e985f568f8\") " pod="openshift-console/console-5668486f94-lktk2" Apr 16 14:56:06.567023 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:06.566895 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/acbdb581-cce6-4f09-a211-90e985f568f8-service-ca\") pod \"console-5668486f94-lktk2\" (UID: \"acbdb581-cce6-4f09-a211-90e985f568f8\") " pod="openshift-console/console-5668486f94-lktk2" Apr 16 14:56:06.567023 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:06.566942 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/acbdb581-cce6-4f09-a211-90e985f568f8-oauth-serving-cert\") pod \"console-5668486f94-lktk2\" (UID: \"acbdb581-cce6-4f09-a211-90e985f568f8\") " pod="openshift-console/console-5668486f94-lktk2" Apr 16 14:56:06.567023 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:06.567000 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvxn4\" (UniqueName: \"kubernetes.io/projected/acbdb581-cce6-4f09-a211-90e985f568f8-kube-api-access-fvxn4\") pod \"console-5668486f94-lktk2\" (UID: \"acbdb581-cce6-4f09-a211-90e985f568f8\") " pod="openshift-console/console-5668486f94-lktk2" Apr 16 14:56:06.567122 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:06.567024 2582 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/acbdb581-cce6-4f09-a211-90e985f568f8-console-oauth-config\") pod \"console-5668486f94-lktk2\" (UID: \"acbdb581-cce6-4f09-a211-90e985f568f8\") " pod="openshift-console/console-5668486f94-lktk2" Apr 16 14:56:06.667818 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:06.667783 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/acbdb581-cce6-4f09-a211-90e985f568f8-service-ca\") pod \"console-5668486f94-lktk2\" (UID: \"acbdb581-cce6-4f09-a211-90e985f568f8\") " pod="openshift-console/console-5668486f94-lktk2" Apr 16 14:56:06.667818 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:06.667848 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/acbdb581-cce6-4f09-a211-90e985f568f8-oauth-serving-cert\") pod \"console-5668486f94-lktk2\" (UID: \"acbdb581-cce6-4f09-a211-90e985f568f8\") " pod="openshift-console/console-5668486f94-lktk2" Apr 16 14:56:06.668078 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:06.667882 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fvxn4\" (UniqueName: \"kubernetes.io/projected/acbdb581-cce6-4f09-a211-90e985f568f8-kube-api-access-fvxn4\") pod \"console-5668486f94-lktk2\" (UID: \"acbdb581-cce6-4f09-a211-90e985f568f8\") " pod="openshift-console/console-5668486f94-lktk2" Apr 16 14:56:06.668078 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:06.667913 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/acbdb581-cce6-4f09-a211-90e985f568f8-console-oauth-config\") pod \"console-5668486f94-lktk2\" (UID: \"acbdb581-cce6-4f09-a211-90e985f568f8\") " pod="openshift-console/console-5668486f94-lktk2" Apr 16 14:56:06.668078 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:06.667993 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/acbdb581-cce6-4f09-a211-90e985f568f8-console-serving-cert\") pod \"console-5668486f94-lktk2\" (UID: \"acbdb581-cce6-4f09-a211-90e985f568f8\") " pod="openshift-console/console-5668486f94-lktk2" Apr 16 14:56:06.668292 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:06.668261 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/acbdb581-cce6-4f09-a211-90e985f568f8-console-config\") pod \"console-5668486f94-lktk2\" (UID: \"acbdb581-cce6-4f09-a211-90e985f568f8\") " pod="openshift-console/console-5668486f94-lktk2" Apr 16 14:56:06.668424 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:06.668310 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acbdb581-cce6-4f09-a211-90e985f568f8-trusted-ca-bundle\") pod \"console-5668486f94-lktk2\" (UID: \"acbdb581-cce6-4f09-a211-90e985f568f8\") " pod="openshift-console/console-5668486f94-lktk2" Apr 16 14:56:06.668644 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:06.668596 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/acbdb581-cce6-4f09-a211-90e985f568f8-service-ca\") pod \"console-5668486f94-lktk2\" (UID: 
\"acbdb581-cce6-4f09-a211-90e985f568f8\") " pod="openshift-console/console-5668486f94-lktk2" Apr 16 14:56:06.668971 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:06.668947 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/acbdb581-cce6-4f09-a211-90e985f568f8-console-config\") pod \"console-5668486f94-lktk2\" (UID: \"acbdb581-cce6-4f09-a211-90e985f568f8\") " pod="openshift-console/console-5668486f94-lktk2" Apr 16 14:56:06.669164 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:06.669143 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/acbdb581-cce6-4f09-a211-90e985f568f8-oauth-serving-cert\") pod \"console-5668486f94-lktk2\" (UID: \"acbdb581-cce6-4f09-a211-90e985f568f8\") " pod="openshift-console/console-5668486f94-lktk2" Apr 16 14:56:06.670801 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:06.670782 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/acbdb581-cce6-4f09-a211-90e985f568f8-console-serving-cert\") pod \"console-5668486f94-lktk2\" (UID: \"acbdb581-cce6-4f09-a211-90e985f568f8\") " pod="openshift-console/console-5668486f94-lktk2" Apr 16 14:56:06.671187 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:06.671164 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/acbdb581-cce6-4f09-a211-90e985f568f8-console-oauth-config\") pod \"console-5668486f94-lktk2\" (UID: \"acbdb581-cce6-4f09-a211-90e985f568f8\") " pod="openshift-console/console-5668486f94-lktk2" Apr 16 14:56:06.676063 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:06.676027 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvxn4\" (UniqueName: \"kubernetes.io/projected/acbdb581-cce6-4f09-a211-90e985f568f8-kube-api-access-fvxn4\") pod \"console-5668486f94-lktk2\" (UID: \"acbdb581-cce6-4f09-a211-90e985f568f8\") " pod="openshift-console/console-5668486f94-lktk2" Apr 16 14:56:06.680620 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:06.680597 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acbdb581-cce6-4f09-a211-90e985f568f8-trusted-ca-bundle\") pod \"console-5668486f94-lktk2\" (UID: \"acbdb581-cce6-4f09-a211-90e985f568f8\") " pod="openshift-console/console-5668486f94-lktk2" Apr 16 14:56:06.778974 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:06.778786 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5668486f94-lktk2" Apr 16 14:56:06.984208 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:06.984184 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5668486f94-lktk2"] Apr 16 14:56:06.986539 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:56:06.986507 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacbdb581_cce6_4f09_a211_90e985f568f8.slice/crio-9f4239f1ea319ff3557399c97e7ffa6187180c92c838e3a571ad1f7701618050 WatchSource:0}: Error finding container 9f4239f1ea319ff3557399c97e7ffa6187180c92c838e3a571ad1f7701618050: Status 404 returned error can't find the container with id 9f4239f1ea319ff3557399c97e7ffa6187180c92c838e3a571ad1f7701618050 Apr 16 14:56:07.852358 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:07.852284 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5668486f94-lktk2" event={"ID":"acbdb581-cce6-4f09-a211-90e985f568f8","Type":"ContainerStarted","Data":"ae58a71c37a52b4405f452f4c8bd8cd186006d3e38d8afef06a2e19e7dede797"} Apr 16 14:56:07.852358 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:07.852317 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5668486f94-lktk2" event={"ID":"acbdb581-cce6-4f09-a211-90e985f568f8","Type":"ContainerStarted","Data":"9f4239f1ea319ff3557399c97e7ffa6187180c92c838e3a571ad1f7701618050"} Apr 16 14:56:07.859522 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:07.859491 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c980a658-7fcd-4639-b0a5-28908f804d8a","Type":"ContainerStarted","Data":"d5d46a5fa593438ba0edb88e00badb3b301b9ac0645b4b73eef7750a000250ec"} Apr 16 14:56:07.859522 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:07.859519 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c980a658-7fcd-4639-b0a5-28908f804d8a","Type":"ContainerStarted","Data":"d0bb6e8da1b60813a1e8554542b9f415b7e251e348b98f2b467f4db788a52a61"} Apr 16 14:56:07.859682 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:07.859528 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c980a658-7fcd-4639-b0a5-28908f804d8a","Type":"ContainerStarted","Data":"44637609f57f9a6a3256df61bd7a79b128ca6e1d3600071ee5efa3c88bf55acc"} Apr 16 14:56:07.859682 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:07.859535 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c980a658-7fcd-4639-b0a5-28908f804d8a","Type":"ContainerStarted","Data":"ff4561eaab528806321df081a5a4f2902d814385994b6b8bd497a1f8d04b04ac"} Apr 16 14:56:07.870776 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:07.870734 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5668486f94-lktk2" podStartSLOduration=1.870720892 podStartE2EDuration="1.870720892s" podCreationTimestamp="2026-04-16 14:56:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:56:07.870033664 +0000 UTC m=+217.366360894" watchObservedRunningTime="2026-04-16 14:56:07.870720892 +0000 UTC m=+217.367048121" Apr 16 14:56:07.898896 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:07.898858 2582 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.564846727 podStartE2EDuration="8.898845788s" podCreationTimestamp="2026-04-16 14:55:59 +0000 UTC" firstStartedPulling="2026-04-16 14:55:59.961237973 +0000 UTC m=+209.457565180" lastFinishedPulling="2026-04-16 14:56:07.295237032 +0000 UTC m=+216.791564241" observedRunningTime="2026-04-16 14:56:07.896462348 +0000 UTC m=+217.392789604" watchObservedRunningTime="2026-04-16 14:56:07.898845788 +0000 UTC m=+217.395173013" Apr 16 14:56:09.808637 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:09.808609 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:14.861953 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:14.861891 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-74cb84876-5gw67" podUID="5f72564b-5bc1-45cd-a160-ee6db0d5d01e" containerName="console" containerID="cri-o://d96a30e8018c16b909049ce3fa7bac9a71d2e85b9faf678975ae73405cad4907" gracePeriod=15 Apr 16 14:56:15.132180 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.132157 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74cb84876-5gw67_5f72564b-5bc1-45cd-a160-ee6db0d5d01e/console/0.log" Apr 16 14:56:15.132272 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.132219 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74cb84876-5gw67" Apr 16 14:56:15.132487 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.132468 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-console-config\") pod \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\" (UID: \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\") " Apr 16 14:56:15.132569 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.132504 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-console-serving-cert\") pod \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\" (UID: \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\") " Apr 16 14:56:15.132569 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.132531 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpwdt\" (UniqueName: \"kubernetes.io/projected/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-kube-api-access-fpwdt\") pod \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\" (UID: \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\") " Apr 16 14:56:15.132673 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.132568 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-console-oauth-config\") pod \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\" (UID: \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\") " Apr 16 14:56:15.132673 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.132605 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-service-ca\") pod \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\" (UID: \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\") " Apr 16 14:56:15.132766 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.132696 2582 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-oauth-serving-cert\") pod \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\" (UID: \"5f72564b-5bc1-45cd-a160-ee6db0d5d01e\") " Apr 16 14:56:15.132814 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.132774 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-console-config" (OuterVolumeSpecName: "console-config") pod "5f72564b-5bc1-45cd-a160-ee6db0d5d01e" (UID: "5f72564b-5bc1-45cd-a160-ee6db0d5d01e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:56:15.133066 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.133041 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-service-ca" (OuterVolumeSpecName: "service-ca") pod "5f72564b-5bc1-45cd-a160-ee6db0d5d01e" (UID: "5f72564b-5bc1-45cd-a160-ee6db0d5d01e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:56:15.133144 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.133081 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5f72564b-5bc1-45cd-a160-ee6db0d5d01e" (UID: "5f72564b-5bc1-45cd-a160-ee6db0d5d01e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:56:15.133347 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.133170 2582 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-oauth-serving-cert\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 14:56:15.133347 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.133191 2582 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-console-config\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 14:56:15.133347 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.133204 2582 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-service-ca\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 14:56:15.134871 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.134848 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5f72564b-5bc1-45cd-a160-ee6db0d5d01e" (UID: "5f72564b-5bc1-45cd-a160-ee6db0d5d01e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:56:15.135093 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.135077 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-kube-api-access-fpwdt" (OuterVolumeSpecName: "kube-api-access-fpwdt") pod "5f72564b-5bc1-45cd-a160-ee6db0d5d01e" (UID: "5f72564b-5bc1-45cd-a160-ee6db0d5d01e"). InnerVolumeSpecName "kube-api-access-fpwdt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:56:15.135299 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.135277 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5f72564b-5bc1-45cd-a160-ee6db0d5d01e" (UID: "5f72564b-5bc1-45cd-a160-ee6db0d5d01e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:56:15.234101 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.234077 2582 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-console-serving-cert\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 14:56:15.234101 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.234099 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fpwdt\" (UniqueName: \"kubernetes.io/projected/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-kube-api-access-fpwdt\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 14:56:15.234238 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.234108 2582 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f72564b-5bc1-45cd-a160-ee6db0d5d01e-console-oauth-config\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 14:56:15.882714 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.882687 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74cb84876-5gw67_5f72564b-5bc1-45cd-a160-ee6db0d5d01e/console/0.log" Apr 16 14:56:15.883091 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.882725 2582 generic.go:358] "Generic (PLEG): container finished" podID="5f72564b-5bc1-45cd-a160-ee6db0d5d01e" containerID="d96a30e8018c16b909049ce3fa7bac9a71d2e85b9faf678975ae73405cad4907" exitCode=2 Apr 16 14:56:15.883091 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.882787 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74cb84876-5gw67" event={"ID":"5f72564b-5bc1-45cd-a160-ee6db0d5d01e","Type":"ContainerDied","Data":"d96a30e8018c16b909049ce3fa7bac9a71d2e85b9faf678975ae73405cad4907"} Apr 16 14:56:15.883091 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.882812 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74cb84876-5gw67" event={"ID":"5f72564b-5bc1-45cd-a160-ee6db0d5d01e","Type":"ContainerDied","Data":"0089cd630f89b3e5a957567af768edfe8b4978c1f1a5a3ae8fe93656fc86b7ae"} Apr 16 14:56:15.883091 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.882848 2582 scope.go:117] "RemoveContainer" containerID="d96a30e8018c16b909049ce3fa7bac9a71d2e85b9faf678975ae73405cad4907" Apr 16 14:56:15.883091 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.882790 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74cb84876-5gw67" Apr 16 14:56:15.890579 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.890562 2582 scope.go:117] "RemoveContainer" containerID="d96a30e8018c16b909049ce3fa7bac9a71d2e85b9faf678975ae73405cad4907" Apr 16 14:56:15.890862 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:56:15.890843 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d96a30e8018c16b909049ce3fa7bac9a71d2e85b9faf678975ae73405cad4907\": container with ID starting with d96a30e8018c16b909049ce3fa7bac9a71d2e85b9faf678975ae73405cad4907 not found: ID does not exist" containerID="d96a30e8018c16b909049ce3fa7bac9a71d2e85b9faf678975ae73405cad4907" Apr 16 14:56:15.890924 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.890871 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d96a30e8018c16b909049ce3fa7bac9a71d2e85b9faf678975ae73405cad4907"} err="failed to get container status \"d96a30e8018c16b909049ce3fa7bac9a71d2e85b9faf678975ae73405cad4907\": rpc error: code = NotFound desc = could not find container \"d96a30e8018c16b909049ce3fa7bac9a71d2e85b9faf678975ae73405cad4907\": container with ID starting with d96a30e8018c16b909049ce3fa7bac9a71d2e85b9faf678975ae73405cad4907 not found: ID does not exist" Apr 16 14:56:15.897869 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.897840 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74cb84876-5gw67"] Apr 16 14:56:15.905602 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:15.901779 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-74cb84876-5gw67"] Apr 16 14:56:16.779900 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:16.779875 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5668486f94-lktk2" Apr 16 14:56:16.779900 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:16.779903 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5668486f94-lktk2" Apr 16 14:56:16.784341 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:16.784322 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5668486f94-lktk2" Apr 16 14:56:16.890035 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:16.890016 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5668486f94-lktk2" Apr 16 14:56:16.934314 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:16.934287 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85f85cf4b6-7vnjp"] Apr 16 14:56:17.162537 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:17.162507 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f72564b-5bc1-45cd-a160-ee6db0d5d01e" path="/var/lib/kubelet/pods/5f72564b-5bc1-45cd-a160-ee6db0d5d01e/volumes" Apr 16 14:56:23.906631 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:23.906605 2582 generic.go:358] "Generic (PLEG): container finished" podID="605f32ed-53b6-48df-8568-937ded360dd9" containerID="ad8381ada1bc971c8a4580b61d84753d4041a56dbab2d9682ede92285305119a" exitCode=0 Apr 16 14:56:23.906934 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:23.906680 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-n7xxg" 
event={"ID":"605f32ed-53b6-48df-8568-937ded360dd9","Type":"ContainerDied","Data":"ad8381ada1bc971c8a4580b61d84753d4041a56dbab2d9682ede92285305119a"} Apr 16 14:56:23.906988 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:23.906975 2582 scope.go:117] "RemoveContainer" containerID="ad8381ada1bc971c8a4580b61d84753d4041a56dbab2d9682ede92285305119a" Apr 16 14:56:24.910963 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:24.910929 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-n7xxg" event={"ID":"605f32ed-53b6-48df-8568-937ded360dd9","Type":"ContainerStarted","Data":"bae4356ed966d847ae4f06d23246233e7b4c9d7c185ea2df0f6a0ad1d751e8bf"} Apr 16 14:56:24.929676 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:24.929652 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-z8k77_d26e8a35-54b1-4862-87ad-cab47e12e62d/dns-node-resolver/0.log" Apr 16 14:56:28.923304 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:28.923269 2582 generic.go:358] "Generic (PLEG): container finished" podID="d385c18d-5b59-4b98-a975-22594471a5b7" containerID="b67165c98d059c2de3250de59c75d6eaf0e2dfa3d923a34cb43a6cef93432b8d" exitCode=0 Apr 16 14:56:28.923648 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:28.923322 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-647ss" event={"ID":"d385c18d-5b59-4b98-a975-22594471a5b7","Type":"ContainerDied","Data":"b67165c98d059c2de3250de59c75d6eaf0e2dfa3d923a34cb43a6cef93432b8d"} Apr 16 14:56:28.923648 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:28.923603 2582 scope.go:117] "RemoveContainer" containerID="b67165c98d059c2de3250de59c75d6eaf0e2dfa3d923a34cb43a6cef93432b8d" Apr 16 14:56:29.927913 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:29.927877 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-647ss" event={"ID":"d385c18d-5b59-4b98-a975-22594471a5b7","Type":"ContainerStarted","Data":"31286862600b0ae8b9b95185863add42b1a857e399d50a84ae14151d4c416ab9"} Apr 16 14:56:41.958955 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:41.958897 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-85f85cf4b6-7vnjp" podUID="a9a1ac58-5e77-4d51-aa92-a72d754ecbc0" containerName="console" containerID="cri-o://f003c384619a94ed334a68f4b2f685e2ef9a3e718dabbb072cf45450b955a9dc" gracePeriod=15 Apr 16 14:56:42.199974 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.199955 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85f85cf4b6-7vnjp_a9a1ac58-5e77-4d51-aa92-a72d754ecbc0/console/0.log" Apr 16 14:56:42.200081 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.200010 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85f85cf4b6-7vnjp" Apr 16 14:56:42.252062 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.251999 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-console-serving-cert\") pod \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\" (UID: \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\") " Apr 16 14:56:42.252062 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.252027 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-console-oauth-config\") pod \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\" (UID: \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\") " Apr 16 14:56:42.252062 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.252052 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-oauth-serving-cert\") pod \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\" (UID: \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\") " Apr 16 14:56:42.252268 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.252088 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-console-config\") pod \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\" (UID: \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\") " Apr 16 14:56:42.252268 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.252123 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qm6d\" (UniqueName: \"kubernetes.io/projected/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-kube-api-access-6qm6d\") pod \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\" (UID: \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\") " Apr 16 14:56:42.252268 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.252160 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-trusted-ca-bundle\") pod \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\" (UID: \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\") " Apr 16 14:56:42.252268 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.252184 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-service-ca\") pod \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\" (UID: \"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0\") " Apr 16 14:56:42.252482 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.252439 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a9a1ac58-5e77-4d51-aa92-a72d754ecbc0" (UID: "a9a1ac58-5e77-4d51-aa92-a72d754ecbc0"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:56:42.252677 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.252626 2582 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-oauth-serving-cert\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 14:56:42.252677 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.252598 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-console-config" (OuterVolumeSpecName: "console-config") pod "a9a1ac58-5e77-4d51-aa92-a72d754ecbc0" (UID: "a9a1ac58-5e77-4d51-aa92-a72d754ecbc0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:56:42.252862 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.252680 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a9a1ac58-5e77-4d51-aa92-a72d754ecbc0" (UID: "a9a1ac58-5e77-4d51-aa92-a72d754ecbc0"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:56:42.252977 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.252955 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-service-ca" (OuterVolumeSpecName: "service-ca") pod "a9a1ac58-5e77-4d51-aa92-a72d754ecbc0" (UID: "a9a1ac58-5e77-4d51-aa92-a72d754ecbc0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:56:42.254277 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.254247 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a9a1ac58-5e77-4d51-aa92-a72d754ecbc0" (UID: "a9a1ac58-5e77-4d51-aa92-a72d754ecbc0"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:56:42.254352 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.254294 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-kube-api-access-6qm6d" (OuterVolumeSpecName: "kube-api-access-6qm6d") pod "a9a1ac58-5e77-4d51-aa92-a72d754ecbc0" (UID: "a9a1ac58-5e77-4d51-aa92-a72d754ecbc0"). InnerVolumeSpecName "kube-api-access-6qm6d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:56:42.254352 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.254303 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a9a1ac58-5e77-4d51-aa92-a72d754ecbc0" (UID: "a9a1ac58-5e77-4d51-aa92-a72d754ecbc0"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:56:42.353387 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.353360 2582 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-console-config\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 14:56:42.353387 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.353383 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6qm6d\" (UniqueName: \"kubernetes.io/projected/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-kube-api-access-6qm6d\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 14:56:42.353530 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.353398 2582 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-trusted-ca-bundle\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 14:56:42.353530 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.353412 2582 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-service-ca\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 14:56:42.353530 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.353426 2582 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-console-serving-cert\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 14:56:42.353530 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.353440 2582 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0-console-oauth-config\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 14:56:42.857500 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.857458 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs\") pod \"network-metrics-daemon-6ltjv\" (UID: \"3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e\") " pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:56:42.859811 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.859785 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e-metrics-certs\") pod \"network-metrics-daemon-6ltjv\" (UID: \"3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e\") " pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:56:42.961382 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.961351 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kc8hc\"" Apr 16 14:56:42.966204 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.966186 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85f85cf4b6-7vnjp_a9a1ac58-5e77-4d51-aa92-a72d754ecbc0/console/0.log" Apr 16 14:56:42.966308 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.966228 2582 generic.go:358] "Generic (PLEG): container finished" podID="a9a1ac58-5e77-4d51-aa92-a72d754ecbc0" containerID="f003c384619a94ed334a68f4b2f685e2ef9a3e718dabbb072cf45450b955a9dc" exitCode=2 Apr 16 14:56:42.966308 ip-10-0-139-101 
kubenswrapper[2582]: I0416 14:56:42.966295 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85f85cf4b6-7vnjp" Apr 16 14:56:42.966381 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.966319 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85f85cf4b6-7vnjp" event={"ID":"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0","Type":"ContainerDied","Data":"f003c384619a94ed334a68f4b2f685e2ef9a3e718dabbb072cf45450b955a9dc"} Apr 16 14:56:42.966381 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.966357 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85f85cf4b6-7vnjp" event={"ID":"a9a1ac58-5e77-4d51-aa92-a72d754ecbc0","Type":"ContainerDied","Data":"e452b1414bcdae817815b91b3c9e98dfe2d385abe440ea0b38bbe7c85b9f5657"} Apr 16 14:56:42.966443 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.966379 2582 scope.go:117] "RemoveContainer" containerID="f003c384619a94ed334a68f4b2f685e2ef9a3e718dabbb072cf45450b955a9dc" Apr 16 14:56:42.969060 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.969045 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6ltjv" Apr 16 14:56:42.977948 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.977930 2582 scope.go:117] "RemoveContainer" containerID="f003c384619a94ed334a68f4b2f685e2ef9a3e718dabbb072cf45450b955a9dc" Apr 16 14:56:42.978226 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:56:42.978206 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f003c384619a94ed334a68f4b2f685e2ef9a3e718dabbb072cf45450b955a9dc\": container with ID starting with f003c384619a94ed334a68f4b2f685e2ef9a3e718dabbb072cf45450b955a9dc not found: ID does not exist" containerID="f003c384619a94ed334a68f4b2f685e2ef9a3e718dabbb072cf45450b955a9dc" Apr 16 14:56:42.978285 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:42.978236 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f003c384619a94ed334a68f4b2f685e2ef9a3e718dabbb072cf45450b955a9dc"} err="failed to get container status \"f003c384619a94ed334a68f4b2f685e2ef9a3e718dabbb072cf45450b955a9dc\": rpc error: code = NotFound desc = could not find container \"f003c384619a94ed334a68f4b2f685e2ef9a3e718dabbb072cf45450b955a9dc\": container with ID starting with f003c384619a94ed334a68f4b2f685e2ef9a3e718dabbb072cf45450b955a9dc not found: ID does not exist" Apr 16 14:56:43.000641 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:43.000617 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85f85cf4b6-7vnjp"] Apr 16 14:56:43.003801 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:43.003779 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-85f85cf4b6-7vnjp"] Apr 16 14:56:43.090527 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:43.090500 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6ltjv"] Apr 16 14:56:43.093065 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:56:43.093029 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3062c2d2_b0a0_4572_ab0f_7dfd8c8df85e.slice/crio-28e34cfdea896248957363055027fa43aa45462a2c220c48081d2274e561ec9f WatchSource:0}: Error finding container 28e34cfdea896248957363055027fa43aa45462a2c220c48081d2274e561ec9f: Status 404 
returned error can't find the container with id 28e34cfdea896248957363055027fa43aa45462a2c220c48081d2274e561ec9f Apr 16 14:56:43.161404 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:43.161372 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9a1ac58-5e77-4d51-aa92-a72d754ecbc0" path="/var/lib/kubelet/pods/a9a1ac58-5e77-4d51-aa92-a72d754ecbc0/volumes" Apr 16 14:56:43.971463 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:43.971424 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6ltjv" event={"ID":"3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e","Type":"ContainerStarted","Data":"28e34cfdea896248957363055027fa43aa45462a2c220c48081d2274e561ec9f"} Apr 16 14:56:44.976534 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:44.976496 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6ltjv" event={"ID":"3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e","Type":"ContainerStarted","Data":"e8a8c5cdc1738a37c884784b396c71fce500f90b917da0e4448cced2046f8704"} Apr 16 14:56:44.976534 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:44.976534 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6ltjv" event={"ID":"3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e","Type":"ContainerStarted","Data":"a1ae0229cecebddab9bb032b90a3ed3aa956a87022bd9091e17992ed3544b130"} Apr 16 14:56:44.994160 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:44.994117 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-6ltjv" podStartSLOduration=253.045227329 podStartE2EDuration="4m13.99410292s" podCreationTimestamp="2026-04-16 14:52:31 +0000 UTC" firstStartedPulling="2026-04-16 14:56:43.09504883 +0000 UTC m=+252.591376037" lastFinishedPulling="2026-04-16 14:56:44.043924417 +0000 UTC m=+253.540251628" observedRunningTime="2026-04-16 14:56:44.992036739 +0000 UTC m=+254.488363967" watchObservedRunningTime="2026-04-16 14:56:44.99410292 +0000 UTC m=+254.490430198" Apr 16 14:56:59.809270 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:59.809237 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:59.825492 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:56:59.825471 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:00.035799 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:00.035765 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:10.646808 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:57:10.646765 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-8pv4t" podUID="4bcbf238-7f5c-47ac-b42a-0e299ed29df0" Apr 16 14:57:10.646808 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:57:10.646788 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-jh4sv" podUID="87ff3a2a-3409-4acc-8192-f4db952ccdcf" Apr 16 14:57:11.051347 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:11.051272 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jh4sv" Apr 16 14:57:11.051497 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:11.051275 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8pv4t" Apr 16 14:57:13.795791 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:13.795756 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls\") pod \"dns-default-8pv4t\" (UID: \"4bcbf238-7f5c-47ac-b42a-0e299ed29df0\") " pod="openshift-dns/dns-default-8pv4t" Apr 16 14:57:13.796283 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:13.795809 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert\") pod \"ingress-canary-jh4sv\" (UID: \"87ff3a2a-3409-4acc-8192-f4db952ccdcf\") " pod="openshift-ingress-canary/ingress-canary-jh4sv" Apr 16 14:57:13.798264 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:13.798236 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4bcbf238-7f5c-47ac-b42a-0e299ed29df0-metrics-tls\") pod \"dns-default-8pv4t\" (UID: \"4bcbf238-7f5c-47ac-b42a-0e299ed29df0\") " pod="openshift-dns/dns-default-8pv4t" Apr 16 14:57:13.798391 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:13.798370 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87ff3a2a-3409-4acc-8192-f4db952ccdcf-cert\") pod \"ingress-canary-jh4sv\" (UID: \"87ff3a2a-3409-4acc-8192-f4db952ccdcf\") " pod="openshift-ingress-canary/ingress-canary-jh4sv" Apr 16 14:57:14.055661 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:14.055590 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vzfs6\"" Apr 16 14:57:14.055661 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:14.055590 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-8jkdg\"" Apr 16 14:57:14.063387 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:14.063363 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jh4sv" Apr 16 14:57:14.063513 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:14.063387 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-8pv4t" Apr 16 14:57:14.196763 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:14.196738 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jh4sv"] Apr 16 14:57:14.199279 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:57:14.199242 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87ff3a2a_3409_4acc_8192_f4db952ccdcf.slice/crio-921da9e5dd1aeaac26d6df04c918f973743913f214b86f6d3f09ed5d6b38d84d WatchSource:0}: Error finding container 921da9e5dd1aeaac26d6df04c918f973743913f214b86f6d3f09ed5d6b38d84d: Status 404 returned error can't find the container with id 921da9e5dd1aeaac26d6df04c918f973743913f214b86f6d3f09ed5d6b38d84d Apr 16 14:57:14.213066 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:14.212984 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8pv4t"] Apr 16 14:57:14.215316 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:57:14.215282 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bcbf238_7f5c_47ac_b42a_0e299ed29df0.slice/crio-8c12c45b8e74fbdd83a40a53cb09f4970db4ddbefbeaed527c3157fe49910420 WatchSource:0}: Error finding container 8c12c45b8e74fbdd83a40a53cb09f4970db4ddbefbeaed527c3157fe49910420: Status 404 returned error can't find the container with id 8c12c45b8e74fbdd83a40a53cb09f4970db4ddbefbeaed527c3157fe49910420 Apr 16 14:57:15.063869 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:15.063797 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8pv4t" event={"ID":"4bcbf238-7f5c-47ac-b42a-0e299ed29df0","Type":"ContainerStarted","Data":"8c12c45b8e74fbdd83a40a53cb09f4970db4ddbefbeaed527c3157fe49910420"} Apr 16 14:57:15.065384 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:15.065342 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jh4sv" event={"ID":"87ff3a2a-3409-4acc-8192-f4db952ccdcf","Type":"ContainerStarted","Data":"921da9e5dd1aeaac26d6df04c918f973743913f214b86f6d3f09ed5d6b38d84d"} Apr 16 14:57:17.077473 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:17.077428 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jh4sv" event={"ID":"87ff3a2a-3409-4acc-8192-f4db952ccdcf","Type":"ContainerStarted","Data":"0c16da39145b38f3c57c09bdf2b78040a2e6091ee14e81796164e325be404bbc"} Apr 16 14:57:17.078977 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:17.078954 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8pv4t" event={"ID":"4bcbf238-7f5c-47ac-b42a-0e299ed29df0","Type":"ContainerStarted","Data":"b3e60f1cc03e75fdb981f9faf8a734a881d87d70f47a5e9494c947600315ca3d"} Apr 16 14:57:17.078977 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:17.078980 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8pv4t" event={"ID":"4bcbf238-7f5c-47ac-b42a-0e299ed29df0","Type":"ContainerStarted","Data":"3233c8bfcb27822331f780a7eab69b4acce043afe8a542830209846af9e33be0"} Apr 16 14:57:17.079120 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:17.079093 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-8pv4t" Apr 16 14:57:17.093449 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:17.093411 2582 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-ingress-canary/ingress-canary-jh4sv" podStartSLOduration=252.196567687 podStartE2EDuration="4m14.09339821s" podCreationTimestamp="2026-04-16 14:53:03 +0000 UTC" firstStartedPulling="2026-04-16 14:57:14.201233807 +0000 UTC m=+283.697561014" lastFinishedPulling="2026-04-16 14:57:16.098064326 +0000 UTC m=+285.594391537" observedRunningTime="2026-04-16 14:57:17.092289906 +0000 UTC m=+286.588617133" watchObservedRunningTime="2026-04-16 14:57:17.09339821 +0000 UTC m=+286.589725441" Apr 16 14:57:17.109315 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:17.109278 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-8pv4t" podStartSLOduration=252.232145133 podStartE2EDuration="4m14.109266978s" podCreationTimestamp="2026-04-16 14:53:03 +0000 UTC" firstStartedPulling="2026-04-16 14:57:14.21695464 +0000 UTC m=+283.713281847" lastFinishedPulling="2026-04-16 14:57:16.094076479 +0000 UTC m=+285.590403692" observedRunningTime="2026-04-16 14:57:17.107954106 +0000 UTC m=+286.604281350" watchObservedRunningTime="2026-04-16 14:57:17.109266978 +0000 UTC m=+286.605594206" Apr 16 14:57:20.924070 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:20.924037 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-67d6d8c7c-q7n7s"] Apr 16 14:57:20.924505 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:20.924388 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9a1ac58-5e77-4d51-aa92-a72d754ecbc0" containerName="console" Apr 16 14:57:20.924505 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:20.924402 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9a1ac58-5e77-4d51-aa92-a72d754ecbc0" containerName="console" Apr 16 14:57:20.924505 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:20.924423 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f72564b-5bc1-45cd-a160-ee6db0d5d01e" containerName="console" Apr 16 14:57:20.924505 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:20.924433 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f72564b-5bc1-45cd-a160-ee6db0d5d01e" containerName="console" Apr 16 14:57:20.924505 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:20.924484 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="a9a1ac58-5e77-4d51-aa92-a72d754ecbc0" containerName="console" Apr 16 14:57:20.924505 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:20.924498 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="5f72564b-5bc1-45cd-a160-ee6db0d5d01e" containerName="console" Apr 16 14:57:20.927545 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:20.927516 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67d6d8c7c-q7n7s" Apr 16 14:57:20.938722 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:20.938700 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67d6d8c7c-q7n7s"] Apr 16 14:57:21.055663 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:21.055632 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50e21f6d-8a71-43f5-9b50-0d046241888e-trusted-ca-bundle\") pod \"console-67d6d8c7c-q7n7s\" (UID: \"50e21f6d-8a71-43f5-9b50-0d046241888e\") " pod="openshift-console/console-67d6d8c7c-q7n7s" Apr 16 14:57:21.055784 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:21.055688 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/50e21f6d-8a71-43f5-9b50-0d046241888e-console-config\") pod \"console-67d6d8c7c-q7n7s\" (UID: \"50e21f6d-8a71-43f5-9b50-0d046241888e\") " pod="openshift-console/console-67d6d8c7c-q7n7s" Apr 16 14:57:21.055784 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:21.055756 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/50e21f6d-8a71-43f5-9b50-0d046241888e-service-ca\") pod \"console-67d6d8c7c-q7n7s\" (UID: \"50e21f6d-8a71-43f5-9b50-0d046241888e\") " pod="openshift-console/console-67d6d8c7c-q7n7s" Apr 16 14:57:21.055909 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:21.055785 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swpkn\" (UniqueName: \"kubernetes.io/projected/50e21f6d-8a71-43f5-9b50-0d046241888e-kube-api-access-swpkn\") pod \"console-67d6d8c7c-q7n7s\" (UID: \"50e21f6d-8a71-43f5-9b50-0d046241888e\") " pod="openshift-console/console-67d6d8c7c-q7n7s" Apr 16 14:57:21.055909 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:21.055808 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/50e21f6d-8a71-43f5-9b50-0d046241888e-console-serving-cert\") pod \"console-67d6d8c7c-q7n7s\" (UID: \"50e21f6d-8a71-43f5-9b50-0d046241888e\") " pod="openshift-console/console-67d6d8c7c-q7n7s" Apr 16 14:57:21.055909 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:21.055846 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/50e21f6d-8a71-43f5-9b50-0d046241888e-oauth-serving-cert\") pod \"console-67d6d8c7c-q7n7s\" (UID: \"50e21f6d-8a71-43f5-9b50-0d046241888e\") " pod="openshift-console/console-67d6d8c7c-q7n7s" Apr 16 14:57:21.056026 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:21.055904 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/50e21f6d-8a71-43f5-9b50-0d046241888e-console-oauth-config\") pod \"console-67d6d8c7c-q7n7s\" (UID: \"50e21f6d-8a71-43f5-9b50-0d046241888e\") " pod="openshift-console/console-67d6d8c7c-q7n7s" Apr 16 14:57:21.157132 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:21.157106 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/50e21f6d-8a71-43f5-9b50-0d046241888e-console-oauth-config\") pod \"console-67d6d8c7c-q7n7s\" (UID: \"50e21f6d-8a71-43f5-9b50-0d046241888e\") " pod="openshift-console/console-67d6d8c7c-q7n7s" Apr 16 14:57:21.157263 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:21.157138 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50e21f6d-8a71-43f5-9b50-0d046241888e-trusted-ca-bundle\") pod \"console-67d6d8c7c-q7n7s\" (UID: \"50e21f6d-8a71-43f5-9b50-0d046241888e\") " pod="openshift-console/console-67d6d8c7c-q7n7s" Apr 16 14:57:21.157263 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:21.157182 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/50e21f6d-8a71-43f5-9b50-0d046241888e-console-config\") pod \"console-67d6d8c7c-q7n7s\" (UID: \"50e21f6d-8a71-43f5-9b50-0d046241888e\") " pod="openshift-console/console-67d6d8c7c-q7n7s" Apr 16 14:57:21.157263 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:21.157220 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/50e21f6d-8a71-43f5-9b50-0d046241888e-service-ca\") pod \"console-67d6d8c7c-q7n7s\" (UID: \"50e21f6d-8a71-43f5-9b50-0d046241888e\") " pod="openshift-console/console-67d6d8c7c-q7n7s" Apr 16 14:57:21.157263 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:21.157245 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swpkn\" (UniqueName: \"kubernetes.io/projected/50e21f6d-8a71-43f5-9b50-0d046241888e-kube-api-access-swpkn\") pod \"console-67d6d8c7c-q7n7s\" (UID: \"50e21f6d-8a71-43f5-9b50-0d046241888e\") " pod="openshift-console/console-67d6d8c7c-q7n7s" Apr 16 14:57:21.157446 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:21.157294 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/50e21f6d-8a71-43f5-9b50-0d046241888e-console-serving-cert\") pod \"console-67d6d8c7c-q7n7s\" (UID: \"50e21f6d-8a71-43f5-9b50-0d046241888e\") " pod="openshift-console/console-67d6d8c7c-q7n7s" Apr 16 14:57:21.158362 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:21.157703 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/50e21f6d-8a71-43f5-9b50-0d046241888e-oauth-serving-cert\") pod \"console-67d6d8c7c-q7n7s\" (UID: \"50e21f6d-8a71-43f5-9b50-0d046241888e\") " pod="openshift-console/console-67d6d8c7c-q7n7s" Apr 16 14:57:21.158362 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:21.158074 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/50e21f6d-8a71-43f5-9b50-0d046241888e-console-config\") pod \"console-67d6d8c7c-q7n7s\" (UID: \"50e21f6d-8a71-43f5-9b50-0d046241888e\") " pod="openshift-console/console-67d6d8c7c-q7n7s" Apr 16 14:57:21.158362 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:21.158220 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50e21f6d-8a71-43f5-9b50-0d046241888e-trusted-ca-bundle\") pod \"console-67d6d8c7c-q7n7s\" (UID: \"50e21f6d-8a71-43f5-9b50-0d046241888e\") " pod="openshift-console/console-67d6d8c7c-q7n7s" Apr 16 14:57:21.158362 ip-10-0-139-101 kubenswrapper[2582]: I0416 
14:57:21.158304 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/50e21f6d-8a71-43f5-9b50-0d046241888e-service-ca\") pod \"console-67d6d8c7c-q7n7s\" (UID: \"50e21f6d-8a71-43f5-9b50-0d046241888e\") " pod="openshift-console/console-67d6d8c7c-q7n7s" Apr 16 14:57:21.158627 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:21.158545 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/50e21f6d-8a71-43f5-9b50-0d046241888e-oauth-serving-cert\") pod \"console-67d6d8c7c-q7n7s\" (UID: \"50e21f6d-8a71-43f5-9b50-0d046241888e\") " pod="openshift-console/console-67d6d8c7c-q7n7s" Apr 16 14:57:21.159933 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:21.159897 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/50e21f6d-8a71-43f5-9b50-0d046241888e-console-oauth-config\") pod \"console-67d6d8c7c-q7n7s\" (UID: \"50e21f6d-8a71-43f5-9b50-0d046241888e\") " pod="openshift-console/console-67d6d8c7c-q7n7s" Apr 16 14:57:21.160120 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:21.160100 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/50e21f6d-8a71-43f5-9b50-0d046241888e-console-serving-cert\") pod \"console-67d6d8c7c-q7n7s\" (UID: \"50e21f6d-8a71-43f5-9b50-0d046241888e\") " pod="openshift-console/console-67d6d8c7c-q7n7s" Apr 16 14:57:21.164612 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:21.164593 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swpkn\" (UniqueName: \"kubernetes.io/projected/50e21f6d-8a71-43f5-9b50-0d046241888e-kube-api-access-swpkn\") pod \"console-67d6d8c7c-q7n7s\" (UID: \"50e21f6d-8a71-43f5-9b50-0d046241888e\") " pod="openshift-console/console-67d6d8c7c-q7n7s" Apr 16 14:57:21.236767 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:21.236711 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67d6d8c7c-q7n7s" Apr 16 14:57:21.355975 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:21.355945 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67d6d8c7c-q7n7s"] Apr 16 14:57:21.359778 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:57:21.359746 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50e21f6d_8a71_43f5_9b50_0d046241888e.slice/crio-2a539926b5fabd2461c9ea954e7dd994d831a7f843842ce9260927402d69f429 WatchSource:0}: Error finding container 2a539926b5fabd2461c9ea954e7dd994d831a7f843842ce9260927402d69f429: Status 404 returned error can't find the container with id 2a539926b5fabd2461c9ea954e7dd994d831a7f843842ce9260927402d69f429 Apr 16 14:57:22.094931 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:22.094893 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67d6d8c7c-q7n7s" event={"ID":"50e21f6d-8a71-43f5-9b50-0d046241888e","Type":"ContainerStarted","Data":"186910066e9c4c3f6a7c26bc99b12130685b7ea9ad8e0ff6c1d05df8746749fe"} Apr 16 14:57:22.094931 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:22.094929 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67d6d8c7c-q7n7s" event={"ID":"50e21f6d-8a71-43f5-9b50-0d046241888e","Type":"ContainerStarted","Data":"2a539926b5fabd2461c9ea954e7dd994d831a7f843842ce9260927402d69f429"} Apr 16 14:57:22.112458 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:22.112412 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-67d6d8c7c-q7n7s" podStartSLOduration=2.112398148 podStartE2EDuration="2.112398148s" podCreationTimestamp="2026-04-16 14:57:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:57:22.110331073 +0000 UTC m=+291.606658326" watchObservedRunningTime="2026-04-16 14:57:22.112398148 +0000 UTC m=+291.608725377" Apr 16 14:57:27.084706 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:27.084676 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-8pv4t" Apr 16 14:57:27.819554 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:27.819525 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-67d6d8c7c-q7n7s"] Apr 16 14:57:27.883415 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:27.883383 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6dd6457dd7-xtwqr"] Apr 16 14:57:27.886676 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:27.886662 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6dd6457dd7-xtwqr" Apr 16 14:57:27.898983 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:27.898961 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6dd6457dd7-xtwqr"] Apr 16 14:57:28.010019 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:28.009990 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/38fdff81-688b-4851-b9ff-98c8df36e3f6-console-serving-cert\") pod \"console-6dd6457dd7-xtwqr\" (UID: \"38fdff81-688b-4851-b9ff-98c8df36e3f6\") " pod="openshift-console/console-6dd6457dd7-xtwqr" Apr 16 14:57:28.010019 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:28.010021 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/38fdff81-688b-4851-b9ff-98c8df36e3f6-oauth-serving-cert\") pod \"console-6dd6457dd7-xtwqr\" (UID: \"38fdff81-688b-4851-b9ff-98c8df36e3f6\") " pod="openshift-console/console-6dd6457dd7-xtwqr" Apr 16 14:57:28.010181 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:28.010041 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/38fdff81-688b-4851-b9ff-98c8df36e3f6-console-config\") pod \"console-6dd6457dd7-xtwqr\" (UID: \"38fdff81-688b-4851-b9ff-98c8df36e3f6\") " pod="openshift-console/console-6dd6457dd7-xtwqr" Apr 16 14:57:28.010181 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:28.010065 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38fdff81-688b-4851-b9ff-98c8df36e3f6-trusted-ca-bundle\") pod \"console-6dd6457dd7-xtwqr\" (UID: \"38fdff81-688b-4851-b9ff-98c8df36e3f6\") " pod="openshift-console/console-6dd6457dd7-xtwqr" Apr 16 14:57:28.010181 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:28.010138 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/38fdff81-688b-4851-b9ff-98c8df36e3f6-service-ca\") pod \"console-6dd6457dd7-xtwqr\" (UID: \"38fdff81-688b-4851-b9ff-98c8df36e3f6\") " pod="openshift-console/console-6dd6457dd7-xtwqr" Apr 16 14:57:28.010181 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:28.010162 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvf2l\" (UniqueName: \"kubernetes.io/projected/38fdff81-688b-4851-b9ff-98c8df36e3f6-kube-api-access-nvf2l\") pod \"console-6dd6457dd7-xtwqr\" (UID: \"38fdff81-688b-4851-b9ff-98c8df36e3f6\") " pod="openshift-console/console-6dd6457dd7-xtwqr" Apr 16 14:57:28.010304 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:28.010244 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/38fdff81-688b-4851-b9ff-98c8df36e3f6-console-oauth-config\") pod \"console-6dd6457dd7-xtwqr\" (UID: \"38fdff81-688b-4851-b9ff-98c8df36e3f6\") " pod="openshift-console/console-6dd6457dd7-xtwqr" Apr 16 14:57:28.110656 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:28.110592 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/38fdff81-688b-4851-b9ff-98c8df36e3f6-console-oauth-config\") pod \"console-6dd6457dd7-xtwqr\" (UID: \"38fdff81-688b-4851-b9ff-98c8df36e3f6\") " pod="openshift-console/console-6dd6457dd7-xtwqr" Apr 16 14:57:28.110656 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:28.110637 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/38fdff81-688b-4851-b9ff-98c8df36e3f6-console-serving-cert\") pod \"console-6dd6457dd7-xtwqr\" (UID: \"38fdff81-688b-4851-b9ff-98c8df36e3f6\") " pod="openshift-console/console-6dd6457dd7-xtwqr" Apr 16 14:57:28.110656 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:28.110653 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/38fdff81-688b-4851-b9ff-98c8df36e3f6-oauth-serving-cert\") pod \"console-6dd6457dd7-xtwqr\" (UID: \"38fdff81-688b-4851-b9ff-98c8df36e3f6\") " pod="openshift-console/console-6dd6457dd7-xtwqr" Apr 16 14:57:28.111134 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:28.110671 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/38fdff81-688b-4851-b9ff-98c8df36e3f6-console-config\") pod \"console-6dd6457dd7-xtwqr\" (UID: \"38fdff81-688b-4851-b9ff-98c8df36e3f6\") " pod="openshift-console/console-6dd6457dd7-xtwqr" Apr 16 14:57:28.111134 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:28.110932 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38fdff81-688b-4851-b9ff-98c8df36e3f6-trusted-ca-bundle\") pod \"console-6dd6457dd7-xtwqr\" (UID: \"38fdff81-688b-4851-b9ff-98c8df36e3f6\") " pod="openshift-console/console-6dd6457dd7-xtwqr" Apr 16 14:57:28.111134 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:28.110955 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/38fdff81-688b-4851-b9ff-98c8df36e3f6-service-ca\") pod \"console-6dd6457dd7-xtwqr\" (UID: \"38fdff81-688b-4851-b9ff-98c8df36e3f6\") " pod="openshift-console/console-6dd6457dd7-xtwqr" Apr 16 14:57:28.111134 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:28.110982 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvf2l\" (UniqueName: \"kubernetes.io/projected/38fdff81-688b-4851-b9ff-98c8df36e3f6-kube-api-access-nvf2l\") pod \"console-6dd6457dd7-xtwqr\" (UID: \"38fdff81-688b-4851-b9ff-98c8df36e3f6\") " pod="openshift-console/console-6dd6457dd7-xtwqr" Apr 16 14:57:28.112865 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:28.112839 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/38fdff81-688b-4851-b9ff-98c8df36e3f6-oauth-serving-cert\") pod \"console-6dd6457dd7-xtwqr\" (UID: \"38fdff81-688b-4851-b9ff-98c8df36e3f6\") " pod="openshift-console/console-6dd6457dd7-xtwqr" Apr 16 14:57:28.112987 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:28.112940 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/38fdff81-688b-4851-b9ff-98c8df36e3f6-console-config\") pod \"console-6dd6457dd7-xtwqr\" (UID: \"38fdff81-688b-4851-b9ff-98c8df36e3f6\") " pod="openshift-console/console-6dd6457dd7-xtwqr" Apr 16 14:57:28.113087 ip-10-0-139-101 
kubenswrapper[2582]: I0416 14:57:28.113060 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/38fdff81-688b-4851-b9ff-98c8df36e3f6-service-ca\") pod \"console-6dd6457dd7-xtwqr\" (UID: \"38fdff81-688b-4851-b9ff-98c8df36e3f6\") " pod="openshift-console/console-6dd6457dd7-xtwqr" Apr 16 14:57:28.113331 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:28.113315 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38fdff81-688b-4851-b9ff-98c8df36e3f6-trusted-ca-bundle\") pod \"console-6dd6457dd7-xtwqr\" (UID: \"38fdff81-688b-4851-b9ff-98c8df36e3f6\") " pod="openshift-console/console-6dd6457dd7-xtwqr" Apr 16 14:57:28.114748 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:28.114727 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/38fdff81-688b-4851-b9ff-98c8df36e3f6-console-serving-cert\") pod \"console-6dd6457dd7-xtwqr\" (UID: \"38fdff81-688b-4851-b9ff-98c8df36e3f6\") " pod="openshift-console/console-6dd6457dd7-xtwqr" Apr 16 14:57:28.115274 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:28.115253 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/38fdff81-688b-4851-b9ff-98c8df36e3f6-console-oauth-config\") pod \"console-6dd6457dd7-xtwqr\" (UID: \"38fdff81-688b-4851-b9ff-98c8df36e3f6\") " pod="openshift-console/console-6dd6457dd7-xtwqr" Apr 16 14:57:28.118808 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:28.118781 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvf2l\" (UniqueName: \"kubernetes.io/projected/38fdff81-688b-4851-b9ff-98c8df36e3f6-kube-api-access-nvf2l\") pod \"console-6dd6457dd7-xtwqr\" (UID: \"38fdff81-688b-4851-b9ff-98c8df36e3f6\") " pod="openshift-console/console-6dd6457dd7-xtwqr" Apr 16 14:57:28.195594 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:28.195563 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6dd6457dd7-xtwqr" Apr 16 14:57:28.312734 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:28.312712 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6dd6457dd7-xtwqr"] Apr 16 14:57:28.315081 ip-10-0-139-101 kubenswrapper[2582]: W0416 14:57:28.315049 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38fdff81_688b_4851_b9ff_98c8df36e3f6.slice/crio-e5395a18242110362b6d6aeb1d1c07f28053a12959dcbfd58f7454057ecc75d3 WatchSource:0}: Error finding container e5395a18242110362b6d6aeb1d1c07f28053a12959dcbfd58f7454057ecc75d3: Status 404 returned error can't find the container with id e5395a18242110362b6d6aeb1d1c07f28053a12959dcbfd58f7454057ecc75d3 Apr 16 14:57:29.116325 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:29.116277 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dd6457dd7-xtwqr" event={"ID":"38fdff81-688b-4851-b9ff-98c8df36e3f6","Type":"ContainerStarted","Data":"c4e01fe261452e8c32c7d9fc390c712861819b960109bd4a16ccc2a2b0290d07"} Apr 16 14:57:29.116786 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:29.116329 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dd6457dd7-xtwqr" event={"ID":"38fdff81-688b-4851-b9ff-98c8df36e3f6","Type":"ContainerStarted","Data":"e5395a18242110362b6d6aeb1d1c07f28053a12959dcbfd58f7454057ecc75d3"} Apr 16 14:57:29.134210 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:29.134164 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6dd6457dd7-xtwqr" podStartSLOduration=2.134151404 podStartE2EDuration="2.134151404s" podCreationTimestamp="2026-04-16 14:57:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:57:29.13198016 +0000 UTC m=+298.628307396" watchObservedRunningTime="2026-04-16 14:57:29.134151404 +0000 UTC m=+298.630478632" Apr 16 14:57:31.055776 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:31.055750 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log" Apr 16 14:57:31.056292 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:31.055960 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log" Apr 16 14:57:31.236888 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:31.236865 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-67d6d8c7c-q7n7s" Apr 16 14:57:38.196424 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:38.196389 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6dd6457dd7-xtwqr" Apr 16 14:57:38.196812 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:38.196450 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6dd6457dd7-xtwqr" Apr 16 14:57:38.201264 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:38.201241 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6dd6457dd7-xtwqr" Apr 16 14:57:39.149961 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:39.149925 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-console/console-6dd6457dd7-xtwqr" Apr 16 14:57:39.195689 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:39.195651 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5668486f94-lktk2"] Apr 16 14:57:52.838266 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:52.838210 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-67d6d8c7c-q7n7s" podUID="50e21f6d-8a71-43f5-9b50-0d046241888e" containerName="console" containerID="cri-o://186910066e9c4c3f6a7c26bc99b12130685b7ea9ad8e0ff6c1d05df8746749fe" gracePeriod=15 Apr 16 14:57:53.080395 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.080371 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67d6d8c7c-q7n7s_50e21f6d-8a71-43f5-9b50-0d046241888e/console/0.log" Apr 16 14:57:53.080504 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.080441 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67d6d8c7c-q7n7s" Apr 16 14:57:53.187713 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.187690 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67d6d8c7c-q7n7s_50e21f6d-8a71-43f5-9b50-0d046241888e/console/0.log" Apr 16 14:57:53.187877 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.187728 2582 generic.go:358] "Generic (PLEG): container finished" podID="50e21f6d-8a71-43f5-9b50-0d046241888e" containerID="186910066e9c4c3f6a7c26bc99b12130685b7ea9ad8e0ff6c1d05df8746749fe" exitCode=2 Apr 16 14:57:53.187877 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.187764 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67d6d8c7c-q7n7s" event={"ID":"50e21f6d-8a71-43f5-9b50-0d046241888e","Type":"ContainerDied","Data":"186910066e9c4c3f6a7c26bc99b12130685b7ea9ad8e0ff6c1d05df8746749fe"} Apr 16 14:57:53.187877 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.187801 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67d6d8c7c-q7n7s" event={"ID":"50e21f6d-8a71-43f5-9b50-0d046241888e","Type":"ContainerDied","Data":"2a539926b5fabd2461c9ea954e7dd994d831a7f843842ce9260927402d69f429"} Apr 16 14:57:53.187877 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.187801 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67d6d8c7c-q7n7s" Apr 16 14:57:53.187877 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.187813 2582 scope.go:117] "RemoveContainer" containerID="186910066e9c4c3f6a7c26bc99b12130685b7ea9ad8e0ff6c1d05df8746749fe" Apr 16 14:57:53.195858 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.195837 2582 scope.go:117] "RemoveContainer" containerID="186910066e9c4c3f6a7c26bc99b12130685b7ea9ad8e0ff6c1d05df8746749fe" Apr 16 14:57:53.196112 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:57:53.196096 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"186910066e9c4c3f6a7c26bc99b12130685b7ea9ad8e0ff6c1d05df8746749fe\": container with ID starting with 186910066e9c4c3f6a7c26bc99b12130685b7ea9ad8e0ff6c1d05df8746749fe not found: ID does not exist" containerID="186910066e9c4c3f6a7c26bc99b12130685b7ea9ad8e0ff6c1d05df8746749fe" Apr 16 14:57:53.196162 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.196123 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"186910066e9c4c3f6a7c26bc99b12130685b7ea9ad8e0ff6c1d05df8746749fe"} err="failed to get container status \"186910066e9c4c3f6a7c26bc99b12130685b7ea9ad8e0ff6c1d05df8746749fe\": rpc error: code = NotFound desc = could not find container \"186910066e9c4c3f6a7c26bc99b12130685b7ea9ad8e0ff6c1d05df8746749fe\": container with ID starting with 186910066e9c4c3f6a7c26bc99b12130685b7ea9ad8e0ff6c1d05df8746749fe not found: ID does not exist" Apr 16 14:57:53.202381 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.202362 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50e21f6d-8a71-43f5-9b50-0d046241888e-trusted-ca-bundle\") pod \"50e21f6d-8a71-43f5-9b50-0d046241888e\" (UID: \"50e21f6d-8a71-43f5-9b50-0d046241888e\") " Apr 16 14:57:53.202448 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.202405 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/50e21f6d-8a71-43f5-9b50-0d046241888e-oauth-serving-cert\") pod \"50e21f6d-8a71-43f5-9b50-0d046241888e\" (UID: \"50e21f6d-8a71-43f5-9b50-0d046241888e\") " Apr 16 14:57:53.202448 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.202421 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/50e21f6d-8a71-43f5-9b50-0d046241888e-console-serving-cert\") pod \"50e21f6d-8a71-43f5-9b50-0d046241888e\" (UID: \"50e21f6d-8a71-43f5-9b50-0d046241888e\") " Apr 16 14:57:53.202448 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.202435 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/50e21f6d-8a71-43f5-9b50-0d046241888e-console-oauth-config\") pod \"50e21f6d-8a71-43f5-9b50-0d046241888e\" (UID: \"50e21f6d-8a71-43f5-9b50-0d046241888e\") " Apr 16 14:57:53.202571 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.202468 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swpkn\" (UniqueName: \"kubernetes.io/projected/50e21f6d-8a71-43f5-9b50-0d046241888e-kube-api-access-swpkn\") pod \"50e21f6d-8a71-43f5-9b50-0d046241888e\" (UID: \"50e21f6d-8a71-43f5-9b50-0d046241888e\") " Apr 16 14:57:53.202571 ip-10-0-139-101 kubenswrapper[2582]: I0416 
14:57:53.202507 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/50e21f6d-8a71-43f5-9b50-0d046241888e-service-ca\") pod \"50e21f6d-8a71-43f5-9b50-0d046241888e\" (UID: \"50e21f6d-8a71-43f5-9b50-0d046241888e\") " Apr 16 14:57:53.202571 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.202553 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/50e21f6d-8a71-43f5-9b50-0d046241888e-console-config\") pod \"50e21f6d-8a71-43f5-9b50-0d046241888e\" (UID: \"50e21f6d-8a71-43f5-9b50-0d046241888e\") " Apr 16 14:57:53.202946 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.202898 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50e21f6d-8a71-43f5-9b50-0d046241888e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "50e21f6d-8a71-43f5-9b50-0d046241888e" (UID: "50e21f6d-8a71-43f5-9b50-0d046241888e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:57:53.202946 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.202905 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50e21f6d-8a71-43f5-9b50-0d046241888e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "50e21f6d-8a71-43f5-9b50-0d046241888e" (UID: "50e21f6d-8a71-43f5-9b50-0d046241888e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:57:53.203092 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.203039 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50e21f6d-8a71-43f5-9b50-0d046241888e-service-ca" (OuterVolumeSpecName: "service-ca") pod "50e21f6d-8a71-43f5-9b50-0d046241888e" (UID: "50e21f6d-8a71-43f5-9b50-0d046241888e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:57:53.203092 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.203070 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50e21f6d-8a71-43f5-9b50-0d046241888e-console-config" (OuterVolumeSpecName: "console-config") pod "50e21f6d-8a71-43f5-9b50-0d046241888e" (UID: "50e21f6d-8a71-43f5-9b50-0d046241888e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:57:53.204643 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.204619 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50e21f6d-8a71-43f5-9b50-0d046241888e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "50e21f6d-8a71-43f5-9b50-0d046241888e" (UID: "50e21f6d-8a71-43f5-9b50-0d046241888e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:57:53.204730 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.204709 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50e21f6d-8a71-43f5-9b50-0d046241888e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "50e21f6d-8a71-43f5-9b50-0d046241888e" (UID: "50e21f6d-8a71-43f5-9b50-0d046241888e"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:57:53.204857 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.204804 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50e21f6d-8a71-43f5-9b50-0d046241888e-kube-api-access-swpkn" (OuterVolumeSpecName: "kube-api-access-swpkn") pod "50e21f6d-8a71-43f5-9b50-0d046241888e" (UID: "50e21f6d-8a71-43f5-9b50-0d046241888e"). InnerVolumeSpecName "kube-api-access-swpkn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:57:53.303523 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.303497 2582 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50e21f6d-8a71-43f5-9b50-0d046241888e-trusted-ca-bundle\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 14:57:53.303523 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.303519 2582 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/50e21f6d-8a71-43f5-9b50-0d046241888e-oauth-serving-cert\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 14:57:53.303648 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.303529 2582 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/50e21f6d-8a71-43f5-9b50-0d046241888e-console-serving-cert\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 14:57:53.303648 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.303539 2582 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/50e21f6d-8a71-43f5-9b50-0d046241888e-console-oauth-config\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 14:57:53.303648 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.303548 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-swpkn\" (UniqueName: \"kubernetes.io/projected/50e21f6d-8a71-43f5-9b50-0d046241888e-kube-api-access-swpkn\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 14:57:53.303648 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.303558 2582 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/50e21f6d-8a71-43f5-9b50-0d046241888e-service-ca\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 14:57:53.303648 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.303567 2582 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/50e21f6d-8a71-43f5-9b50-0d046241888e-console-config\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 14:57:53.509181 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.509146 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-67d6d8c7c-q7n7s"] Apr 16 14:57:53.514152 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:53.514132 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-67d6d8c7c-q7n7s"] Apr 16 14:57:55.165714 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:57:55.165676 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50e21f6d-8a71-43f5-9b50-0d046241888e" path="/var/lib/kubelet/pods/50e21f6d-8a71-43f5-9b50-0d046241888e/volumes" Apr 16 14:58:04.217752 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:04.217692 2582 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-console/console-5668486f94-lktk2" podUID="acbdb581-cce6-4f09-a211-90e985f568f8" containerName="console" containerID="cri-o://ae58a71c37a52b4405f452f4c8bd8cd186006d3e38d8afef06a2e19e7dede797" gracePeriod=15 Apr 16 14:58:04.453072 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:04.453050 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5668486f94-lktk2_acbdb581-cce6-4f09-a211-90e985f568f8/console/0.log" Apr 16 14:58:04.453189 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:04.453109 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5668486f94-lktk2" Apr 16 14:58:04.587478 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:04.587413 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/acbdb581-cce6-4f09-a211-90e985f568f8-console-serving-cert\") pod \"acbdb581-cce6-4f09-a211-90e985f568f8\" (UID: \"acbdb581-cce6-4f09-a211-90e985f568f8\") " Apr 16 14:58:04.587478 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:04.587444 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/acbdb581-cce6-4f09-a211-90e985f568f8-service-ca\") pod \"acbdb581-cce6-4f09-a211-90e985f568f8\" (UID: \"acbdb581-cce6-4f09-a211-90e985f568f8\") " Apr 16 14:58:04.587478 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:04.587463 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/acbdb581-cce6-4f09-a211-90e985f568f8-console-oauth-config\") pod \"acbdb581-cce6-4f09-a211-90e985f568f8\" (UID: \"acbdb581-cce6-4f09-a211-90e985f568f8\") " Apr 16 14:58:04.587732 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:04.587501 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/acbdb581-cce6-4f09-a211-90e985f568f8-console-config\") pod \"acbdb581-cce6-4f09-a211-90e985f568f8\" (UID: \"acbdb581-cce6-4f09-a211-90e985f568f8\") " Apr 16 14:58:04.587732 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:04.587530 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acbdb581-cce6-4f09-a211-90e985f568f8-trusted-ca-bundle\") pod \"acbdb581-cce6-4f09-a211-90e985f568f8\" (UID: \"acbdb581-cce6-4f09-a211-90e985f568f8\") " Apr 16 14:58:04.587732 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:04.587564 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvxn4\" (UniqueName: \"kubernetes.io/projected/acbdb581-cce6-4f09-a211-90e985f568f8-kube-api-access-fvxn4\") pod \"acbdb581-cce6-4f09-a211-90e985f568f8\" (UID: \"acbdb581-cce6-4f09-a211-90e985f568f8\") " Apr 16 14:58:04.587732 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:04.587614 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/acbdb581-cce6-4f09-a211-90e985f568f8-oauth-serving-cert\") pod \"acbdb581-cce6-4f09-a211-90e985f568f8\" (UID: \"acbdb581-cce6-4f09-a211-90e985f568f8\") " Apr 16 14:58:04.587964 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:04.587920 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/acbdb581-cce6-4f09-a211-90e985f568f8-service-ca" (OuterVolumeSpecName: "service-ca") pod "acbdb581-cce6-4f09-a211-90e985f568f8" (UID: "acbdb581-cce6-4f09-a211-90e985f568f8"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:58:04.588096 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:04.587980 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acbdb581-cce6-4f09-a211-90e985f568f8-console-config" (OuterVolumeSpecName: "console-config") pod "acbdb581-cce6-4f09-a211-90e985f568f8" (UID: "acbdb581-cce6-4f09-a211-90e985f568f8"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:58:04.588155 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:04.588073 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acbdb581-cce6-4f09-a211-90e985f568f8-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "acbdb581-cce6-4f09-a211-90e985f568f8" (UID: "acbdb581-cce6-4f09-a211-90e985f568f8"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:58:04.588373 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:04.588350 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acbdb581-cce6-4f09-a211-90e985f568f8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "acbdb581-cce6-4f09-a211-90e985f568f8" (UID: "acbdb581-cce6-4f09-a211-90e985f568f8"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:58:04.589720 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:04.589701 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acbdb581-cce6-4f09-a211-90e985f568f8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "acbdb581-cce6-4f09-a211-90e985f568f8" (UID: "acbdb581-cce6-4f09-a211-90e985f568f8"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:58:04.590322 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:04.590302 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acbdb581-cce6-4f09-a211-90e985f568f8-kube-api-access-fvxn4" (OuterVolumeSpecName: "kube-api-access-fvxn4") pod "acbdb581-cce6-4f09-a211-90e985f568f8" (UID: "acbdb581-cce6-4f09-a211-90e985f568f8"). InnerVolumeSpecName "kube-api-access-fvxn4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:58:04.590322 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:04.590305 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acbdb581-cce6-4f09-a211-90e985f568f8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "acbdb581-cce6-4f09-a211-90e985f568f8" (UID: "acbdb581-cce6-4f09-a211-90e985f568f8"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:58:04.688593 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:04.688568 2582 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/acbdb581-cce6-4f09-a211-90e985f568f8-console-serving-cert\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 14:58:04.688593 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:04.688591 2582 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/acbdb581-cce6-4f09-a211-90e985f568f8-service-ca\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 14:58:04.688732 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:04.688601 2582 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/acbdb581-cce6-4f09-a211-90e985f568f8-console-oauth-config\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 14:58:04.688732 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:04.688610 2582 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/acbdb581-cce6-4f09-a211-90e985f568f8-console-config\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 14:58:04.688732 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:04.688619 2582 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acbdb581-cce6-4f09-a211-90e985f568f8-trusted-ca-bundle\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 14:58:04.688732 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:04.688628 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fvxn4\" (UniqueName: \"kubernetes.io/projected/acbdb581-cce6-4f09-a211-90e985f568f8-kube-api-access-fvxn4\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 14:58:04.688732 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:04.688636 2582 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/acbdb581-cce6-4f09-a211-90e985f568f8-oauth-serving-cert\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 14:58:05.221968 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:05.221946 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5668486f94-lktk2_acbdb581-cce6-4f09-a211-90e985f568f8/console/0.log" Apr 16 14:58:05.222316 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:05.221988 2582 generic.go:358] "Generic (PLEG): container finished" podID="acbdb581-cce6-4f09-a211-90e985f568f8" containerID="ae58a71c37a52b4405f452f4c8bd8cd186006d3e38d8afef06a2e19e7dede797" exitCode=2 Apr 16 14:58:05.222316 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:05.222061 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5668486f94-lktk2" Apr 16 14:58:05.222316 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:05.222080 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5668486f94-lktk2" event={"ID":"acbdb581-cce6-4f09-a211-90e985f568f8","Type":"ContainerDied","Data":"ae58a71c37a52b4405f452f4c8bd8cd186006d3e38d8afef06a2e19e7dede797"} Apr 16 14:58:05.222316 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:05.222121 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5668486f94-lktk2" event={"ID":"acbdb581-cce6-4f09-a211-90e985f568f8","Type":"ContainerDied","Data":"9f4239f1ea319ff3557399c97e7ffa6187180c92c838e3a571ad1f7701618050"} Apr 16 14:58:05.222316 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:05.222137 2582 scope.go:117] "RemoveContainer" containerID="ae58a71c37a52b4405f452f4c8bd8cd186006d3e38d8afef06a2e19e7dede797" Apr 16 14:58:05.229748 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:05.229731 2582 scope.go:117] "RemoveContainer" containerID="ae58a71c37a52b4405f452f4c8bd8cd186006d3e38d8afef06a2e19e7dede797" Apr 16 14:58:05.229993 ip-10-0-139-101 kubenswrapper[2582]: E0416 14:58:05.229976 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae58a71c37a52b4405f452f4c8bd8cd186006d3e38d8afef06a2e19e7dede797\": container with ID starting with ae58a71c37a52b4405f452f4c8bd8cd186006d3e38d8afef06a2e19e7dede797 not found: ID does not exist" containerID="ae58a71c37a52b4405f452f4c8bd8cd186006d3e38d8afef06a2e19e7dede797" Apr 16 14:58:05.230040 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:05.230000 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae58a71c37a52b4405f452f4c8bd8cd186006d3e38d8afef06a2e19e7dede797"} err="failed to get container status \"ae58a71c37a52b4405f452f4c8bd8cd186006d3e38d8afef06a2e19e7dede797\": rpc error: code = NotFound desc = could not find container \"ae58a71c37a52b4405f452f4c8bd8cd186006d3e38d8afef06a2e19e7dede797\": container with ID starting with ae58a71c37a52b4405f452f4c8bd8cd186006d3e38d8afef06a2e19e7dede797 not found: ID does not exist" Apr 16 14:58:05.239414 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:05.239384 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5668486f94-lktk2"] Apr 16 14:58:05.242677 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:05.242658 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5668486f94-lktk2"] Apr 16 14:58:07.162011 ip-10-0-139-101 kubenswrapper[2582]: I0416 14:58:07.161982 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acbdb581-cce6-4f09-a211-90e985f568f8" path="/var/lib/kubelet/pods/acbdb581-cce6-4f09-a211-90e985f568f8/volumes" Apr 16 15:02:22.922323 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:22.922286 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-66dd85d545-vp9vx"] Apr 16 15:02:22.922774 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:22.922603 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="acbdb581-cce6-4f09-a211-90e985f568f8" containerName="console" Apr 16 15:02:22.922774 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:22.922615 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="acbdb581-cce6-4f09-a211-90e985f568f8" containerName="console" Apr 16 15:02:22.922774 ip-10-0-139-101 
kubenswrapper[2582]: I0416 15:02:22.922634 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="50e21f6d-8a71-43f5-9b50-0d046241888e" containerName="console" Apr 16 15:02:22.922774 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:22.922640 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="50e21f6d-8a71-43f5-9b50-0d046241888e" containerName="console" Apr 16 15:02:22.922774 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:22.922686 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="50e21f6d-8a71-43f5-9b50-0d046241888e" containerName="console" Apr 16 15:02:22.922774 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:22.922695 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="acbdb581-cce6-4f09-a211-90e985f568f8" containerName="console" Apr 16 15:02:22.925400 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:22.925377 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66dd85d545-vp9vx" Apr 16 15:02:22.935019 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:22.934997 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66dd85d545-vp9vx"] Apr 16 15:02:23.039091 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:23.039059 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9681bf36-ced1-49db-b513-9136b424b1c4-trusted-ca-bundle\") pod \"console-66dd85d545-vp9vx\" (UID: \"9681bf36-ced1-49db-b513-9136b424b1c4\") " pod="openshift-console/console-66dd85d545-vp9vx" Apr 16 15:02:23.039218 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:23.039095 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9681bf36-ced1-49db-b513-9136b424b1c4-console-config\") pod \"console-66dd85d545-vp9vx\" (UID: \"9681bf36-ced1-49db-b513-9136b424b1c4\") " pod="openshift-console/console-66dd85d545-vp9vx" Apr 16 15:02:23.039218 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:23.039129 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9681bf36-ced1-49db-b513-9136b424b1c4-service-ca\") pod \"console-66dd85d545-vp9vx\" (UID: \"9681bf36-ced1-49db-b513-9136b424b1c4\") " pod="openshift-console/console-66dd85d545-vp9vx" Apr 16 15:02:23.039295 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:23.039217 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9681bf36-ced1-49db-b513-9136b424b1c4-console-serving-cert\") pod \"console-66dd85d545-vp9vx\" (UID: \"9681bf36-ced1-49db-b513-9136b424b1c4\") " pod="openshift-console/console-66dd85d545-vp9vx" Apr 16 15:02:23.039295 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:23.039244 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbmbz\" (UniqueName: \"kubernetes.io/projected/9681bf36-ced1-49db-b513-9136b424b1c4-kube-api-access-kbmbz\") pod \"console-66dd85d545-vp9vx\" (UID: \"9681bf36-ced1-49db-b513-9136b424b1c4\") " pod="openshift-console/console-66dd85d545-vp9vx" Apr 16 15:02:23.039295 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:23.039264 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9681bf36-ced1-49db-b513-9136b424b1c4-oauth-serving-cert\") pod \"console-66dd85d545-vp9vx\" (UID: \"9681bf36-ced1-49db-b513-9136b424b1c4\") " pod="openshift-console/console-66dd85d545-vp9vx" Apr 16 15:02:23.039448 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:23.039354 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9681bf36-ced1-49db-b513-9136b424b1c4-console-oauth-config\") pod \"console-66dd85d545-vp9vx\" (UID: \"9681bf36-ced1-49db-b513-9136b424b1c4\") " pod="openshift-console/console-66dd85d545-vp9vx" Apr 16 15:02:23.139756 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:23.139728 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9681bf36-ced1-49db-b513-9136b424b1c4-service-ca\") pod \"console-66dd85d545-vp9vx\" (UID: \"9681bf36-ced1-49db-b513-9136b424b1c4\") " pod="openshift-console/console-66dd85d545-vp9vx" Apr 16 15:02:23.139907 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:23.139765 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9681bf36-ced1-49db-b513-9136b424b1c4-console-serving-cert\") pod \"console-66dd85d545-vp9vx\" (UID: \"9681bf36-ced1-49db-b513-9136b424b1c4\") " pod="openshift-console/console-66dd85d545-vp9vx" Apr 16 15:02:23.139907 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:23.139781 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbmbz\" (UniqueName: \"kubernetes.io/projected/9681bf36-ced1-49db-b513-9136b424b1c4-kube-api-access-kbmbz\") pod \"console-66dd85d545-vp9vx\" (UID: \"9681bf36-ced1-49db-b513-9136b424b1c4\") " pod="openshift-console/console-66dd85d545-vp9vx" Apr 16 15:02:23.139907 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:23.139801 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9681bf36-ced1-49db-b513-9136b424b1c4-oauth-serving-cert\") pod \"console-66dd85d545-vp9vx\" (UID: \"9681bf36-ced1-49db-b513-9136b424b1c4\") " pod="openshift-console/console-66dd85d545-vp9vx" Apr 16 15:02:23.139907 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:23.139857 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9681bf36-ced1-49db-b513-9136b424b1c4-console-oauth-config\") pod \"console-66dd85d545-vp9vx\" (UID: \"9681bf36-ced1-49db-b513-9136b424b1c4\") " pod="openshift-console/console-66dd85d545-vp9vx" Apr 16 15:02:23.139907 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:23.139883 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9681bf36-ced1-49db-b513-9136b424b1c4-trusted-ca-bundle\") pod \"console-66dd85d545-vp9vx\" (UID: \"9681bf36-ced1-49db-b513-9136b424b1c4\") " pod="openshift-console/console-66dd85d545-vp9vx" Apr 16 15:02:23.139907 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:23.139903 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9681bf36-ced1-49db-b513-9136b424b1c4-console-config\") pod \"console-66dd85d545-vp9vx\" (UID: \"9681bf36-ced1-49db-b513-9136b424b1c4\") " 
pod="openshift-console/console-66dd85d545-vp9vx" Apr 16 15:02:23.140598 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:23.140513 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9681bf36-ced1-49db-b513-9136b424b1c4-service-ca\") pod \"console-66dd85d545-vp9vx\" (UID: \"9681bf36-ced1-49db-b513-9136b424b1c4\") " pod="openshift-console/console-66dd85d545-vp9vx" Apr 16 15:02:23.140598 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:23.140513 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9681bf36-ced1-49db-b513-9136b424b1c4-oauth-serving-cert\") pod \"console-66dd85d545-vp9vx\" (UID: \"9681bf36-ced1-49db-b513-9136b424b1c4\") " pod="openshift-console/console-66dd85d545-vp9vx" Apr 16 15:02:23.140598 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:23.140523 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9681bf36-ced1-49db-b513-9136b424b1c4-console-config\") pod \"console-66dd85d545-vp9vx\" (UID: \"9681bf36-ced1-49db-b513-9136b424b1c4\") " pod="openshift-console/console-66dd85d545-vp9vx" Apr 16 15:02:23.141138 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:23.141074 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9681bf36-ced1-49db-b513-9136b424b1c4-trusted-ca-bundle\") pod \"console-66dd85d545-vp9vx\" (UID: \"9681bf36-ced1-49db-b513-9136b424b1c4\") " pod="openshift-console/console-66dd85d545-vp9vx" Apr 16 15:02:23.142421 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:23.142403 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9681bf36-ced1-49db-b513-9136b424b1c4-console-serving-cert\") pod \"console-66dd85d545-vp9vx\" (UID: \"9681bf36-ced1-49db-b513-9136b424b1c4\") " pod="openshift-console/console-66dd85d545-vp9vx" Apr 16 15:02:23.142511 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:23.142496 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9681bf36-ced1-49db-b513-9136b424b1c4-console-oauth-config\") pod \"console-66dd85d545-vp9vx\" (UID: \"9681bf36-ced1-49db-b513-9136b424b1c4\") " pod="openshift-console/console-66dd85d545-vp9vx" Apr 16 15:02:23.146840 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:23.146804 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbmbz\" (UniqueName: \"kubernetes.io/projected/9681bf36-ced1-49db-b513-9136b424b1c4-kube-api-access-kbmbz\") pod \"console-66dd85d545-vp9vx\" (UID: \"9681bf36-ced1-49db-b513-9136b424b1c4\") " pod="openshift-console/console-66dd85d545-vp9vx" Apr 16 15:02:23.236169 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:23.236090 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66dd85d545-vp9vx" Apr 16 15:02:23.357163 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:23.357124 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66dd85d545-vp9vx"] Apr 16 15:02:23.360483 ip-10-0-139-101 kubenswrapper[2582]: W0416 15:02:23.360454 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9681bf36_ced1_49db_b513_9136b424b1c4.slice/crio-2cd4ead98b8144bd40614e6f30664f5431729a568110f6621de432391c787d95 WatchSource:0}: Error finding container 2cd4ead98b8144bd40614e6f30664f5431729a568110f6621de432391c787d95: Status 404 returned error can't find the container with id 2cd4ead98b8144bd40614e6f30664f5431729a568110f6621de432391c787d95 Apr 16 15:02:23.362452 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:23.362433 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:02:23.937641 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:23.937605 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66dd85d545-vp9vx" event={"ID":"9681bf36-ced1-49db-b513-9136b424b1c4","Type":"ContainerStarted","Data":"176c16c6f4f1132fb3228ac9790c59d6b64dde5d67e1ebf9759746b0da681e13"} Apr 16 15:02:23.937641 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:23.937643 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66dd85d545-vp9vx" event={"ID":"9681bf36-ced1-49db-b513-9136b424b1c4","Type":"ContainerStarted","Data":"2cd4ead98b8144bd40614e6f30664f5431729a568110f6621de432391c787d95"} Apr 16 15:02:23.955757 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:23.955711 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-66dd85d545-vp9vx" podStartSLOduration=1.9556959219999999 podStartE2EDuration="1.955695922s" podCreationTimestamp="2026-04-16 15:02:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:02:23.954665639 +0000 UTC m=+593.450992867" watchObservedRunningTime="2026-04-16 15:02:23.955695922 +0000 UTC m=+593.452023152" Apr 16 15:02:31.075990 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:31.075966 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log" Apr 16 15:02:31.077245 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:31.077221 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log" Apr 16 15:02:33.236642 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:33.236547 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-66dd85d545-vp9vx" Apr 16 15:02:33.236642 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:33.236600 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-66dd85d545-vp9vx" Apr 16 15:02:33.241323 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:33.241302 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-66dd85d545-vp9vx" Apr 16 15:02:33.969597 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:33.969570 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console/console-66dd85d545-vp9vx" Apr 16 15:02:34.028497 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:34.028465 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6dd6457dd7-xtwqr"] Apr 16 15:02:59.049205 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:59.049138 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6dd6457dd7-xtwqr" podUID="38fdff81-688b-4851-b9ff-98c8df36e3f6" containerName="console" containerID="cri-o://c4e01fe261452e8c32c7d9fc390c712861819b960109bd4a16ccc2a2b0290d07" gracePeriod=15 Apr 16 15:02:59.147274 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:59.147245 2582 patch_prober.go:28] interesting pod/console-6dd6457dd7-xtwqr container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.133.0.23:8443/health\": dial tcp 10.133.0.23:8443: connect: connection refused" start-of-body= Apr 16 15:02:59.147389 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:59.147295 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-6dd6457dd7-xtwqr" podUID="38fdff81-688b-4851-b9ff-98c8df36e3f6" containerName="console" probeResult="failure" output="Get \"https://10.133.0.23:8443/health\": dial tcp 10.133.0.23:8443: connect: connection refused" Apr 16 15:02:59.290058 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:59.290038 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6dd6457dd7-xtwqr_38fdff81-688b-4851-b9ff-98c8df36e3f6/console/0.log" Apr 16 15:02:59.290157 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:59.290096 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6dd6457dd7-xtwqr" Apr 16 15:02:59.399352 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:59.399318 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/38fdff81-688b-4851-b9ff-98c8df36e3f6-oauth-serving-cert\") pod \"38fdff81-688b-4851-b9ff-98c8df36e3f6\" (UID: \"38fdff81-688b-4851-b9ff-98c8df36e3f6\") " Apr 16 15:02:59.399525 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:59.399364 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvf2l\" (UniqueName: \"kubernetes.io/projected/38fdff81-688b-4851-b9ff-98c8df36e3f6-kube-api-access-nvf2l\") pod \"38fdff81-688b-4851-b9ff-98c8df36e3f6\" (UID: \"38fdff81-688b-4851-b9ff-98c8df36e3f6\") " Apr 16 15:02:59.399525 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:59.399384 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/38fdff81-688b-4851-b9ff-98c8df36e3f6-service-ca\") pod \"38fdff81-688b-4851-b9ff-98c8df36e3f6\" (UID: \"38fdff81-688b-4851-b9ff-98c8df36e3f6\") " Apr 16 15:02:59.399525 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:59.399421 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/38fdff81-688b-4851-b9ff-98c8df36e3f6-console-serving-cert\") pod \"38fdff81-688b-4851-b9ff-98c8df36e3f6\" (UID: \"38fdff81-688b-4851-b9ff-98c8df36e3f6\") " Apr 16 15:02:59.399701 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:59.399579 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/38fdff81-688b-4851-b9ff-98c8df36e3f6-console-oauth-config\") pod \"38fdff81-688b-4851-b9ff-98c8df36e3f6\" (UID: \"38fdff81-688b-4851-b9ff-98c8df36e3f6\") " Apr 16 15:02:59.399701 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:59.399658 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38fdff81-688b-4851-b9ff-98c8df36e3f6-trusted-ca-bundle\") pod \"38fdff81-688b-4851-b9ff-98c8df36e3f6\" (UID: \"38fdff81-688b-4851-b9ff-98c8df36e3f6\") " Apr 16 15:02:59.399701 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:59.399688 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/38fdff81-688b-4851-b9ff-98c8df36e3f6-console-config\") pod \"38fdff81-688b-4851-b9ff-98c8df36e3f6\" (UID: \"38fdff81-688b-4851-b9ff-98c8df36e3f6\") " Apr 16 15:02:59.399887 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:59.399776 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38fdff81-688b-4851-b9ff-98c8df36e3f6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "38fdff81-688b-4851-b9ff-98c8df36e3f6" (UID: "38fdff81-688b-4851-b9ff-98c8df36e3f6"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:02:59.399955 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:59.399867 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38fdff81-688b-4851-b9ff-98c8df36e3f6-service-ca" (OuterVolumeSpecName: "service-ca") pod "38fdff81-688b-4851-b9ff-98c8df36e3f6" (UID: "38fdff81-688b-4851-b9ff-98c8df36e3f6"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:02:59.400223 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:59.400201 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38fdff81-688b-4851-b9ff-98c8df36e3f6-console-config" (OuterVolumeSpecName: "console-config") pod "38fdff81-688b-4851-b9ff-98c8df36e3f6" (UID: "38fdff81-688b-4851-b9ff-98c8df36e3f6"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:02:59.400303 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:59.400201 2582 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/38fdff81-688b-4851-b9ff-98c8df36e3f6-oauth-serving-cert\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 15:02:59.400341 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:59.400307 2582 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/38fdff81-688b-4851-b9ff-98c8df36e3f6-service-ca\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 15:02:59.400341 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:59.400216 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38fdff81-688b-4851-b9ff-98c8df36e3f6-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "38fdff81-688b-4851-b9ff-98c8df36e3f6" (UID: "38fdff81-688b-4851-b9ff-98c8df36e3f6"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:02:59.401752 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:59.401728 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38fdff81-688b-4851-b9ff-98c8df36e3f6-kube-api-access-nvf2l" (OuterVolumeSpecName: "kube-api-access-nvf2l") pod "38fdff81-688b-4851-b9ff-98c8df36e3f6" (UID: "38fdff81-688b-4851-b9ff-98c8df36e3f6"). InnerVolumeSpecName "kube-api-access-nvf2l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:02:59.401892 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:59.401755 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38fdff81-688b-4851-b9ff-98c8df36e3f6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "38fdff81-688b-4851-b9ff-98c8df36e3f6" (UID: "38fdff81-688b-4851-b9ff-98c8df36e3f6"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:02:59.401892 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:59.401778 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38fdff81-688b-4851-b9ff-98c8df36e3f6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "38fdff81-688b-4851-b9ff-98c8df36e3f6" (UID: "38fdff81-688b-4851-b9ff-98c8df36e3f6"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:02:59.500732 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:59.500714 2582 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38fdff81-688b-4851-b9ff-98c8df36e3f6-trusted-ca-bundle\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 15:02:59.500732 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:59.500734 2582 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/38fdff81-688b-4851-b9ff-98c8df36e3f6-console-config\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 15:02:59.500880 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:59.500743 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nvf2l\" (UniqueName: \"kubernetes.io/projected/38fdff81-688b-4851-b9ff-98c8df36e3f6-kube-api-access-nvf2l\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 15:02:59.500880 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:59.500754 2582 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/38fdff81-688b-4851-b9ff-98c8df36e3f6-console-serving-cert\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 15:02:59.500880 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:02:59.500763 2582 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/38fdff81-688b-4851-b9ff-98c8df36e3f6-console-oauth-config\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 15:03:00.040116 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:03:00.040090 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6dd6457dd7-xtwqr_38fdff81-688b-4851-b9ff-98c8df36e3f6/console/0.log" Apr 16 15:03:00.040279 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:03:00.040128 2582 generic.go:358] "Generic (PLEG): container finished" podID="38fdff81-688b-4851-b9ff-98c8df36e3f6" 
containerID="c4e01fe261452e8c32c7d9fc390c712861819b960109bd4a16ccc2a2b0290d07" exitCode=2 Apr 16 15:03:00.040279 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:03:00.040187 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6dd6457dd7-xtwqr" Apr 16 15:03:00.040279 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:03:00.040209 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dd6457dd7-xtwqr" event={"ID":"38fdff81-688b-4851-b9ff-98c8df36e3f6","Type":"ContainerDied","Data":"c4e01fe261452e8c32c7d9fc390c712861819b960109bd4a16ccc2a2b0290d07"} Apr 16 15:03:00.040279 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:03:00.040251 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dd6457dd7-xtwqr" event={"ID":"38fdff81-688b-4851-b9ff-98c8df36e3f6","Type":"ContainerDied","Data":"e5395a18242110362b6d6aeb1d1c07f28053a12959dcbfd58f7454057ecc75d3"} Apr 16 15:03:00.040279 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:03:00.040266 2582 scope.go:117] "RemoveContainer" containerID="c4e01fe261452e8c32c7d9fc390c712861819b960109bd4a16ccc2a2b0290d07" Apr 16 15:03:00.048163 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:03:00.048144 2582 scope.go:117] "RemoveContainer" containerID="c4e01fe261452e8c32c7d9fc390c712861819b960109bd4a16ccc2a2b0290d07" Apr 16 15:03:00.048453 ip-10-0-139-101 kubenswrapper[2582]: E0416 15:03:00.048426 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4e01fe261452e8c32c7d9fc390c712861819b960109bd4a16ccc2a2b0290d07\": container with ID starting with c4e01fe261452e8c32c7d9fc390c712861819b960109bd4a16ccc2a2b0290d07 not found: ID does not exist" containerID="c4e01fe261452e8c32c7d9fc390c712861819b960109bd4a16ccc2a2b0290d07" Apr 16 15:03:00.048527 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:03:00.048456 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4e01fe261452e8c32c7d9fc390c712861819b960109bd4a16ccc2a2b0290d07"} err="failed to get container status \"c4e01fe261452e8c32c7d9fc390c712861819b960109bd4a16ccc2a2b0290d07\": rpc error: code = NotFound desc = could not find container \"c4e01fe261452e8c32c7d9fc390c712861819b960109bd4a16ccc2a2b0290d07\": container with ID starting with c4e01fe261452e8c32c7d9fc390c712861819b960109bd4a16ccc2a2b0290d07 not found: ID does not exist" Apr 16 15:03:00.064803 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:03:00.064781 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6dd6457dd7-xtwqr"] Apr 16 15:03:00.068764 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:03:00.068744 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6dd6457dd7-xtwqr"] Apr 16 15:03:01.161311 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:03:01.161281 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38fdff81-688b-4851-b9ff-98c8df36e3f6" path="/var/lib/kubelet/pods/38fdff81-688b-4851-b9ff-98c8df36e3f6/volumes" Apr 16 15:05:56.733553 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:05:56.733520 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-sv99g"] Apr 16 15:05:56.734012 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:05:56.733854 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38fdff81-688b-4851-b9ff-98c8df36e3f6" containerName="console" Apr 16 15:05:56.734012 ip-10-0-139-101 
kubenswrapper[2582]: I0416 15:05:56.733867 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="38fdff81-688b-4851-b9ff-98c8df36e3f6" containerName="console" Apr 16 15:05:56.734012 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:05:56.733929 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="38fdff81-688b-4851-b9ff-98c8df36e3f6" containerName="console" Apr 16 15:05:56.736777 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:05:56.736763 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-sv99g" Apr 16 15:05:56.739238 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:05:56.739213 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 15:05:56.739367 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:05:56.739253 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 15:05:56.739367 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:05:56.739278 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 15:05:56.739367 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:05:56.739323 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-9zz8m\"" Apr 16 15:05:56.743455 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:05:56.743427 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-sv99g"] Apr 16 15:05:56.749099 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:05:56.749078 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc8n7\" (UniqueName: \"kubernetes.io/projected/33eee4bc-1e4b-479d-921f-cab6fecdb99f-kube-api-access-tc8n7\") pod \"s3-init-sv99g\" (UID: \"33eee4bc-1e4b-479d-921f-cab6fecdb99f\") " pod="kserve/s3-init-sv99g" Apr 16 15:05:56.849740 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:05:56.849710 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tc8n7\" (UniqueName: \"kubernetes.io/projected/33eee4bc-1e4b-479d-921f-cab6fecdb99f-kube-api-access-tc8n7\") pod \"s3-init-sv99g\" (UID: \"33eee4bc-1e4b-479d-921f-cab6fecdb99f\") " pod="kserve/s3-init-sv99g" Apr 16 15:05:56.857502 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:05:56.857476 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc8n7\" (UniqueName: \"kubernetes.io/projected/33eee4bc-1e4b-479d-921f-cab6fecdb99f-kube-api-access-tc8n7\") pod \"s3-init-sv99g\" (UID: \"33eee4bc-1e4b-479d-921f-cab6fecdb99f\") " pod="kserve/s3-init-sv99g" Apr 16 15:05:57.054110 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:05:57.054044 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-sv99g" Apr 16 15:05:57.177362 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:05:57.177340 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-sv99g"] Apr 16 15:05:57.179944 ip-10-0-139-101 kubenswrapper[2582]: W0416 15:05:57.179913 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33eee4bc_1e4b_479d_921f_cab6fecdb99f.slice/crio-3cc0980b8ff0811d438d04fdc18481fb596b27310799a8969967306c86d7cd99 WatchSource:0}: Error finding container 3cc0980b8ff0811d438d04fdc18481fb596b27310799a8969967306c86d7cd99: Status 404 returned error can't find the container with id 3cc0980b8ff0811d438d04fdc18481fb596b27310799a8969967306c86d7cd99 Apr 16 15:05:57.525996 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:05:57.525963 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-sv99g" event={"ID":"33eee4bc-1e4b-479d-921f-cab6fecdb99f","Type":"ContainerStarted","Data":"3cc0980b8ff0811d438d04fdc18481fb596b27310799a8969967306c86d7cd99"} Apr 16 15:06:02.547314 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:02.547277 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-sv99g" event={"ID":"33eee4bc-1e4b-479d-921f-cab6fecdb99f","Type":"ContainerStarted","Data":"4be56d23066ce020a861a72a5fa9d770f88a0e298dfbc9ac101312d8fc144509"} Apr 16 15:06:02.561595 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:02.561550 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-sv99g" podStartSLOduration=2.17433569 podStartE2EDuration="6.561535695s" podCreationTimestamp="2026-04-16 15:05:56 +0000 UTC" firstStartedPulling="2026-04-16 15:05:57.18169959 +0000 UTC m=+806.678026796" lastFinishedPulling="2026-04-16 15:06:01.568899595 +0000 UTC m=+811.065226801" observedRunningTime="2026-04-16 15:06:02.560804082 +0000 UTC m=+812.057131312" watchObservedRunningTime="2026-04-16 15:06:02.561535695 +0000 UTC m=+812.057862925" Apr 16 15:06:04.554837 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:04.554802 2582 generic.go:358] "Generic (PLEG): container finished" podID="33eee4bc-1e4b-479d-921f-cab6fecdb99f" containerID="4be56d23066ce020a861a72a5fa9d770f88a0e298dfbc9ac101312d8fc144509" exitCode=0 Apr 16 15:06:04.555130 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:04.554852 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-sv99g" event={"ID":"33eee4bc-1e4b-479d-921f-cab6fecdb99f","Type":"ContainerDied","Data":"4be56d23066ce020a861a72a5fa9d770f88a0e298dfbc9ac101312d8fc144509"} Apr 16 15:06:05.683440 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:05.683421 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-sv99g" Apr 16 15:06:05.714286 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:05.714263 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc8n7\" (UniqueName: \"kubernetes.io/projected/33eee4bc-1e4b-479d-921f-cab6fecdb99f-kube-api-access-tc8n7\") pod \"33eee4bc-1e4b-479d-921f-cab6fecdb99f\" (UID: \"33eee4bc-1e4b-479d-921f-cab6fecdb99f\") " Apr 16 15:06:05.716490 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:05.716464 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33eee4bc-1e4b-479d-921f-cab6fecdb99f-kube-api-access-tc8n7" (OuterVolumeSpecName: "kube-api-access-tc8n7") pod "33eee4bc-1e4b-479d-921f-cab6fecdb99f" (UID: "33eee4bc-1e4b-479d-921f-cab6fecdb99f"). InnerVolumeSpecName "kube-api-access-tc8n7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:06:05.815453 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:05.815394 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tc8n7\" (UniqueName: \"kubernetes.io/projected/33eee4bc-1e4b-479d-921f-cab6fecdb99f-kube-api-access-tc8n7\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 15:06:06.561015 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:06.560981 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-sv99g" event={"ID":"33eee4bc-1e4b-479d-921f-cab6fecdb99f","Type":"ContainerDied","Data":"3cc0980b8ff0811d438d04fdc18481fb596b27310799a8969967306c86d7cd99"} Apr 16 15:06:06.561015 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:06.561013 2582 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cc0980b8ff0811d438d04fdc18481fb596b27310799a8969967306c86d7cd99" Apr 16 15:06:06.561015 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:06.560993 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-sv99g" Apr 16 15:06:16.296547 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:16.296517 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-7j5f2"] Apr 16 15:06:16.296949 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:16.296804 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33eee4bc-1e4b-479d-921f-cab6fecdb99f" containerName="s3-init" Apr 16 15:06:16.296949 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:16.296815 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="33eee4bc-1e4b-479d-921f-cab6fecdb99f" containerName="s3-init" Apr 16 15:06:16.296949 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:16.296893 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="33eee4bc-1e4b-479d-921f-cab6fecdb99f" containerName="s3-init" Apr 16 15:06:16.299656 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:16.299640 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-7j5f2" Apr 16 15:06:16.301950 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:16.301920 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 15:06:16.302093 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:16.301967 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-9zz8m\"" Apr 16 15:06:16.302093 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:16.302013 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 16 15:06:16.303022 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:16.303004 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 15:06:16.306906 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:16.306705 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-7j5f2"] Apr 16 15:06:16.384379 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:16.384354 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfnfw\" (UniqueName: \"kubernetes.io/projected/a6f42147-3fde-4faa-b673-ce0560ee184e-kube-api-access-qfnfw\") pod \"s3-tls-init-custom-7j5f2\" (UID: \"a6f42147-3fde-4faa-b673-ce0560ee184e\") " pod="kserve/s3-tls-init-custom-7j5f2" Apr 16 15:06:16.485742 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:16.485709 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfnfw\" (UniqueName: \"kubernetes.io/projected/a6f42147-3fde-4faa-b673-ce0560ee184e-kube-api-access-qfnfw\") pod \"s3-tls-init-custom-7j5f2\" (UID: \"a6f42147-3fde-4faa-b673-ce0560ee184e\") " pod="kserve/s3-tls-init-custom-7j5f2" Apr 16 15:06:16.493295 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:16.493270 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfnfw\" (UniqueName: \"kubernetes.io/projected/a6f42147-3fde-4faa-b673-ce0560ee184e-kube-api-access-qfnfw\") pod \"s3-tls-init-custom-7j5f2\" (UID: \"a6f42147-3fde-4faa-b673-ce0560ee184e\") " pod="kserve/s3-tls-init-custom-7j5f2" Apr 16 15:06:16.623379 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:16.623321 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-7j5f2" Apr 16 15:06:16.740572 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:16.740546 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-7j5f2"] Apr 16 15:06:16.743392 ip-10-0-139-101 kubenswrapper[2582]: W0416 15:06:16.743360 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6f42147_3fde_4faa_b673_ce0560ee184e.slice/crio-d3d4d7ca99dfe835e51faea228b4c8b0ec36d10494d45bc3c61c1cea0bb4f6c0 WatchSource:0}: Error finding container d3d4d7ca99dfe835e51faea228b4c8b0ec36d10494d45bc3c61c1cea0bb4f6c0: Status 404 returned error can't find the container with id d3d4d7ca99dfe835e51faea228b4c8b0ec36d10494d45bc3c61c1cea0bb4f6c0 Apr 16 15:06:17.590385 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:17.590348 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-7j5f2" event={"ID":"a6f42147-3fde-4faa-b673-ce0560ee184e","Type":"ContainerStarted","Data":"0ab5298705d69982dfe7202f370d5c1eb34a09bd7c6870837c52ce8e8e97f747"} Apr 16 15:06:17.590385 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:17.590383 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-7j5f2" event={"ID":"a6f42147-3fde-4faa-b673-ce0560ee184e","Type":"ContainerStarted","Data":"d3d4d7ca99dfe835e51faea228b4c8b0ec36d10494d45bc3c61c1cea0bb4f6c0"} Apr 16 15:06:17.607227 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:17.607186 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-custom-7j5f2" podStartSLOduration=1.6071739950000001 podStartE2EDuration="1.607173995s" podCreationTimestamp="2026-04-16 15:06:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:06:17.605581533 +0000 UTC m=+827.101908757" watchObservedRunningTime="2026-04-16 15:06:17.607173995 +0000 UTC m=+827.103501224" Apr 16 15:06:21.602872 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:21.602840 2582 generic.go:358] "Generic (PLEG): container finished" podID="a6f42147-3fde-4faa-b673-ce0560ee184e" containerID="0ab5298705d69982dfe7202f370d5c1eb34a09bd7c6870837c52ce8e8e97f747" exitCode=0 Apr 16 15:06:21.603191 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:21.602901 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-7j5f2" event={"ID":"a6f42147-3fde-4faa-b673-ce0560ee184e","Type":"ContainerDied","Data":"0ab5298705d69982dfe7202f370d5c1eb34a09bd7c6870837c52ce8e8e97f747"} Apr 16 15:06:22.733229 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:22.733206 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-7j5f2" Apr 16 15:06:22.831017 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:22.830991 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfnfw\" (UniqueName: \"kubernetes.io/projected/a6f42147-3fde-4faa-b673-ce0560ee184e-kube-api-access-qfnfw\") pod \"a6f42147-3fde-4faa-b673-ce0560ee184e\" (UID: \"a6f42147-3fde-4faa-b673-ce0560ee184e\") " Apr 16 15:06:22.833147 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:22.833117 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6f42147-3fde-4faa-b673-ce0560ee184e-kube-api-access-qfnfw" (OuterVolumeSpecName: "kube-api-access-qfnfw") pod "a6f42147-3fde-4faa-b673-ce0560ee184e" (UID: "a6f42147-3fde-4faa-b673-ce0560ee184e"). InnerVolumeSpecName "kube-api-access-qfnfw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:06:22.932282 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:22.932261 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qfnfw\" (UniqueName: \"kubernetes.io/projected/a6f42147-3fde-4faa-b673-ce0560ee184e-kube-api-access-qfnfw\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 15:06:23.610560 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:23.610535 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-7j5f2" Apr 16 15:06:23.610720 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:23.610531 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-7j5f2" event={"ID":"a6f42147-3fde-4faa-b673-ce0560ee184e","Type":"ContainerDied","Data":"d3d4d7ca99dfe835e51faea228b4c8b0ec36d10494d45bc3c61c1cea0bb4f6c0"} Apr 16 15:06:23.610720 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:23.610638 2582 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3d4d7ca99dfe835e51faea228b4c8b0ec36d10494d45bc3c61c1cea0bb4f6c0" Apr 16 15:06:26.588999 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:26.588965 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-j7b6z"] Apr 16 15:06:26.589355 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:26.589247 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6f42147-3fde-4faa-b673-ce0560ee184e" containerName="s3-tls-init-custom" Apr 16 15:06:26.589355 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:26.589259 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f42147-3fde-4faa-b673-ce0560ee184e" containerName="s3-tls-init-custom" Apr 16 15:06:26.589355 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:26.589305 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="a6f42147-3fde-4faa-b673-ce0560ee184e" containerName="s3-tls-init-custom" Apr 16 15:06:26.592223 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:26.592206 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-j7b6z" Apr 16 15:06:26.594795 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:26.594777 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 15:06:26.595533 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:26.595518 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-9zz8m\"" Apr 16 15:06:26.595622 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:26.595539 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 16 15:06:26.595622 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:26.595567 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 15:06:26.598331 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:26.598315 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-j7b6z"] Apr 16 15:06:26.658075 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:26.658050 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkcgh\" (UniqueName: \"kubernetes.io/projected/23f5e5fa-c9d8-451e-8d54-5fcd2a7172fc-kube-api-access-qkcgh\") pod \"s3-tls-init-serving-j7b6z\" (UID: \"23f5e5fa-c9d8-451e-8d54-5fcd2a7172fc\") " pod="kserve/s3-tls-init-serving-j7b6z" Apr 16 15:06:26.759140 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:26.759114 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkcgh\" (UniqueName: \"kubernetes.io/projected/23f5e5fa-c9d8-451e-8d54-5fcd2a7172fc-kube-api-access-qkcgh\") pod \"s3-tls-init-serving-j7b6z\" (UID: \"23f5e5fa-c9d8-451e-8d54-5fcd2a7172fc\") " pod="kserve/s3-tls-init-serving-j7b6z" Apr 16 15:06:26.767394 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:26.767368 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkcgh\" (UniqueName: \"kubernetes.io/projected/23f5e5fa-c9d8-451e-8d54-5fcd2a7172fc-kube-api-access-qkcgh\") pod \"s3-tls-init-serving-j7b6z\" (UID: \"23f5e5fa-c9d8-451e-8d54-5fcd2a7172fc\") " pod="kserve/s3-tls-init-serving-j7b6z" Apr 16 15:06:26.910591 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:26.910565 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-j7b6z" Apr 16 15:06:27.033083 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:27.033058 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-j7b6z"] Apr 16 15:06:27.035680 ip-10-0-139-101 kubenswrapper[2582]: W0416 15:06:27.035645 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23f5e5fa_c9d8_451e_8d54_5fcd2a7172fc.slice/crio-23333c94246e553fa69be7e1b409c03d62f3bf01e05e285d1ad195295319d2b6 WatchSource:0}: Error finding container 23333c94246e553fa69be7e1b409c03d62f3bf01e05e285d1ad195295319d2b6: Status 404 returned error can't find the container with id 23333c94246e553fa69be7e1b409c03d62f3bf01e05e285d1ad195295319d2b6 Apr 16 15:06:27.623376 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:27.623345 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-j7b6z" event={"ID":"23f5e5fa-c9d8-451e-8d54-5fcd2a7172fc","Type":"ContainerStarted","Data":"b78f3bbf2c5a3cce0e051ded02864916d6510956cb40bfa118fdbc508bfacfc6"} Apr 16 15:06:27.623376 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:27.623380 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-j7b6z" event={"ID":"23f5e5fa-c9d8-451e-8d54-5fcd2a7172fc","Type":"ContainerStarted","Data":"23333c94246e553fa69be7e1b409c03d62f3bf01e05e285d1ad195295319d2b6"} Apr 16 15:06:27.641234 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:27.641189 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-serving-j7b6z" podStartSLOduration=1.6411757759999999 podStartE2EDuration="1.641175776s" podCreationTimestamp="2026-04-16 15:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:06:27.639113013 +0000 UTC m=+837.135440241" watchObservedRunningTime="2026-04-16 15:06:27.641175776 +0000 UTC m=+837.137503005" Apr 16 15:06:31.635984 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:31.635953 2582 generic.go:358] "Generic (PLEG): container finished" podID="23f5e5fa-c9d8-451e-8d54-5fcd2a7172fc" containerID="b78f3bbf2c5a3cce0e051ded02864916d6510956cb40bfa118fdbc508bfacfc6" exitCode=0 Apr 16 15:06:31.636433 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:31.636033 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-j7b6z" event={"ID":"23f5e5fa-c9d8-451e-8d54-5fcd2a7172fc","Type":"ContainerDied","Data":"b78f3bbf2c5a3cce0e051ded02864916d6510956cb40bfa118fdbc508bfacfc6"} Apr 16 15:06:32.768542 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:32.768519 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-j7b6z" Apr 16 15:06:32.805844 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:32.805806 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkcgh\" (UniqueName: \"kubernetes.io/projected/23f5e5fa-c9d8-451e-8d54-5fcd2a7172fc-kube-api-access-qkcgh\") pod \"23f5e5fa-c9d8-451e-8d54-5fcd2a7172fc\" (UID: \"23f5e5fa-c9d8-451e-8d54-5fcd2a7172fc\") " Apr 16 15:06:32.807949 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:32.807926 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23f5e5fa-c9d8-451e-8d54-5fcd2a7172fc-kube-api-access-qkcgh" (OuterVolumeSpecName: "kube-api-access-qkcgh") pod "23f5e5fa-c9d8-451e-8d54-5fcd2a7172fc" (UID: "23f5e5fa-c9d8-451e-8d54-5fcd2a7172fc"). InnerVolumeSpecName "kube-api-access-qkcgh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:06:32.906695 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:32.906637 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qkcgh\" (UniqueName: \"kubernetes.io/projected/23f5e5fa-c9d8-451e-8d54-5fcd2a7172fc-kube-api-access-qkcgh\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 15:06:33.642932 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:33.642898 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-j7b6z" Apr 16 15:06:33.643161 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:33.642900 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-j7b6z" event={"ID":"23f5e5fa-c9d8-451e-8d54-5fcd2a7172fc","Type":"ContainerDied","Data":"23333c94246e553fa69be7e1b409c03d62f3bf01e05e285d1ad195295319d2b6"} Apr 16 15:06:33.643161 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:06:33.643015 2582 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23333c94246e553fa69be7e1b409c03d62f3bf01e05e285d1ad195295319d2b6" Apr 16 15:07:31.096389 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:07:31.096351 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log" Apr 16 15:07:31.097222 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:07:31.097202 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log" Apr 16 15:09:52.972036 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:09:52.971997 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-2ttq5"] Apr 16 15:09:52.972590 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:09:52.972488 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23f5e5fa-c9d8-451e-8d54-5fcd2a7172fc" containerName="s3-tls-init-serving" Apr 16 15:09:52.972590 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:09:52.972507 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="23f5e5fa-c9d8-451e-8d54-5fcd2a7172fc" containerName="s3-tls-init-serving" Apr 16 15:09:52.972726 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:09:52.972624 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="23f5e5fa-c9d8-451e-8d54-5fcd2a7172fc" containerName="s3-tls-init-serving" Apr 16 15:09:52.974539 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:09:52.974518 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-2ttq5" Apr 16 15:09:52.976768 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:09:52.976740 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tsb5d\"" Apr 16 15:09:52.982556 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:09:52.982530 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-2ttq5"] Apr 16 15:09:52.984362 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:09:52.984346 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-2ttq5" Apr 16 15:09:53.104412 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:09:53.104372 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-2ttq5"] Apr 16 15:09:53.107320 ip-10-0-139-101 kubenswrapper[2582]: W0416 15:09:53.107295 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod870e025e_d9c5_438d_94c6_3525e3739d05.slice/crio-44182f4de20ea240d01ff8fc5c0317e4639215f45023ad81e7fc4804e1ba9da7 WatchSource:0}: Error finding container 44182f4de20ea240d01ff8fc5c0317e4639215f45023ad81e7fc4804e1ba9da7: Status 404 returned error can't find the container with id 44182f4de20ea240d01ff8fc5c0317e4639215f45023ad81e7fc4804e1ba9da7 Apr 16 15:09:53.109139 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:09:53.109123 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:09:53.204531 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:09:53.204505 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-2ttq5" event={"ID":"870e025e-d9c5-438d-94c6-3525e3739d05","Type":"ContainerStarted","Data":"44182f4de20ea240d01ff8fc5c0317e4639215f45023ad81e7fc4804e1ba9da7"} Apr 16 15:09:54.208355 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:09:54.208326 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-2ttq5" event={"ID":"870e025e-d9c5-438d-94c6-3525e3739d05","Type":"ContainerStarted","Data":"793fefd181fdf5ba949f523ffcbed6a2d1fb0007c430eabd56d4b6477b1a14b6"} Apr 16 15:09:54.208717 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:09:54.208629 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-2ttq5" Apr 16 15:09:54.209917 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:09:54.209896 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-2ttq5" podUID="870e025e-d9c5-438d-94c6-3525e3739d05" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 15:09:54.253928 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:09:54.253856 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-2ttq5" podStartSLOduration=1.3673667809999999 podStartE2EDuration="2.253843344s" podCreationTimestamp="2026-04-16 15:09:52 +0000 UTC" firstStartedPulling="2026-04-16 15:09:53.109271443 +0000 UTC m=+1042.605598649" lastFinishedPulling="2026-04-16 15:09:53.995748005 +0000 UTC m=+1043.492075212" observedRunningTime="2026-04-16 15:09:54.252849081 +0000 
UTC m=+1043.749176302" watchObservedRunningTime="2026-04-16 15:09:54.253843344 +0000 UTC m=+1043.750170563" Apr 16 15:09:55.211392 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:09:55.211367 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-2ttq5" Apr 16 15:11:28.080351 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:11:28.080322 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-7f66cccfb6-2ttq5_870e025e-d9c5-438d-94c6-3525e3739d05/kserve-container/0.log" Apr 16 15:11:28.348781 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:11:28.348710 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-2ttq5"] Apr 16 15:11:28.348982 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:11:28.348960 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-2ttq5" podUID="870e025e-d9c5-438d-94c6-3525e3739d05" containerName="kserve-container" containerID="cri-o://793fefd181fdf5ba949f523ffcbed6a2d1fb0007c430eabd56d4b6477b1a14b6" gracePeriod=30 Apr 16 15:11:28.483341 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:11:28.483305 2582 generic.go:358] "Generic (PLEG): container finished" podID="870e025e-d9c5-438d-94c6-3525e3739d05" containerID="793fefd181fdf5ba949f523ffcbed6a2d1fb0007c430eabd56d4b6477b1a14b6" exitCode=2 Apr 16 15:11:28.483483 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:11:28.483371 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-2ttq5" event={"ID":"870e025e-d9c5-438d-94c6-3525e3739d05","Type":"ContainerDied","Data":"793fefd181fdf5ba949f523ffcbed6a2d1fb0007c430eabd56d4b6477b1a14b6"} Apr 16 15:11:28.585531 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:11:28.585509 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-2ttq5" Apr 16 15:11:29.487637 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:11:29.487547 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-2ttq5" event={"ID":"870e025e-d9c5-438d-94c6-3525e3739d05","Type":"ContainerDied","Data":"44182f4de20ea240d01ff8fc5c0317e4639215f45023ad81e7fc4804e1ba9da7"} Apr 16 15:11:29.487637 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:11:29.487591 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-2ttq5" Apr 16 15:11:29.487637 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:11:29.487591 2582 scope.go:117] "RemoveContainer" containerID="793fefd181fdf5ba949f523ffcbed6a2d1fb0007c430eabd56d4b6477b1a14b6" Apr 16 15:11:29.502860 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:11:29.502816 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-2ttq5"] Apr 16 15:11:29.506194 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:11:29.506174 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-2ttq5"] Apr 16 15:11:31.160979 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:11:31.160948 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="870e025e-d9c5-438d-94c6-3525e3739d05" path="/var/lib/kubelet/pods/870e025e-d9c5-438d-94c6-3525e3739d05/volumes" Apr 16 15:12:31.126358 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:12:31.126331 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log" Apr 16 15:12:31.127720 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:12:31.127698 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log" Apr 16 15:17:31.146925 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:17:31.146900 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log" Apr 16 15:17:31.148519 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:17:31.148499 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log" Apr 16 15:22:31.168506 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:22:31.168464 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log" Apr 16 15:22:31.172639 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:22:31.172618 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log" Apr 16 15:27:31.188310 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:27:31.188279 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log" Apr 16 15:27:31.193013 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:27:31.192995 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log" Apr 16 15:32:31.209756 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:32:31.209727 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log" Apr 16 15:32:31.219063 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:32:31.219046 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log" Apr 16 15:37:31.230074 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:37:31.230006 
Apr 16 15:37:31.242544 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:37:31.242516 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log"
Apr 16 15:42:31.255872 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:42:31.255837 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log"
Apr 16 15:42:31.270338 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:42:31.270317 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log"
Apr 16 15:47:31.275037 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:47:31.275011 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log"
Apr 16 15:47:31.290818 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:47:31.290798 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log"
Apr 16 15:52:31.294391 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:52:31.294366 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log"
Apr 16 15:52:31.311265 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:52:31.311242 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log"
Apr 16 15:57:31.312804 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:57:31.312778 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log"
Apr 16 15:57:31.331339 ip-10-0-139-101 kubenswrapper[2582]: I0416 15:57:31.331318 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log"
Apr 16 16:00:33.085238 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:33.085202 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5j4zl/must-gather-5skcm"]
Apr 16 16:00:33.085625 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:33.085490 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="870e025e-d9c5-438d-94c6-3525e3739d05" containerName="kserve-container"
Apr 16 16:00:33.085625 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:33.085500 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="870e025e-d9c5-438d-94c6-3525e3739d05" containerName="kserve-container"
Apr 16 16:00:33.085625 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:33.085544 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="870e025e-d9c5-438d-94c6-3525e3739d05" containerName="kserve-container"
Apr 16 16:00:33.088518 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:33.088501 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5j4zl/must-gather-5skcm"
Apr 16 16:00:33.090783 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:33.090761 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5j4zl\"/\"openshift-service-ca.crt\""
Apr 16 16:00:33.091942 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:33.091920 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5j4zl\"/\"kube-root-ca.crt\""
Apr 16 16:00:33.092033 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:33.091931 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-5j4zl\"/\"default-dockercfg-vx5rv\""
Apr 16 16:00:33.096880 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:33.096860 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5j4zl/must-gather-5skcm"]
Apr 16 16:00:33.236951 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:33.236911 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5ec954c5-530d-4a89-a83b-0e4790756992-must-gather-output\") pod \"must-gather-5skcm\" (UID: \"5ec954c5-530d-4a89-a83b-0e4790756992\") " pod="openshift-must-gather-5j4zl/must-gather-5skcm"
Apr 16 16:00:33.237134 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:33.237115 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7mxr\" (UniqueName: \"kubernetes.io/projected/5ec954c5-530d-4a89-a83b-0e4790756992-kube-api-access-z7mxr\") pod \"must-gather-5skcm\" (UID: \"5ec954c5-530d-4a89-a83b-0e4790756992\") " pod="openshift-must-gather-5j4zl/must-gather-5skcm"
Apr 16 16:00:33.337850 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:33.337761 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5ec954c5-530d-4a89-a83b-0e4790756992-must-gather-output\") pod \"must-gather-5skcm\" (UID: \"5ec954c5-530d-4a89-a83b-0e4790756992\") " pod="openshift-must-gather-5j4zl/must-gather-5skcm"
Apr 16 16:00:33.337850 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:33.337794 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7mxr\" (UniqueName: \"kubernetes.io/projected/5ec954c5-530d-4a89-a83b-0e4790756992-kube-api-access-z7mxr\") pod \"must-gather-5skcm\" (UID: \"5ec954c5-530d-4a89-a83b-0e4790756992\") " pod="openshift-must-gather-5j4zl/must-gather-5skcm"
Apr 16 16:00:33.338080 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:33.338063 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5ec954c5-530d-4a89-a83b-0e4790756992-must-gather-output\") pod \"must-gather-5skcm\" (UID: \"5ec954c5-530d-4a89-a83b-0e4790756992\") " pod="openshift-must-gather-5j4zl/must-gather-5skcm"
Apr 16 16:00:33.346261 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:33.346232 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7mxr\" (UniqueName: \"kubernetes.io/projected/5ec954c5-530d-4a89-a83b-0e4790756992-kube-api-access-z7mxr\") pod \"must-gather-5skcm\" (UID: \"5ec954c5-530d-4a89-a83b-0e4790756992\") " pod="openshift-must-gather-5j4zl/must-gather-5skcm"
Apr 16 16:00:33.407481 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:33.407453 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5j4zl/must-gather-5skcm"
Apr 16 16:00:33.527325 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:33.527295 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5j4zl/must-gather-5skcm"]
Apr 16 16:00:33.530179 ip-10-0-139-101 kubenswrapper[2582]: W0416 16:00:33.530149 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ec954c5_530d_4a89_a83b_0e4790756992.slice/crio-60416f4cd23c13438f2b8316465066d8ee3d490dbaf0b8bda5c1f275e1847e37 WatchSource:0}: Error finding container 60416f4cd23c13438f2b8316465066d8ee3d490dbaf0b8bda5c1f275e1847e37: Status 404 returned error can't find the container with id 60416f4cd23c13438f2b8316465066d8ee3d490dbaf0b8bda5c1f275e1847e37
Apr 16 16:00:33.531688 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:33.531673 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 16:00:33.791162 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:33.791127 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5j4zl/must-gather-5skcm" event={"ID":"5ec954c5-530d-4a89-a83b-0e4790756992","Type":"ContainerStarted","Data":"60416f4cd23c13438f2b8316465066d8ee3d490dbaf0b8bda5c1f275e1847e37"}
Apr 16 16:00:38.812855 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:38.812018 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5j4zl/must-gather-5skcm" event={"ID":"5ec954c5-530d-4a89-a83b-0e4790756992","Type":"ContainerStarted","Data":"9b0a73215e936abdd49579479a46a38c4beb0ce9b93ca1f356cc9b877361d8bb"}
Apr 16 16:00:38.812855 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:38.812064 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5j4zl/must-gather-5skcm" event={"ID":"5ec954c5-530d-4a89-a83b-0e4790756992","Type":"ContainerStarted","Data":"130614ad4637c776a725aa0ca262e711487ea890a053536f4490711e40d4fa3b"}
Apr 16 16:00:38.831595 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:38.831537 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5j4zl/must-gather-5skcm" podStartSLOduration=1.449585673 podStartE2EDuration="5.831518807s" podCreationTimestamp="2026-04-16 16:00:33 +0000 UTC" firstStartedPulling="2026-04-16 16:00:33.531840065 +0000 UTC m=+4083.028167285" lastFinishedPulling="2026-04-16 16:00:37.913773212 +0000 UTC m=+4087.410100419" observedRunningTime="2026-04-16 16:00:38.829213938 +0000 UTC m=+4088.325541167" watchObservedRunningTime="2026-04-16 16:00:38.831518807 +0000 UTC m=+4088.327846036"
Apr 16 16:00:57.866785 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:57.866749 2582 generic.go:358] "Generic (PLEG): container finished" podID="5ec954c5-530d-4a89-a83b-0e4790756992" containerID="130614ad4637c776a725aa0ca262e711487ea890a053536f4490711e40d4fa3b" exitCode=0
Apr 16 16:00:57.867220 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:57.866801 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5j4zl/must-gather-5skcm" event={"ID":"5ec954c5-530d-4a89-a83b-0e4790756992","Type":"ContainerDied","Data":"130614ad4637c776a725aa0ca262e711487ea890a053536f4490711e40d4fa3b"}
Apr 16 16:00:57.867220 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:57.867099 2582 scope.go:117] "RemoveContainer" containerID="130614ad4637c776a725aa0ca262e711487ea890a053536f4490711e40d4fa3b"
Apr 16 16:00:58.122646 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:58.122565 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5j4zl_must-gather-5skcm_5ec954c5-530d-4a89-a83b-0e4790756992/gather/0.log"
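The pod_startup_latency_tracker entry above for must-gather-5skcm is internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (16:00:38.831518807 - 16:00:33 = 5.831518807s), and podStartSLOduration appears to be that value minus the image-pull window, with the pull window measured on the monotonic clock (the m=+ offsets). A sketch of the arithmetic, using only numbers from the entry:

```python
# podStartSLOduration = podStartE2EDuration - (image pull window),
# where the pull window comes from the monotonic m=+ offsets logged above.
e2e          = 5.831518807     # podStartE2EDuration, seconds
pull_started = 4083.028167285  # firstStartedPulling, m=+ offset
pull_done    = 4087.410100419  # lastFinishedPulling, m=+ offset

slo = e2e - (pull_done - pull_started)
print(f"{slo:.9f}")            # -> 1.449585673, matching podStartSLOduration
```

The earlier message-dumper-predictor entry checks out the same way: 2.253843344 - (1043.492075212 - 1042.605598649) = 1.367366781, matching its logged podStartSLOduration.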
Apr 16 16:00:58.745283 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:58.745247 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gvdx8/must-gather-lfwbt"]
Apr 16 16:00:58.748619 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:58.748602 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gvdx8/must-gather-lfwbt"
Apr 16 16:00:58.751069 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:58.751050 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gvdx8\"/\"kube-root-ca.crt\""
Apr 16 16:00:58.751149 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:58.751054 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gvdx8\"/\"openshift-service-ca.crt\""
Apr 16 16:00:58.751923 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:58.751893 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-gvdx8\"/\"default-dockercfg-sldsr\""
Apr 16 16:00:58.755681 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:58.755658 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gvdx8/must-gather-lfwbt"]
Apr 16 16:00:58.838176 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:58.838153 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrjmx\" (UniqueName: \"kubernetes.io/projected/2a6674f9-4746-4a01-ac24-21ac52218542-kube-api-access-lrjmx\") pod \"must-gather-lfwbt\" (UID: \"2a6674f9-4746-4a01-ac24-21ac52218542\") " pod="openshift-must-gather-gvdx8/must-gather-lfwbt"
Apr 16 16:00:58.838278 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:58.838191 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2a6674f9-4746-4a01-ac24-21ac52218542-must-gather-output\") pod \"must-gather-lfwbt\" (UID: \"2a6674f9-4746-4a01-ac24-21ac52218542\") " pod="openshift-must-gather-gvdx8/must-gather-lfwbt"
Apr 16 16:00:58.939367 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:58.939347 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrjmx\" (UniqueName: \"kubernetes.io/projected/2a6674f9-4746-4a01-ac24-21ac52218542-kube-api-access-lrjmx\") pod \"must-gather-lfwbt\" (UID: \"2a6674f9-4746-4a01-ac24-21ac52218542\") " pod="openshift-must-gather-gvdx8/must-gather-lfwbt"
Apr 16 16:00:58.939781 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:58.939382 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2a6674f9-4746-4a01-ac24-21ac52218542-must-gather-output\") pod \"must-gather-lfwbt\" (UID: \"2a6674f9-4746-4a01-ac24-21ac52218542\") " pod="openshift-must-gather-gvdx8/must-gather-lfwbt"
Apr 16 16:00:58.939781 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:58.939630 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2a6674f9-4746-4a01-ac24-21ac52218542-must-gather-output\") pod \"must-gather-lfwbt\" (UID: \"2a6674f9-4746-4a01-ac24-21ac52218542\") " pod="openshift-must-gather-gvdx8/must-gather-lfwbt"
Apr 16 16:00:58.947095 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:58.947066 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrjmx\" (UniqueName: \"kubernetes.io/projected/2a6674f9-4746-4a01-ac24-21ac52218542-kube-api-access-lrjmx\") pod \"must-gather-lfwbt\" (UID: \"2a6674f9-4746-4a01-ac24-21ac52218542\") " pod="openshift-must-gather-gvdx8/must-gather-lfwbt"
Apr 16 16:00:59.057941 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:59.057889 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gvdx8/must-gather-lfwbt"
Apr 16 16:00:59.173780 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:59.173742 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gvdx8/must-gather-lfwbt"]
Apr 16 16:00:59.176699 ip-10-0-139-101 kubenswrapper[2582]: W0416 16:00:59.176671 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a6674f9_4746_4a01_ac24_21ac52218542.slice/crio-42d7d7c82795ca377bd76d0b09dc1614be8a8d0faf6f0108b704e94467a32b5f WatchSource:0}: Error finding container 42d7d7c82795ca377bd76d0b09dc1614be8a8d0faf6f0108b704e94467a32b5f: Status 404 returned error can't find the container with id 42d7d7c82795ca377bd76d0b09dc1614be8a8d0faf6f0108b704e94467a32b5f
Apr 16 16:00:59.874092 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:00:59.874058 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gvdx8/must-gather-lfwbt" event={"ID":"2a6674f9-4746-4a01-ac24-21ac52218542","Type":"ContainerStarted","Data":"42d7d7c82795ca377bd76d0b09dc1614be8a8d0faf6f0108b704e94467a32b5f"}
Apr 16 16:01:00.879420 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:00.879384 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gvdx8/must-gather-lfwbt" event={"ID":"2a6674f9-4746-4a01-ac24-21ac52218542","Type":"ContainerStarted","Data":"9473733d0b48f85e233aa753168cc7f18093d13290b35782a73f34f8f185218d"}
Apr 16 16:01:00.879420 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:00.879424 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gvdx8/must-gather-lfwbt" event={"ID":"2a6674f9-4746-4a01-ac24-21ac52218542","Type":"ContainerStarted","Data":"da3c6f46d6ec0a70cf4e9448a548464edc4b42e146f5880b836aeb949858c46a"}
Apr 16 16:01:00.893633 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:00.893572 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gvdx8/must-gather-lfwbt" podStartSLOduration=2.195300031 podStartE2EDuration="2.893553395s" podCreationTimestamp="2026-04-16 16:00:58 +0000 UTC" firstStartedPulling="2026-04-16 16:00:59.178462944 +0000 UTC m=+4108.674790151" lastFinishedPulling="2026-04-16 16:00:59.876716306 +0000 UTC m=+4109.373043515" observedRunningTime="2026-04-16 16:01:00.892495434 +0000 UTC m=+4110.388822663" watchObservedRunningTime="2026-04-16 16:01:00.893553395 +0000 UTC m=+4110.389880626"
Apr 16 16:01:01.276401 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:01.276304 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-nz48r_ada98a0b-3e12-4c80-9534-e61848c60c06/global-pull-secret-syncer/0.log"
Apr 16 16:01:01.437811 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:01.437778 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-xnxqc_16a896b7-8655-40cd-b495-e93e72c07fb6/konnectivity-agent/0.log"
Apr 16 16:01:01.487526 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:01.487500 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-101.ec2.internal_57029cb5a3dfbbc5e89140d02a0a24ef/haproxy/0.log"
Apr 16 16:01:03.608193 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:03.608151 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5j4zl/must-gather-5skcm"]
Apr 16 16:01:03.608703 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:03.608431 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-5j4zl/must-gather-5skcm" podUID="5ec954c5-530d-4a89-a83b-0e4790756992" containerName="copy" containerID="cri-o://9b0a73215e936abdd49579479a46a38c4beb0ce9b93ca1f356cc9b877361d8bb" gracePeriod=2
Apr 16 16:01:03.610781 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:03.610557 2582 status_manager.go:895] "Failed to get status for pod" podUID="5ec954c5-530d-4a89-a83b-0e4790756992" pod="openshift-must-gather-5j4zl/must-gather-5skcm" err="pods \"must-gather-5skcm\" is forbidden: User \"system:node:ip-10-0-139-101.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-5j4zl\": no relationship found between node 'ip-10-0-139-101.ec2.internal' and this object"
Apr 16 16:01:03.611503 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:03.611471 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5j4zl/must-gather-5skcm"]
Apr 16 16:01:03.904436 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:03.903979 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5j4zl_must-gather-5skcm_5ec954c5-530d-4a89-a83b-0e4790756992/copy/0.log"
Apr 16 16:01:03.904669 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:03.904463 2582 generic.go:358] "Generic (PLEG): container finished" podID="5ec954c5-530d-4a89-a83b-0e4790756992" containerID="9b0a73215e936abdd49579479a46a38c4beb0ce9b93ca1f356cc9b877361d8bb" exitCode=143
Apr 16 16:01:03.997844 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:03.995532 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5j4zl_must-gather-5skcm_5ec954c5-530d-4a89-a83b-0e4790756992/copy/0.log"
Apr 16 16:01:03.997844 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:03.995911 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5j4zl/must-gather-5skcm"
Need to start a new one" pod="openshift-must-gather-5j4zl/must-gather-5skcm" Apr 16 16:01:03.998036 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:03.997960 2582 status_manager.go:895] "Failed to get status for pod" podUID="5ec954c5-530d-4a89-a83b-0e4790756992" pod="openshift-must-gather-5j4zl/must-gather-5skcm" err="pods \"must-gather-5skcm\" is forbidden: User \"system:node:ip-10-0-139-101.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-5j4zl\": no relationship found between node 'ip-10-0-139-101.ec2.internal' and this object" Apr 16 16:01:04.087158 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:04.087089 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5ec954c5-530d-4a89-a83b-0e4790756992-must-gather-output\") pod \"5ec954c5-530d-4a89-a83b-0e4790756992\" (UID: \"5ec954c5-530d-4a89-a83b-0e4790756992\") " Apr 16 16:01:04.087491 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:04.087437 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7mxr\" (UniqueName: \"kubernetes.io/projected/5ec954c5-530d-4a89-a83b-0e4790756992-kube-api-access-z7mxr\") pod \"5ec954c5-530d-4a89-a83b-0e4790756992\" (UID: \"5ec954c5-530d-4a89-a83b-0e4790756992\") " Apr 16 16:01:04.090464 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:04.090436 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ec954c5-530d-4a89-a83b-0e4790756992-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "5ec954c5-530d-4a89-a83b-0e4790756992" (UID: "5ec954c5-530d-4a89-a83b-0e4790756992"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:01:04.091578 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:04.091553 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec954c5-530d-4a89-a83b-0e4790756992-kube-api-access-z7mxr" (OuterVolumeSpecName: "kube-api-access-z7mxr") pod "5ec954c5-530d-4a89-a83b-0e4790756992" (UID: "5ec954c5-530d-4a89-a83b-0e4790756992"). InnerVolumeSpecName "kube-api-access-z7mxr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:01:04.188642 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:04.188556 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z7mxr\" (UniqueName: \"kubernetes.io/projected/5ec954c5-530d-4a89-a83b-0e4790756992-kube-api-access-z7mxr\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 16:01:04.188897 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:04.188876 2582 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5ec954c5-530d-4a89-a83b-0e4790756992-must-gather-output\") on node \"ip-10-0-139-101.ec2.internal\" DevicePath \"\"" Apr 16 16:01:04.909962 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:04.909928 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5j4zl_must-gather-5skcm_5ec954c5-530d-4a89-a83b-0e4790756992/copy/0.log" Apr 16 16:01:04.910544 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:04.910460 2582 scope.go:117] "RemoveContainer" containerID="9b0a73215e936abdd49579479a46a38c4beb0ce9b93ca1f356cc9b877361d8bb" Apr 16 16:01:04.910607 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:04.910465 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5j4zl/must-gather-5skcm" Apr 16 16:01:04.913183 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:04.913149 2582 status_manager.go:895] "Failed to get status for pod" podUID="5ec954c5-530d-4a89-a83b-0e4790756992" pod="openshift-must-gather-5j4zl/must-gather-5skcm" err="pods \"must-gather-5skcm\" is forbidden: User \"system:node:ip-10-0-139-101.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-5j4zl\": no relationship found between node 'ip-10-0-139-101.ec2.internal' and this object" Apr 16 16:01:04.924194 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:04.923799 2582 scope.go:117] "RemoveContainer" containerID="130614ad4637c776a725aa0ca262e711487ea890a053536f4490711e40d4fa3b" Apr 16 16:01:04.927058 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:04.927006 2582 status_manager.go:895] "Failed to get status for pod" podUID="5ec954c5-530d-4a89-a83b-0e4790756992" pod="openshift-must-gather-5j4zl/must-gather-5skcm" err="pods \"must-gather-5skcm\" is forbidden: User \"system:node:ip-10-0-139-101.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-5j4zl\": no relationship found between node 'ip-10-0-139-101.ec2.internal' and this object" Apr 16 16:01:05.143639 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:05.143573 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-685x7_22bd709d-b5a6-4831-ab98-87a2c0622bbe/node-exporter/0.log" Apr 16 16:01:05.162174 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:05.162092 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-685x7_22bd709d-b5a6-4831-ab98-87a2c0622bbe/kube-rbac-proxy/0.log" Apr 16 16:01:05.165868 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:05.165812 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ec954c5-530d-4a89-a83b-0e4790756992" path="/var/lib/kubelet/pods/5ec954c5-530d-4a89-a83b-0e4790756992/volumes" Apr 16 16:01:05.187921 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:05.187892 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-685x7_22bd709d-b5a6-4831-ab98-87a2c0622bbe/init-textfile/0.log" Apr 16 16:01:05.492214 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:05.492140 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c980a658-7fcd-4639-b0a5-28908f804d8a/prometheus/0.log" Apr 16 16:01:05.508196 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:05.508160 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c980a658-7fcd-4639-b0a5-28908f804d8a/config-reloader/0.log" Apr 16 16:01:05.527466 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:05.527435 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c980a658-7fcd-4639-b0a5-28908f804d8a/thanos-sidecar/0.log" Apr 16 16:01:05.547312 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:05.547282 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c980a658-7fcd-4639-b0a5-28908f804d8a/kube-rbac-proxy-web/0.log" Apr 16 16:01:05.567692 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:05.567659 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c980a658-7fcd-4639-b0a5-28908f804d8a/kube-rbac-proxy/0.log" Apr 16 16:01:05.587241 
ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:05.587203 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c980a658-7fcd-4639-b0a5-28908f804d8a/kube-rbac-proxy-thanos/0.log" Apr 16 16:01:05.611055 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:05.611017 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c980a658-7fcd-4639-b0a5-28908f804d8a/init-config-reloader/0.log" Apr 16 16:01:05.643703 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:05.643662 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-4jngz_0950cb6b-a1b8-4cad-bc60-0314e46475f0/prometheus-operator/0.log" Apr 16 16:01:05.660724 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:05.660694 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-4jngz_0950cb6b-a1b8-4cad-bc60-0314e46475f0/kube-rbac-proxy/0.log" Apr 16 16:01:05.683905 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:05.683872 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-6j5d5_9945fe1b-a2b6-466f-87ce-1506f00c87fe/prometheus-operator-admission-webhook/0.log" Apr 16 16:01:07.813091 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:07.813040 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66dd85d545-vp9vx_9681bf36-ced1-49db-b513-9136b424b1c4/console/0.log" Apr 16 16:01:07.858529 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:07.858499 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-2brqv_20a533bc-cfb1-4fba-afea-c9f0e370d07c/download-server/0.log" Apr 16 16:01:08.290488 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.288238 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gvdx8/perf-node-gather-daemonset-cjj52"] Apr 16 16:01:08.290488 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.288923 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ec954c5-530d-4a89-a83b-0e4790756992" containerName="gather" Apr 16 16:01:08.290488 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.288941 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec954c5-530d-4a89-a83b-0e4790756992" containerName="gather" Apr 16 16:01:08.290488 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.288984 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ec954c5-530d-4a89-a83b-0e4790756992" containerName="copy" Apr 16 16:01:08.290488 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.288992 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec954c5-530d-4a89-a83b-0e4790756992" containerName="copy" Apr 16 16:01:08.290488 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.289132 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ec954c5-530d-4a89-a83b-0e4790756992" containerName="gather" Apr 16 16:01:08.290488 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.289146 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ec954c5-530d-4a89-a83b-0e4790756992" containerName="copy" Apr 16 16:01:08.294358 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.293817 2582 util.go:30] "No sandbox for pod can be found. 
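The perf-node-gather-daemonset-cjj52 pod that follows mixes both volume classes: four hostPath volumes (proc, sys, lib-modules, podres) and one projected token volume (kube-api-access-pqt87). The hostPath SetUps complete within the same reconcile pass, while the projected volume takes roughly 8 ms more, plausibly because SetUp has to assemble a fresh service-account token. Note also that its cgroup path sits directly under kubepods.slice with no besteffort or burstable sub-slice, which is where Guaranteed-QoS pods land. A sketch measuring per-volume mount latency from the klog timestamps (kubelet.log is again a hypothetical local copy):

```python
# Per-volume latency from "MountVolume started" to "MountVolume.SetUp
# succeeded", keyed by the short volume name in the escaped quotes.
import re
from datetime import datetime

TIME  = re.compile(r'I0416 (\d{2}:\d{2}:\d{2}\.\d{6})')
START = re.compile(r'MountVolume started for volume \\?"([^\\"]+)')
DONE  = re.compile(r'MountVolume\.SetUp succeeded for volume \\?"([^\\"]+)')

begun = {}
with open("kubelet.log") as f:
    for line in f:
        t = TIME.search(line)
        if not t:
            continue
        ts = datetime.strptime(t.group(1), "%H:%M:%S.%f")
        if (m := START.search(line)):
            begun[m.group(1)] = ts
        elif (m := DONE.search(line)) and m.group(1) in begun:
            delta = (ts - begun.pop(m.group(1))).total_seconds()
            print(f"{m.group(1)}: {delta * 1000:.3f} ms")
```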
Need to start a new one" pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-cjj52" Apr 16 16:01:08.296937 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.296914 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gvdx8/perf-node-gather-daemonset-cjj52"] Apr 16 16:01:08.327703 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.327673 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f4a2065b-ec1d-4d2b-8b84-31d983c1b554-proc\") pod \"perf-node-gather-daemonset-cjj52\" (UID: \"f4a2065b-ec1d-4d2b-8b84-31d983c1b554\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-cjj52" Apr 16 16:01:08.327882 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.327735 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f4a2065b-ec1d-4d2b-8b84-31d983c1b554-podres\") pod \"perf-node-gather-daemonset-cjj52\" (UID: \"f4a2065b-ec1d-4d2b-8b84-31d983c1b554\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-cjj52" Apr 16 16:01:08.327882 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.327789 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqt87\" (UniqueName: \"kubernetes.io/projected/f4a2065b-ec1d-4d2b-8b84-31d983c1b554-kube-api-access-pqt87\") pod \"perf-node-gather-daemonset-cjj52\" (UID: \"f4a2065b-ec1d-4d2b-8b84-31d983c1b554\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-cjj52" Apr 16 16:01:08.327882 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.327872 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f4a2065b-ec1d-4d2b-8b84-31d983c1b554-lib-modules\") pod \"perf-node-gather-daemonset-cjj52\" (UID: \"f4a2065b-ec1d-4d2b-8b84-31d983c1b554\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-cjj52" Apr 16 16:01:08.328059 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.327916 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4a2065b-ec1d-4d2b-8b84-31d983c1b554-sys\") pod \"perf-node-gather-daemonset-cjj52\" (UID: \"f4a2065b-ec1d-4d2b-8b84-31d983c1b554\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-cjj52" Apr 16 16:01:08.428348 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.428312 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f4a2065b-ec1d-4d2b-8b84-31d983c1b554-lib-modules\") pod \"perf-node-gather-daemonset-cjj52\" (UID: \"f4a2065b-ec1d-4d2b-8b84-31d983c1b554\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-cjj52" Apr 16 16:01:08.428518 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.428364 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4a2065b-ec1d-4d2b-8b84-31d983c1b554-sys\") pod \"perf-node-gather-daemonset-cjj52\" (UID: \"f4a2065b-ec1d-4d2b-8b84-31d983c1b554\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-cjj52" Apr 16 16:01:08.428518 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.428399 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/f4a2065b-ec1d-4d2b-8b84-31d983c1b554-proc\") pod \"perf-node-gather-daemonset-cjj52\" (UID: \"f4a2065b-ec1d-4d2b-8b84-31d983c1b554\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-cjj52" Apr 16 16:01:08.428518 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.428433 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f4a2065b-ec1d-4d2b-8b84-31d983c1b554-podres\") pod \"perf-node-gather-daemonset-cjj52\" (UID: \"f4a2065b-ec1d-4d2b-8b84-31d983c1b554\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-cjj52" Apr 16 16:01:08.428518 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.428470 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pqt87\" (UniqueName: \"kubernetes.io/projected/f4a2065b-ec1d-4d2b-8b84-31d983c1b554-kube-api-access-pqt87\") pod \"perf-node-gather-daemonset-cjj52\" (UID: \"f4a2065b-ec1d-4d2b-8b84-31d983c1b554\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-cjj52" Apr 16 16:01:08.428518 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.428487 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f4a2065b-ec1d-4d2b-8b84-31d983c1b554-lib-modules\") pod \"perf-node-gather-daemonset-cjj52\" (UID: \"f4a2065b-ec1d-4d2b-8b84-31d983c1b554\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-cjj52" Apr 16 16:01:08.428518 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.428495 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4a2065b-ec1d-4d2b-8b84-31d983c1b554-sys\") pod \"perf-node-gather-daemonset-cjj52\" (UID: \"f4a2065b-ec1d-4d2b-8b84-31d983c1b554\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-cjj52" Apr 16 16:01:08.428866 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.428532 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f4a2065b-ec1d-4d2b-8b84-31d983c1b554-proc\") pod \"perf-node-gather-daemonset-cjj52\" (UID: \"f4a2065b-ec1d-4d2b-8b84-31d983c1b554\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-cjj52" Apr 16 16:01:08.428866 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.428547 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f4a2065b-ec1d-4d2b-8b84-31d983c1b554-podres\") pod \"perf-node-gather-daemonset-cjj52\" (UID: \"f4a2065b-ec1d-4d2b-8b84-31d983c1b554\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-cjj52" Apr 16 16:01:08.436160 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.436133 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqt87\" (UniqueName: \"kubernetes.io/projected/f4a2065b-ec1d-4d2b-8b84-31d983c1b554-kube-api-access-pqt87\") pod \"perf-node-gather-daemonset-cjj52\" (UID: \"f4a2065b-ec1d-4d2b-8b84-31d983c1b554\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-cjj52" Apr 16 16:01:08.609900 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.609795 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-cjj52" Apr 16 16:01:08.745566 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.745418 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gvdx8/perf-node-gather-daemonset-cjj52"] Apr 16 16:01:08.751376 ip-10-0-139-101 kubenswrapper[2582]: W0416 16:01:08.749699 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf4a2065b_ec1d_4d2b_8b84_31d983c1b554.slice/crio-3ff668869d2fdfce5ed4ae8a9cab099c9ea06bc2c88bfa37d1c3700780487a5b WatchSource:0}: Error finding container 3ff668869d2fdfce5ed4ae8a9cab099c9ea06bc2c88bfa37d1c3700780487a5b: Status 404 returned error can't find the container with id 3ff668869d2fdfce5ed4ae8a9cab099c9ea06bc2c88bfa37d1c3700780487a5b Apr 16 16:01:08.871903 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.871838 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8pv4t_4bcbf238-7f5c-47ac-b42a-0e299ed29df0/dns/0.log" Apr 16 16:01:08.889034 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.889010 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8pv4t_4bcbf238-7f5c-47ac-b42a-0e299ed29df0/kube-rbac-proxy/0.log" Apr 16 16:01:08.925622 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.925591 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-cjj52" event={"ID":"f4a2065b-ec1d-4d2b-8b84-31d983c1b554","Type":"ContainerStarted","Data":"2feec9df1c1ac601fff137d5b09d192f41ff9213fe102faac53b95a180cb5414"} Apr 16 16:01:08.925725 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.925622 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-cjj52" event={"ID":"f4a2065b-ec1d-4d2b-8b84-31d983c1b554","Type":"ContainerStarted","Data":"3ff668869d2fdfce5ed4ae8a9cab099c9ea06bc2c88bfa37d1c3700780487a5b"} Apr 16 16:01:08.925725 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.925709 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-cjj52" Apr 16 16:01:08.941043 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:08.941004 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-cjj52" podStartSLOduration=0.940991542 podStartE2EDuration="940.991542ms" podCreationTimestamp="2026-04-16 16:01:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:01:08.939298347 +0000 UTC m=+4118.435625576" watchObservedRunningTime="2026-04-16 16:01:08.940991542 +0000 UTC m=+4118.437318772" Apr 16 16:01:09.030521 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:09.030494 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-z8k77_d26e8a35-54b1-4862-87ad-cab47e12e62d/dns-node-resolver/0.log" Apr 16 16:01:09.473168 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:09.473140 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-t4w4v_6157833b-8b66-48ab-a248-7d79d51cec48/node-ca/0.log" Apr 16 16:01:10.115931 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:10.115902 2582 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_router-default-b9d79f856-c56q2_5b6bb918-ac04-4256-9242-dd810c7e754e/router/0.log" Apr 16 16:01:10.427235 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:10.427212 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-jh4sv_87ff3a2a-3409-4acc-8192-f4db952ccdcf/serve-healthcheck-canary/0.log" Apr 16 16:01:10.783953 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:10.783858 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-n7xxg_605f32ed-53b6-48df-8568-937ded360dd9/insights-operator/0.log" Apr 16 16:01:10.786212 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:10.786181 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-n7xxg_605f32ed-53b6-48df-8568-937ded360dd9/insights-operator/1.log" Apr 16 16:01:10.862159 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:10.862133 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mh42f_141f077c-7b80-4110-8cea-be4a0d2339c7/kube-rbac-proxy/0.log" Apr 16 16:01:10.881079 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:10.881055 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mh42f_141f077c-7b80-4110-8cea-be4a0d2339c7/exporter/0.log" Apr 16 16:01:10.909660 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:10.909632 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mh42f_141f077c-7b80-4110-8cea-be4a0d2339c7/extractor/0.log" Apr 16 16:01:13.286076 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:13.286028 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-sv99g_33eee4bc-1e4b-479d-921f-cab6fecdb99f/s3-init/0.log" Apr 16 16:01:13.306611 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:13.306588 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-7j5f2_a6f42147-3fde-4faa-b673-ce0560ee184e/s3-tls-init-custom/0.log" Apr 16 16:01:13.326273 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:13.326251 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-j7b6z_23f5e5fa-c9d8-451e-8d54-5fcd2a7172fc/s3-tls-init-serving/0.log" Apr 16 16:01:14.939090 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:14.939063 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-cjj52" Apr 16 16:01:17.373602 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:17.373493 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-nssxt_c8cffb83-efd8-43b7-9c39-bf80135be6c8/migrator/0.log" Apr 16 16:01:17.392089 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:17.392061 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-nssxt_c8cffb83-efd8-43b7-9c39-bf80135be6c8/graceful-termination/0.log" Apr 16 16:01:18.792506 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:18.792476 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g2w5z_842ebc47-2cf2-4818-ab9a-d2f17f00f1da/kube-multus-additional-cni-plugins/0.log" Apr 16 16:01:18.812205 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:18.812180 2582 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g2w5z_842ebc47-2cf2-4818-ab9a-d2f17f00f1da/egress-router-binary-copy/0.log" Apr 16 16:01:18.832345 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:18.832264 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g2w5z_842ebc47-2cf2-4818-ab9a-d2f17f00f1da/cni-plugins/0.log" Apr 16 16:01:18.849960 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:18.849937 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g2w5z_842ebc47-2cf2-4818-ab9a-d2f17f00f1da/bond-cni-plugin/0.log" Apr 16 16:01:18.866886 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:18.866843 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g2w5z_842ebc47-2cf2-4818-ab9a-d2f17f00f1da/routeoverride-cni/0.log" Apr 16 16:01:18.884886 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:18.884860 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g2w5z_842ebc47-2cf2-4818-ab9a-d2f17f00f1da/whereabouts-cni-bincopy/0.log" Apr 16 16:01:18.902651 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:18.902629 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g2w5z_842ebc47-2cf2-4818-ab9a-d2f17f00f1da/whereabouts-cni/0.log" Apr 16 16:01:19.147632 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:19.147603 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dm6rx_e0e9dc65-caa5-437c-b911-96a930ff75fe/kube-multus/0.log" Apr 16 16:01:19.193649 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:19.193622 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6ltjv_3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e/network-metrics-daemon/0.log" Apr 16 16:01:19.211068 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:19.211035 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6ltjv_3062c2d2-b0a0-4572-ab0f-7dfd8c8df85e/kube-rbac-proxy/0.log" Apr 16 16:01:20.620268 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:20.620229 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-controller/0.log" Apr 16 16:01:20.635053 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:20.634981 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/0.log" Apr 16 16:01:20.672913 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:20.672885 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovn-acl-logging/1.log" Apr 16 16:01:20.693844 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:20.693808 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/kube-rbac-proxy-node/0.log" Apr 16 16:01:20.716782 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:20.716764 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 16:01:20.733138 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:20.733099 2582 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/northd/0.log" Apr 16 16:01:20.750960 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:20.750922 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/nbdb/0.log" Apr 16 16:01:20.768100 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:20.768078 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/sbdb/0.log" Apr 16 16:01:20.967645 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:20.967611 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fzdmh_08f2c22e-77d0-4250-843e-95a65b09af16/ovnkube-controller/0.log" Apr 16 16:01:21.945901 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:21.945869 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-s7ggn_7a22f03f-cc7f-4423-a17c-e15045b60dbe/check-endpoints/0.log" Apr 16 16:01:22.003675 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:22.003649 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-zkgj2_8c2c36da-9263-4f56-8f34-dd26c0ce00c9/network-check-target-container/0.log" Apr 16 16:01:22.843525 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:22.843496 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-6xl5n_e4406ba0-59cd-4412-bdfe-3284d83e48a7/iptables-alerter/0.log" Apr 16 16:01:23.474737 ip-10-0-139-101 kubenswrapper[2582]: I0416 16:01:23.474702 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-42wsp_e0b17a8e-2ea7-490f-8ad4-c7ab6d453eea/tuned/0.log"