Apr 17 14:04:42.183495 ip-10-0-138-158 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 14:04:42.183506 ip-10-0-138-158 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 14:04:42.183513 ip-10-0-138-158 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 14:04:42.183723 ip-10-0-138-158 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 14:04:52.313064 ip-10-0-138-158 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 14:04:52.313084 ip-10-0-138-158 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot bf7a2486082d40d59552fe1eaa683700 --
Apr 17 14:07:36.244809 ip-10-0-138-158 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 14:07:36.694288 ip-10-0-138-158 kubenswrapper[2568]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 14:07:36.694288 ip-10-0-138-158 kubenswrapper[2568]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 14:07:36.694288 ip-10-0-138-158 kubenswrapper[2568]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 14:07:36.694288 ip-10-0-138-158 kubenswrapper[2568]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 14:07:36.694288 ip-10-0-138-158 kubenswrapper[2568]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 14:07:36.696329 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.696233 2568 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 14:07:36.698671 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698649 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:07:36.698671 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698666 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:07:36.698671 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698671 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:07:36.698671 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698675 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:07:36.698671 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698679 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:07:36.698951 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698684 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:07:36.698951 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698688 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:07:36.698951 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698693 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:07:36.698951 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698697 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:07:36.698951 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698702 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:07:36.698951 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698707 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:07:36.698951 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698711 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:07:36.698951 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698715 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:07:36.698951 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698719 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:07:36.698951 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698723 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:07:36.698951 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698727 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:07:36.698951 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698731 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:07:36.698951 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698734 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:07:36.698951 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698738 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:07:36.698951 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698742 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:07:36.698951 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698747 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:07:36.698951 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698751 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:07:36.698951 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698755 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:07:36.698951 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698759 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:07:36.699775 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698770 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:07:36.699775 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698774 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:07:36.699775 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698778 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:07:36.699775 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698783 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:07:36.699775 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698787 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:07:36.699775 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698791 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:07:36.699775 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698795 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:07:36.699775 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698800 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:07:36.699775 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698804 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:07:36.699775 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698808 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:07:36.699775 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698813 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:07:36.699775 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698817 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:07:36.699775 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698822 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:07:36.699775 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698826 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:07:36.699775 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698831 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:07:36.699775 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698835 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:07:36.699775 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698840 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:07:36.699775 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698846 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:07:36.699775 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698850 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:07:36.699775 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698854 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:07:36.700465 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698858 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:07:36.700465 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698862 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:07:36.700465 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698867 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:07:36.700465 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698871 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:07:36.700465 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698875 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:07:36.700465 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698880 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:07:36.700465 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698885 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:07:36.700465 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698889 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:07:36.700465 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698894 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:07:36.700465 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698898 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:07:36.700465 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698902 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:07:36.700465 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698906 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:07:36.700465 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698910 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:07:36.700465 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698914 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:07:36.700465 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698918 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:07:36.700465 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698922 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:07:36.700465 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698926 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:07:36.700465 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698931 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:07:36.700465 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698935 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:07:36.700465 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698939 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:07:36.701042 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698943 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:07:36.701042 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698950 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:07:36.701042 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698956 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:07:36.701042 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698961 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:07:36.701042 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698965 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:07:36.701042 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698971 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:07:36.701042 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698976 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:07:36.701042 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698982 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:07:36.701042 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.698998 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:07:36.701042 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699005 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:07:36.701042 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699010 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:07:36.701042 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699014 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:07:36.701042 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699019 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:07:36.701042 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699023 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:07:36.701042 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699027 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:07:36.701042 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699031 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:07:36.701042 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699035 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:07:36.701042 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699040 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:07:36.701042 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699044 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:07:36.701042 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699048 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:07:36.701558 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699052 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:07:36.701558 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699056 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:07:36.701558 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699715 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:07:36.701558 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699725 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:07:36.701558 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699731 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:07:36.701558 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699736 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:07:36.701558 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699740 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:07:36.701558 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699744 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:07:36.701558 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699749 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:07:36.701558 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699753 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:07:36.701558 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699758 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:07:36.701558 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699762 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:07:36.701558 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699766 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:07:36.701558 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699771 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:07:36.701558 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699775 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:07:36.701558 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699780 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:07:36.701558 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699784 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:07:36.701558 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699788 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:07:36.701558 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699793 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:07:36.701558 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699797 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:07:36.702301 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699802 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:07:36.702301 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699807 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:07:36.702301 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699813 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:07:36.702301 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699820 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:07:36.702301 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699824 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:07:36.702301 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699829 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:07:36.702301 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699834 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:07:36.702301 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699838 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:07:36.702301 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699844 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:07:36.702301 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699850 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:07:36.702301 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699856 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:07:36.702301 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699860 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:07:36.702301 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699864 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:07:36.702301 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699868 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:07:36.702301 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699873 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:07:36.702301 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699878 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:07:36.702301 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699884 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:07:36.702301 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699888 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:07:36.702301 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699892 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:07:36.703181 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699896 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:07:36.703181 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699901 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:07:36.703181 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699905 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:07:36.703181 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699910 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:07:36.703181 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699914 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:07:36.703181 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699918 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:07:36.703181 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699922 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:07:36.703181 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699926 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:07:36.703181 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699930 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:07:36.703181 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699934 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:07:36.703181 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699938 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:07:36.703181 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699942 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:07:36.703181 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699946 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:07:36.703181 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699950 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:07:36.703181 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699955 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:07:36.703181 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699960 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:07:36.703181 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699964 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:07:36.703181 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699968 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:07:36.703181 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699972 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:07:36.703181 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699976 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:07:36.703694 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699980 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:07:36.703694 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699983 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:07:36.703694 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699988 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:07:36.703694 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699992 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:07:36.703694 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.699996 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:07:36.703694 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.700000 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:07:36.703694 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.700004 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:07:36.703694 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.700009 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:07:36.703694 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.700013 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:07:36.703694 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.700017 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:07:36.703694 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.700022 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:07:36.703694 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.700026 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:07:36.703694 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.700030 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:07:36.703694 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.700035 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:07:36.703694 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.700040 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:07:36.703694 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.700045 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:07:36.703694 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.700050 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:07:36.703694 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.700054 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:07:36.703694 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.700059 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:07:36.703694 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.700063 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:07:36.704279 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.700067 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:07:36.704279 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.700071 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:07:36.704279 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.700076 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:07:36.704279 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.700081 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:07:36.704279 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.700085 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:07:36.704279 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.700089 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:07:36.704279 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.700093 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:07:36.704279 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.700098 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:07:36.704279 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.700103 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:07:36.704279 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701721 2568 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 14:07:36.704279 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701740 2568 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 14:07:36.704279 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701759 2568 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 14:07:36.704279 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701766 2568 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 14:07:36.704279 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701773 2568 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 14:07:36.704279 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701779 2568 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 14:07:36.704279 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701787 2568 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 14:07:36.704279 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701794 2568 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 14:07:36.704279 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701799 2568 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 14:07:36.704279 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701805 2568 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 14:07:36.704279 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701812 2568 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 14:07:36.704279 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701817 2568 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 14:07:36.704953 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701823 2568 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 14:07:36.704953 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701828 2568 flags.go:64] FLAG: --cgroup-root=""
Apr 17 14:07:36.704953 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701833 2568 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 14:07:36.704953 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701837 2568 flags.go:64] FLAG: --client-ca-file=""
Apr 17 14:07:36.704953 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701842 2568 flags.go:64] FLAG: --cloud-config=""
Apr 17 14:07:36.704953 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701847 2568 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 14:07:36.704953 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701852 2568 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 14:07:36.704953 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701859 2568 flags.go:64] FLAG: --cluster-domain=""
Apr 17 14:07:36.704953 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701864 2568 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 14:07:36.704953 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701870 2568 flags.go:64] FLAG: --config-dir=""
Apr 17 14:07:36.704953 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701874 2568 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 14:07:36.704953 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701880 2568 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 14:07:36.704953 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701888 2568 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 14:07:36.704953 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701893 2568 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 14:07:36.704953 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701898 2568 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 14:07:36.704953 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701903 2568 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 14:07:36.704953 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701909 2568 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 14:07:36.704953 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701913 2568 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 14:07:36.704953 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701918 2568 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 14:07:36.704953 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701924 2568 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 14:07:36.704953 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701929 2568 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 14:07:36.704953 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701936 2568 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 17 14:07:36.704953 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701941 2568 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 17 14:07:36.704953 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701945 2568 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 17 14:07:36.704953 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701950 2568 flags.go:64] FLAG: --enable-load-reader="false" Apr 17 14:07:36.705698 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701956 2568 flags.go:64] FLAG: --enable-server="true" Apr 17 14:07:36.705698 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701961 2568 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 17 14:07:36.705698 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701969 2568 flags.go:64] FLAG: --event-burst="100" Apr 17 14:07:36.705698 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701974 2568 flags.go:64] FLAG: --event-qps="50" Apr 17 14:07:36.705698 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701979 2568 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 17 14:07:36.705698 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701984 2568 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 17 14:07:36.705698 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701990 2568 flags.go:64] FLAG: --eviction-hard="" Apr 17 14:07:36.705698 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.701996 2568 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 17 14:07:36.705698 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702001 2568 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 17 14:07:36.705698 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702006 2568 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 17 14:07:36.705698 ip-10-0-138-158 kubenswrapper[2568]: I0417 
14:07:36.702011 2568 flags.go:64] FLAG: --eviction-soft="" Apr 17 14:07:36.705698 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702016 2568 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 17 14:07:36.705698 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702021 2568 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 17 14:07:36.705698 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702026 2568 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 14:07:36.705698 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702031 2568 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 14:07:36.705698 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702037 2568 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 14:07:36.705698 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702042 2568 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 14:07:36.705698 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702047 2568 flags.go:64] FLAG: --feature-gates="" Apr 17 14:07:36.705698 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702053 2568 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 14:07:36.705698 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702058 2568 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 14:07:36.705698 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702063 2568 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 14:07:36.705698 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702068 2568 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 14:07:36.705698 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702073 2568 flags.go:64] FLAG: --healthz-port="10248" Apr 17 14:07:36.705698 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702079 2568 flags.go:64] FLAG: --help="false" Apr 17 14:07:36.705698 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702084 2568 flags.go:64] FLAG: --hostname-override="ip-10-0-138-158.ec2.internal" Apr 17 
14:07:36.706393 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702089 2568 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 14:07:36.706393 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702094 2568 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 14:07:36.706393 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702100 2568 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 14:07:36.706393 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702106 2568 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 14:07:36.706393 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702112 2568 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 14:07:36.706393 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702117 2568 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 14:07:36.706393 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702121 2568 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 14:07:36.706393 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702126 2568 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 14:07:36.706393 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702131 2568 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 14:07:36.706393 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702136 2568 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 14:07:36.706393 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702141 2568 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 14:07:36.706393 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702146 2568 flags.go:64] FLAG: --kube-reserved="" Apr 17 14:07:36.706393 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702152 2568 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 14:07:36.706393 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702156 2568 
flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 14:07:36.706393 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702161 2568 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 14:07:36.706393 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702166 2568 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 14:07:36.706393 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702171 2568 flags.go:64] FLAG: --lock-file="" Apr 17 14:07:36.706393 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702175 2568 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 14:07:36.706393 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702180 2568 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 14:07:36.706393 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702185 2568 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 14:07:36.706393 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702194 2568 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 14:07:36.706393 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702200 2568 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 14:07:36.706393 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702204 2568 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 14:07:36.706393 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702209 2568 flags.go:64] FLAG: --logging-format="text" Apr 17 14:07:36.706986 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702214 2568 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 14:07:36.706986 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702219 2568 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 14:07:36.706986 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702224 2568 flags.go:64] FLAG: --manifest-url="" Apr 17 14:07:36.706986 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702229 2568 flags.go:64] FLAG: --manifest-url-header="" Apr 17 14:07:36.706986 
ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702236 2568 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 14:07:36.706986 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702241 2568 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 14:07:36.706986 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702248 2568 flags.go:64] FLAG: --max-pods="110" Apr 17 14:07:36.706986 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702252 2568 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 14:07:36.706986 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702258 2568 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 14:07:36.706986 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702262 2568 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 14:07:36.706986 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702267 2568 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 14:07:36.706986 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702273 2568 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 14:07:36.706986 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702278 2568 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 14:07:36.706986 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702283 2568 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 14:07:36.706986 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702296 2568 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 14:07:36.706986 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702300 2568 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 14:07:36.706986 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702305 2568 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 14:07:36.706986 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702310 2568 flags.go:64] FLAG: --pod-cidr="" Apr 17 14:07:36.706986 ip-10-0-138-158 kubenswrapper[2568]: I0417 
14:07:36.702315 2568 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 14:07:36.706986 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702324 2568 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 14:07:36.706986 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702328 2568 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 14:07:36.706986 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702334 2568 flags.go:64] FLAG: --pods-per-core="0" Apr 17 14:07:36.706986 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702338 2568 flags.go:64] FLAG: --port="10250" Apr 17 14:07:36.706986 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702343 2568 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 14:07:36.707613 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702348 2568 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-02eea1f55856caaf2" Apr 17 14:07:36.707613 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702353 2568 flags.go:64] FLAG: --qos-reserved="" Apr 17 14:07:36.707613 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702358 2568 flags.go:64] FLAG: --read-only-port="10255" Apr 17 14:07:36.707613 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702363 2568 flags.go:64] FLAG: --register-node="true" Apr 17 14:07:36.707613 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702371 2568 flags.go:64] FLAG: --register-schedulable="true" Apr 17 14:07:36.707613 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702376 2568 flags.go:64] FLAG: --register-with-taints="" Apr 17 14:07:36.707613 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702382 2568 flags.go:64] FLAG: --registry-burst="10" Apr 17 14:07:36.707613 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702386 2568 flags.go:64] FLAG: --registry-qps="5" Apr 17 14:07:36.707613 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702391 2568 flags.go:64] 
FLAG: --reserved-cpus="" Apr 17 14:07:36.707613 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702396 2568 flags.go:64] FLAG: --reserved-memory="" Apr 17 14:07:36.707613 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702402 2568 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 14:07:36.707613 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702406 2568 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 14:07:36.707613 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702412 2568 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 14:07:36.707613 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702416 2568 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 14:07:36.707613 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702421 2568 flags.go:64] FLAG: --runonce="false" Apr 17 14:07:36.707613 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702426 2568 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 14:07:36.707613 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702431 2568 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 14:07:36.707613 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702436 2568 flags.go:64] FLAG: --seccomp-default="false" Apr 17 14:07:36.707613 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702441 2568 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 14:07:36.707613 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702446 2568 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 14:07:36.707613 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702452 2568 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 14:07:36.707613 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702457 2568 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 14:07:36.707613 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702462 2568 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 14:07:36.707613 ip-10-0-138-158 
kubenswrapper[2568]: I0417 14:07:36.702467 2568 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 14:07:36.707613 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702471 2568 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 14:07:36.707613 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702477 2568 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 14:07:36.708229 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702481 2568 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 14:07:36.708229 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702486 2568 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 14:07:36.708229 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702491 2568 flags.go:64] FLAG: --system-cgroups="" Apr 17 14:07:36.708229 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702496 2568 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 14:07:36.708229 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702520 2568 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 14:07:36.708229 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702524 2568 flags.go:64] FLAG: --tls-cert-file="" Apr 17 14:07:36.708229 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702529 2568 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 14:07:36.708229 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702535 2568 flags.go:64] FLAG: --tls-min-version="" Apr 17 14:07:36.708229 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702540 2568 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 14:07:36.708229 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702544 2568 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 14:07:36.708229 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702551 2568 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 14:07:36.708229 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702556 2568 flags.go:64] FLAG: 
--topology-manager-scope="container" Apr 17 14:07:36.708229 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702561 2568 flags.go:64] FLAG: --v="2" Apr 17 14:07:36.708229 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702568 2568 flags.go:64] FLAG: --version="false" Apr 17 14:07:36.708229 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702574 2568 flags.go:64] FLAG: --vmodule="" Apr 17 14:07:36.708229 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702581 2568 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 14:07:36.708229 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.702586 2568 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 14:07:36.708229 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702736 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 14:07:36.708229 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702744 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 14:07:36.708229 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702749 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 17 14:07:36.708229 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702754 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 14:07:36.708229 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702758 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 14:07:36.708229 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702763 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 14:07:36.708229 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702767 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 14:07:36.708865 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702771 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 14:07:36.708865 ip-10-0-138-158 
kubenswrapper[2568]: W0417 14:07:36.702776 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 14:07:36.708865 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702781 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 14:07:36.708865 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702786 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 14:07:36.708865 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702790 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 14:07:36.708865 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702795 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 14:07:36.708865 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702799 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 14:07:36.708865 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702803 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 14:07:36.708865 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702807 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 14:07:36.708865 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702811 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 14:07:36.708865 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702816 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 14:07:36.708865 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702821 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 14:07:36.708865 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702825 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 14:07:36.708865 ip-10-0-138-158 kubenswrapper[2568]: W0417 
14:07:36.702829 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 14:07:36.708865 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702833 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 14:07:36.708865 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702837 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 14:07:36.708865 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702842 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 14:07:36.708865 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702846 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 14:07:36.708865 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702852 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 14:07:36.709386 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702856 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 14:07:36.709386 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702860 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 14:07:36.709386 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702864 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 14:07:36.709386 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702869 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 14:07:36.709386 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702873 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 14:07:36.709386 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702877 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 14:07:36.709386 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702881 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 14:07:36.709386 
ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702886 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 14:07:36.709386 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702890 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 14:07:36.709386 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702894 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 14:07:36.709386 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702898 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 14:07:36.709386 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702902 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 14:07:36.709386 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702906 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 14:07:36.709386 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702911 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 14:07:36.709386 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702915 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 14:07:36.709386 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702921 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 14:07:36.709386 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702925 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 14:07:36.709386 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702930 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 14:07:36.709386 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702935 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 14:07:36.709386 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702939 2568 feature_gate.go:328] unrecognized 
feature gate: GCPCustomAPIEndpointsInstall Apr 17 14:07:36.709915 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702943 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 14:07:36.709915 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702947 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 14:07:36.709915 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702951 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 14:07:36.709915 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702956 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 14:07:36.709915 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702960 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 14:07:36.709915 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702964 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 14:07:36.709915 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702968 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 14:07:36.709915 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702972 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 14:07:36.709915 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702976 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 14:07:36.709915 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702980 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 14:07:36.709915 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702984 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 14:07:36.709915 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702990 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 14:07:36.709915 
ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.702997 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 14:07:36.709915 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.703003 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 14:07:36.709915 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.703008 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 14:07:36.709915 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.703012 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 14:07:36.709915 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.703017 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 14:07:36.709915 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.703023 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 14:07:36.709915 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.703029 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:07:36.710387 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.703034 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:07:36.710387 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.703039 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:07:36.710387 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.703043 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:07:36.710387 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.703048 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:07:36.710387 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.703052 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:07:36.710387 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.703056 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:07:36.710387 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.703060 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:07:36.710387 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.703064 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:07:36.710387 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.703071 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:07:36.710387 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.703075 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:07:36.710387 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.703079 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:07:36.710387 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.703084 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:07:36.710387 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.703089 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:07:36.710387 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.703093 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:07:36.710387 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.703097 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:07:36.710387 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.703101 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:07:36.710387 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.703105 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:07:36.710387 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.703109 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:07:36.710387 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.703113 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:07:36.710387 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.703118 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:07:36.710882 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.703122 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:07:36.710882 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.703774 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 14:07:36.711306 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.711287 2568 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 14:07:36.711334 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.711307 2568 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 14:07:36.711378 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711369 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:07:36.711378 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711378 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:07:36.711434 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711381 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:07:36.711434 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711384 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:07:36.711434 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711387 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:07:36.711434 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711390 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:07:36.711434 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711392 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:07:36.711434 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711395 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:07:36.711434 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711398 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:07:36.711434 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711400 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:07:36.711434 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711403 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:07:36.711434 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711406 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:07:36.711434 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711409 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:07:36.711434 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711424 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:07:36.711434 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711427 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:07:36.711434 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711430 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:07:36.711434 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711434 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:07:36.711434 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711437 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:07:36.711434 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711440 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:07:36.711877 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711444 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:07:36.711877 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711447 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:07:36.711877 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711451 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:07:36.711877 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711453 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:07:36.711877 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711456 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:07:36.711877 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711459 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:07:36.711877 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711462 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:07:36.711877 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711465 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:07:36.711877 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711468 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:07:36.711877 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711470 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:07:36.711877 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711473 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:07:36.711877 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711475 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:07:36.711877 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711478 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:07:36.711877 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711480 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:07:36.711877 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711483 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:07:36.711877 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711485 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:07:36.711877 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711488 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:07:36.711877 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711490 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:07:36.711877 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711493 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:07:36.712348 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711495 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:07:36.712348 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711498 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:07:36.712348 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711500 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:07:36.712348 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711517 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:07:36.712348 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711519 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:07:36.712348 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711522 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:07:36.712348 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711524 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:07:36.712348 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711527 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:07:36.712348 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711530 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:07:36.712348 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711532 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:07:36.712348 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711535 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:07:36.712348 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711538 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:07:36.712348 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711540 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:07:36.712348 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711543 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:07:36.712348 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711548 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:07:36.712348 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711551 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:07:36.712348 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711553 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:07:36.712348 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711556 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:07:36.712348 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711559 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:07:36.712348 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711561 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:07:36.712902 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711564 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:07:36.712902 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711566 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:07:36.712902 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711569 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:07:36.712902 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711572 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:07:36.712902 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711574 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:07:36.712902 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711577 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:07:36.712902 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711579 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:07:36.712902 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711582 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:07:36.712902 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711586 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:07:36.712902 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711590 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:07:36.712902 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711593 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:07:36.712902 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711595 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:07:36.712902 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711598 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:07:36.712902 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711601 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:07:36.712902 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711604 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:07:36.712902 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711607 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:07:36.712902 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711610 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:07:36.712902 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711612 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:07:36.712902 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711615 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:07:36.713361 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711619 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:07:36.713361 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711623 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:07:36.713361 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711626 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:07:36.713361 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711629 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:07:36.713361 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711631 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:07:36.713361 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711634 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:07:36.713361 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711637 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:07:36.713361 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711640 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:07:36.713361 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711644 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:07:36.713361 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.711650 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 14:07:36.713361 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711746 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:07:36.713361 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711751 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:07:36.713361 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711754 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:07:36.713361 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711757 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:07:36.713361 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711760 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:07:36.713764 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711763 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:07:36.713764 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711765 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:07:36.713764 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711769 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:07:36.713764 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711773 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:07:36.713764 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711776 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:07:36.713764 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711779 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:07:36.713764 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711781 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:07:36.713764 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711784 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:07:36.713764 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711786 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:07:36.713764 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711789 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:07:36.713764 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711792 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:07:36.713764 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711794 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:07:36.713764 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711797 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:07:36.713764 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711799 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:07:36.713764 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711802 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:07:36.713764 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711804 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:07:36.713764 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711820 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:07:36.713764 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711824 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:07:36.713764 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711827 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:07:36.714217 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711830 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:07:36.714217 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711833 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:07:36.714217 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711835 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:07:36.714217 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711838 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:07:36.714217 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711841 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:07:36.714217 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711844 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:07:36.714217 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711847 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:07:36.714217 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711851 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:07:36.714217 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711854 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:07:36.714217 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711857 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:07:36.714217 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711859 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:07:36.714217 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711862 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:07:36.714217 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711865 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:07:36.714217 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711868 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:07:36.714217 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711870 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:07:36.714217 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711873 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:07:36.714217 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711876 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:07:36.714217 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711879 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:07:36.714217 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711881 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:07:36.714217 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711884 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:07:36.714725 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711887 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:07:36.714725 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711889 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:07:36.714725 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711892 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:07:36.714725 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711895 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:07:36.714725 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711898 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:07:36.714725 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711900 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:07:36.714725 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711903 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:07:36.714725 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711905 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:07:36.714725 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711908 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:07:36.714725 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711911 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:07:36.714725 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711913 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:07:36.714725 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711916 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:07:36.714725 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711918 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:07:36.714725 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711921 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:07:36.714725 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711923 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:07:36.714725 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711925 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:07:36.714725 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711928 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:07:36.714725 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711931 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:07:36.714725 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711934 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:07:36.714725 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711936 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:07:36.715211 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711940 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:07:36.715211 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711942 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:07:36.715211 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711945 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:07:36.715211 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711947 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:07:36.715211 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711950 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:07:36.715211 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711952 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:07:36.715211 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711955 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:07:36.715211 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711957 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:07:36.715211 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711960 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:07:36.715211 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711963 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:07:36.715211 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711966 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:07:36.715211 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711968 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:07:36.715211 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711971 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:07:36.715211 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711973 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:07:36.715211 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711976 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:07:36.715211 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711979 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:07:36.715211 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711982 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:07:36.715211 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711984 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:07:36.715211 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711988 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:07:36.715211 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711991 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:07:36.715775 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711994 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:07:36.715775 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:36.711997 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:07:36.715775 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.712001 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 14:07:36.715775 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.712772 2568 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 14:07:36.715775 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.715424 2568 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 14:07:36.716447 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.716436 2568 server.go:1019] "Starting client certificate rotation"
Apr 17 14:07:36.716558 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.716532 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 14:07:36.716656 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.716573 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 14:07:36.742268 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.742252 2568 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 14:07:36.748418 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.748399 2568 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 14:07:36.769042 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.769023 2568 log.go:25] "Validated CRI v1 runtime API"
Apr 17 14:07:36.774697 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.774681 2568 log.go:25] "Validated CRI v1 image API"
Apr 17 14:07:36.775953 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.775937 2568 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 14:07:36.777075 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.777053 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 14:07:36.778390 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.778371 2568 fs.go:135] Filesystem UUIDs: map[4ee22a7c-c31c-443d-8b7c-3aab6699d7cb:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 f226f477-c023-4925-9dc9-4e56a8770f6d:/dev/nvme0n1p3]
Apr 17 14:07:36.778452 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.778389 2568 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 14:07:36.784131 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.784029 2568 manager.go:217] Machine: {Timestamp:2026-04-17 14:07:36.782008002 +0000 UTC m=+0.418585394 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100634 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec217e3f6d00237aa24b64ee817e778f SystemUUID:ec217e3f-6d00-237a-a24b-64ee817e778f BootID:bf7a2486-082d-40d5-9552-fe1eaa683700 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:10:58:5f:56:17 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:10:58:5f:56:17 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:46:61:fd:cc:09:1f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 14:07:36.784131 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.784131 2568 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 14:07:36.784268 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.784209 2568 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 14:07:36.786804 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.786780 2568 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 14:07:36.786933 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.786806 2568 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-158.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 14:07:36.786974 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.786942 2568 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 14:07:36.786974 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.786950 2568 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 14:07:36.786974 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.786963
2568 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 14:07:36.787060 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.786975 2568 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 14:07:36.788368 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.788358 2568 state_mem.go:36] "Initialized new in-memory state store" Apr 17 14:07:36.788478 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.788469 2568 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 14:07:36.791037 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.791028 2568 kubelet.go:491] "Attempting to sync node with API server" Apr 17 14:07:36.791072 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.791042 2568 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 14:07:36.791072 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.791054 2568 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 14:07:36.791072 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.791064 2568 kubelet.go:397] "Adding apiserver pod source" Apr 17 14:07:36.791162 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.791082 2568 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 14:07:36.792214 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.792203 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 14:07:36.792252 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.792221 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 14:07:36.794299 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.794280 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-27bfh" Apr 17 14:07:36.795394 ip-10-0-138-158 
kubenswrapper[2568]: I0417 14:07:36.795380 2568 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 14:07:36.797154 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.797140 2568 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 14:07:36.798554 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.798541 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 14:07:36.798599 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.798559 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 14:07:36.798599 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.798566 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 14:07:36.798599 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.798571 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 14:07:36.798599 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.798577 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 14:07:36.798599 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.798582 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 14:07:36.798599 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.798588 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 14:07:36.798599 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.798594 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 14:07:36.798599 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.798600 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 14:07:36.798802 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.798606 2568 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 14:07:36.798802 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.798614 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 14:07:36.798802 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.798623 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 14:07:36.800517 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.800489 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 14:07:36.800517 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.800500 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 14:07:36.802998 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.802971 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-27bfh" Apr 17 14:07:36.803876 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:36.803823 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 14:07:36.803876 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:36.803826 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-158.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 14:07:36.804979 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.804963 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-158.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Apr 17 14:07:36.807752 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.807734 2568 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 14:07:36.807846 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.807784 2568 server.go:1295] "Started kubelet" Apr 17 14:07:36.807918 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.807870 2568 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 14:07:36.807968 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.807924 2568 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 14:07:36.808001 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.807990 2568 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 14:07:36.808607 ip-10-0-138-158 systemd[1]: Started Kubernetes Kubelet. Apr 17 14:07:36.809222 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.809207 2568 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 14:07:36.809923 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.809910 2568 server.go:317] "Adding debug handlers to kubelet server" Apr 17 14:07:36.815736 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.815718 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 14:07:36.816182 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.816169 2568 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 14:07:36.816829 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.816813 2568 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 14:07:36.816829 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.816813 2568 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 14:07:36.816962 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.816838 2568 volume_manager.go:297] "Starting 
Kubelet Volume Manager" Apr 17 14:07:36.816962 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.816949 2568 reconstruct.go:97] "Volume reconstruction finished" Apr 17 14:07:36.816962 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.816957 2568 reconciler.go:26] "Reconciler: start to sync state" Apr 17 14:07:36.817125 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:36.817057 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-158.ec2.internal\" not found" Apr 17 14:07:36.817296 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.816972 2568 factory.go:55] Registering systemd factory Apr 17 14:07:36.817296 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.817299 2568 factory.go:223] Registration of the systemd container factory successfully Apr 17 14:07:36.817594 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.817577 2568 factory.go:153] Registering CRI-O factory Apr 17 14:07:36.817690 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.817597 2568 factory.go:223] Registration of the crio container factory successfully Apr 17 14:07:36.817690 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.817655 2568 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 14:07:36.817690 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.817670 2568 factory.go:103] Registering Raw factory Apr 17 14:07:36.817690 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.817685 2568 manager.go:1196] Started watching for new ooms in manager Apr 17 14:07:36.817881 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:36.817761 2568 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 14:07:36.818291 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.818273 2568 manager.go:319] Starting recovery of all containers Apr 17 14:07:36.818393 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.818375 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:07:36.821391 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:36.821368 2568 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-138-158.ec2.internal\" not found" node="ip-10-0-138-158.ec2.internal" Apr 17 14:07:36.828977 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.828836 2568 manager.go:324] Recovery completed Apr 17 14:07:36.833352 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.833332 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:07:36.835763 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.835748 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-158.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:07:36.835823 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.835774 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-158.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:07:36.835823 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.835784 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-158.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:07:36.836230 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.836217 2568 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 14:07:36.836271 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.836231 2568 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 14:07:36.836271 ip-10-0-138-158 
kubenswrapper[2568]: I0417 14:07:36.836251 2568 state_mem.go:36] "Initialized new in-memory state store" Apr 17 14:07:36.838473 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.838461 2568 policy_none.go:49] "None policy: Start" Apr 17 14:07:36.838571 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.838477 2568 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 14:07:36.838571 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.838487 2568 state_mem.go:35] "Initializing new in-memory state store" Apr 17 14:07:36.883093 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.883076 2568 manager.go:341] "Starting Device Plugin manager" Apr 17 14:07:36.893132 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:36.883105 2568 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 14:07:36.893132 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.883115 2568 server.go:85] "Starting device plugin registration server" Apr 17 14:07:36.893132 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.883356 2568 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 14:07:36.893132 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.883367 2568 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 14:07:36.893132 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.883463 2568 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 14:07:36.893132 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.883572 2568 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 14:07:36.893132 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.883582 2568 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 14:07:36.893132 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:36.884053 2568 eviction_manager.go:267] "eviction manager: failed to 
check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 14:07:36.893132 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:36.884089 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-158.ec2.internal\" not found" Apr 17 14:07:36.943178 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.943139 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 14:07:36.944386 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.944341 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 14:07:36.944386 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.944365 2568 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 14:07:36.944386 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.944381 2568 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 14:07:36.944581 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.944390 2568 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 14:07:36.944581 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:36.944422 2568 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 14:07:36.948847 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.948830 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:07:36.983715 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.983684 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:07:36.984642 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.984624 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-158.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:07:36.984723 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.984651 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-158.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:07:36.984723 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.984663 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-158.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:07:36.984723 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.984690 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-158.ec2.internal" Apr 17 14:07:36.995129 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:36.995114 2568 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-158.ec2.internal" Apr 17 14:07:36.995174 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:36.995134 2568 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-158.ec2.internal\": node \"ip-10-0-138-158.ec2.internal\" not found" Apr 17 
14:07:37.010947 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:37.010929 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-158.ec2.internal\" not found" Apr 17 14:07:37.044778 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.044760 2568 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-158.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-158.ec2.internal"] Apr 17 14:07:37.044835 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.044815 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:07:37.045539 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.045518 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-158.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:07:37.045630 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.045548 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-158.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:07:37.045630 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.045558 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-158.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:07:37.047786 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.047774 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:07:37.047931 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.047914 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-158.ec2.internal" Apr 17 14:07:37.047977 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.047944 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:07:37.048455 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.048442 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-158.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:07:37.048455 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.048451 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-158.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:07:37.048581 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.048465 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-158.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:07:37.048581 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.048471 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-158.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:07:37.048581 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.048480 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-158.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:07:37.048581 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.048482 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-158.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:07:37.050617 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.050603 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-158.ec2.internal" Apr 17 14:07:37.050691 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.050625 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:07:37.052165 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.052147 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-158.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:07:37.052243 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.052180 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-158.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:07:37.052243 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.052196 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-158.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:07:37.078112 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:37.078093 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-158.ec2.internal\" not found" node="ip-10-0-138-158.ec2.internal" Apr 17 14:07:37.082426 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:37.082413 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-158.ec2.internal\" not found" node="ip-10-0-138-158.ec2.internal" Apr 17 14:07:37.111987 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:37.111969 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-158.ec2.internal\" not found" Apr 17 14:07:37.117752 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.117733 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/46e6b5815746df0ce219a3af5d4f28a8-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-138-158.ec2.internal\" (UID: \"46e6b5815746df0ce219a3af5d4f28a8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-158.ec2.internal" Apr 17 14:07:37.213088 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:37.213009 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-158.ec2.internal\" not found" Apr 17 14:07:37.218318 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.218299 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/46e6b5815746df0ce219a3af5d4f28a8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-158.ec2.internal\" (UID: \"46e6b5815746df0ce219a3af5d4f28a8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-158.ec2.internal" Apr 17 14:07:37.218380 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.218338 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/46e6b5815746df0ce219a3af5d4f28a8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-158.ec2.internal\" (UID: \"46e6b5815746df0ce219a3af5d4f28a8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-158.ec2.internal" Apr 17 14:07:37.218380 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.218356 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/68c692e25ca4376cc2fb31a74dcd7849-config\") pod \"kube-apiserver-proxy-ip-10-0-138-158.ec2.internal\" (UID: \"68c692e25ca4376cc2fb31a74dcd7849\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-158.ec2.internal" Apr 17 14:07:37.218380 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.218308 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/46e6b5815746df0ce219a3af5d4f28a8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-158.ec2.internal\" (UID: \"46e6b5815746df0ce219a3af5d4f28a8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-158.ec2.internal" Apr 17 14:07:37.313745 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:37.313707 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-158.ec2.internal\" not found" Apr 17 14:07:37.319003 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.318986 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/46e6b5815746df0ce219a3af5d4f28a8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-158.ec2.internal\" (UID: \"46e6b5815746df0ce219a3af5d4f28a8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-158.ec2.internal" Apr 17 14:07:37.319063 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.319012 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/68c692e25ca4376cc2fb31a74dcd7849-config\") pod \"kube-apiserver-proxy-ip-10-0-138-158.ec2.internal\" (UID: \"68c692e25ca4376cc2fb31a74dcd7849\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-158.ec2.internal" Apr 17 14:07:37.319063 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.319034 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/68c692e25ca4376cc2fb31a74dcd7849-config\") pod \"kube-apiserver-proxy-ip-10-0-138-158.ec2.internal\" (UID: \"68c692e25ca4376cc2fb31a74dcd7849\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-158.ec2.internal" Apr 17 14:07:37.319139 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.319066 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/46e6b5815746df0ce219a3af5d4f28a8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-158.ec2.internal\" (UID: \"46e6b5815746df0ce219a3af5d4f28a8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-158.ec2.internal" Apr 17 14:07:37.380200 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.380167 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-158.ec2.internal" Apr 17 14:07:37.384786 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.384756 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-158.ec2.internal" Apr 17 14:07:37.414598 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:37.414563 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-158.ec2.internal\" not found" Apr 17 14:07:37.515199 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:37.515115 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-158.ec2.internal\" not found" Apr 17 14:07:37.615652 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:37.615619 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-158.ec2.internal\" not found" Apr 17 14:07:37.716249 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:37.716223 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-158.ec2.internal\" not found" Apr 17 14:07:37.716740 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.716263 2568 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 14:07:37.716740 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.716385 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 14:07:37.716740 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.716413 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 14:07:37.805096 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.805056 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 14:02:36 +0000 UTC" deadline="2027-12-16 00:19:14.084478413 +0000 UTC" Apr 17 14:07:37.805096 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.805096 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14578h11m36.279386215s" Apr 17 14:07:37.815921 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.815900 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 14:07:37.816331 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:37.816311 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-158.ec2.internal\" not found" Apr 17 14:07:37.825282 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.825264 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 14:07:37.843698 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.843674 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-bk5h4" Apr 17 14:07:37.849769 ip-10-0-138-158 
kubenswrapper[2568]: I0417 14:07:37.849753 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-bk5h4" Apr 17 14:07:37.903604 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:37.903571 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46e6b5815746df0ce219a3af5d4f28a8.slice/crio-96fb1c9bf30eb16aa41467386693719623e9b895dc4b2353f4778ecbfa00f55c WatchSource:0}: Error finding container 96fb1c9bf30eb16aa41467386693719623e9b895dc4b2353f4778ecbfa00f55c: Status 404 returned error can't find the container with id 96fb1c9bf30eb16aa41467386693719623e9b895dc4b2353f4778ecbfa00f55c Apr 17 14:07:37.904780 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:37.904756 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68c692e25ca4376cc2fb31a74dcd7849.slice/crio-bb5be5aa244ef76a1853116bb6af9981c35c22f5dfc68888f12d446e877b4877 WatchSource:0}: Error finding container bb5be5aa244ef76a1853116bb6af9981c35c22f5dfc68888f12d446e877b4877: Status 404 returned error can't find the container with id bb5be5aa244ef76a1853116bb6af9981c35c22f5dfc68888f12d446e877b4877 Apr 17 14:07:37.909005 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.908993 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 14:07:37.916968 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:37.916950 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-158.ec2.internal\" not found" Apr 17 14:07:37.946888 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.946846 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-158.ec2.internal" 
event={"ID":"68c692e25ca4376cc2fb31a74dcd7849","Type":"ContainerStarted","Data":"bb5be5aa244ef76a1853116bb6af9981c35c22f5dfc68888f12d446e877b4877"} Apr 17 14:07:37.947721 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:37.947704 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-158.ec2.internal" event={"ID":"46e6b5815746df0ce219a3af5d4f28a8","Type":"ContainerStarted","Data":"96fb1c9bf30eb16aa41467386693719623e9b895dc4b2353f4778ecbfa00f55c"} Apr 17 14:07:38.018027 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:38.018002 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-158.ec2.internal\" not found" Apr 17 14:07:38.031780 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.031714 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:07:38.118743 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:38.118720 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-158.ec2.internal\" not found" Apr 17 14:07:38.219181 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:38.219155 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-158.ec2.internal\" not found" Apr 17 14:07:38.320034 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:38.319969 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-158.ec2.internal\" not found" Apr 17 14:07:38.398216 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.398193 2568 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:07:38.416896 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.416872 2568 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-158.ec2.internal" Apr 17 14:07:38.428702 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.428682 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 14:07:38.429524 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.429487 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-158.ec2.internal" Apr 17 14:07:38.437971 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.437955 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 14:07:38.588376 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.588303 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:07:38.792863 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.792833 2568 apiserver.go:52] "Watching apiserver" Apr 17 14:07:38.804038 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.804011 2568 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 14:07:38.804377 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.804354 2568 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-dns/node-resolver-ztcfx","openshift-image-registry/node-ca-mq44v","openshift-multus/multus-additional-cni-plugins-778wr","openshift-multus/multus-cmjrq","openshift-ovn-kubernetes/ovnkube-node-brxr6","kube-system/kube-apiserver-proxy-ip-10-0-138-158.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-158.ec2.internal","openshift-multus/network-metrics-daemon-f6d89","openshift-network-diagnostics/network-check-target-mcl6c","openshift-network-operator/iptables-alerter-nczp7","kube-system/konnectivity-agent-5mtb9","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gjdx8","openshift-cluster-node-tuning-operator/tuned-fmns5"] Apr 17 14:07:38.809680 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.809657 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mcl6c" Apr 17 14:07:38.809787 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:38.809738 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mcl6c" podUID="080d6200-63b7-4e65-8d68-ea319212caed" Apr 17 14:07:38.811810 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.811786 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-nczp7" Apr 17 14:07:38.814115 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.814087 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.814218 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.814133 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 14:07:38.814522 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.814315 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 14:07:38.814522 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.814415 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-gj4w6\"" Apr 17 14:07:38.814522 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.814441 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:07:38.816367 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.816346 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 14:07:38.816686 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.816659 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-778wr" Apr 17 14:07:38.816784 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.816757 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 14:07:38.816885 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.816666 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 14:07:38.816980 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.816665 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qqrsd\"" Apr 17 14:07:38.817202 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.817067 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 14:07:38.818632 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.818489 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 14:07:38.818864 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.818845 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wtrxn\"" Apr 17 14:07:38.818957 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.818881 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 14:07:38.819108 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.819088 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.821282 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.821026 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 14:07:38.821282 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.821032 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 14:07:38.821466 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.821439 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:07:38.821563 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:38.821544 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6d89" podUID="497bbc82-edab-4d97-bcc8-7d428e62da1e" Apr 17 14:07:38.821625 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.821549 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-mq44v" Apr 17 14:07:38.821906 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.821888 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 14:07:38.822120 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.822101 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 14:07:38.822187 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.822143 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 14:07:38.822187 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.822179 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 14:07:38.822436 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.822419 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-hm9xv\"" Apr 17 14:07:38.823227 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.823205 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 14:07:38.823405 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.823390 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 14:07:38.823620 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.823605 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-96h9n\"" Apr 17 14:07:38.823691 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.823607 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 14:07:38.823884 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.823865 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ztcfx" Apr 17 14:07:38.825765 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.825747 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 14:07:38.825765 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.825759 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 14:07:38.825906 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.825781 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-dss5c\"" Apr 17 14:07:38.826682 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.826661 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-multus-socket-dir-parent\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.826768 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.826697 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-var-lib-openvswitch\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.826768 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.826722 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-etc-openvswitch\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.826768 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.826744 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-host-cni-netd\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.826915 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.826767 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dccd1bed-f8d5-4c16-968b-e828fa6150a1-env-overrides\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.826915 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.826790 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a2839c1e-60df-4132-aed5-549b23baa1fb-iptables-alerter-script\") pod \"iptables-alerter-nczp7\" (UID: \"a2839c1e-60df-4132-aed5-549b23baa1fb\") " pod="openshift-network-operator/iptables-alerter-nczp7" Apr 17 14:07:38.826915 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.826816 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/79d360cf-60bc-4bbe-ab0a-2832dd974cde-os-release\") pod \"multus-additional-cni-plugins-778wr\" (UID: \"79d360cf-60bc-4bbe-ab0a-2832dd974cde\") " pod="openshift-multus/multus-additional-cni-plugins-778wr" Apr 17 14:07:38.826915 ip-10-0-138-158 kubenswrapper[2568]: 
I0417 14:07:38.826857 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-host-run-k8s-cni-cncf-io\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.826915 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.826895 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-host-var-lib-cni-bin\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.827371 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.827350 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/786788f0-7365-4b9c-9628-78838c53bc50-multus-daemon-config\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.827443 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.827386 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8x8d\" (UniqueName: \"kubernetes.io/projected/786788f0-7365-4b9c-9628-78838c53bc50-kube-api-access-w8x8d\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.827443 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.827405 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-run-systemd\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.827443 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.827420 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/497bbc82-edab-4d97-bcc8-7d428e62da1e-metrics-certs\") pod \"network-metrics-daemon-f6d89\" (UID: \"497bbc82-edab-4d97-bcc8-7d428e62da1e\") " pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:07:38.827609 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.827442 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-multus-cni-dir\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.827609 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.827465 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-host-run-multus-certs\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.827609 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.827481 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-etc-kubernetes\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.827609 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.827520 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/dccd1bed-f8d5-4c16-968b-e828fa6150a1-ovn-node-metrics-cert\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.827609 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.827546 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dccd1bed-f8d5-4c16-968b-e828fa6150a1-ovnkube-script-lib\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.827609 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.827598 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fz28\" (UniqueName: \"kubernetes.io/projected/497bbc82-edab-4d97-bcc8-7d428e62da1e-kube-api-access-9fz28\") pod \"network-metrics-daemon-f6d89\" (UID: \"497bbc82-edab-4d97-bcc8-7d428e62da1e\") " pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:07:38.827882 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.827649 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25rc5\" (UniqueName: \"kubernetes.io/projected/080d6200-63b7-4e65-8d68-ea319212caed-kube-api-access-25rc5\") pod \"network-check-target-mcl6c\" (UID: \"080d6200-63b7-4e65-8d68-ea319212caed\") " pod="openshift-network-diagnostics/network-check-target-mcl6c" Apr 17 14:07:38.827882 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.827680 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a2839c1e-60df-4132-aed5-549b23baa1fb-host-slash\") pod \"iptables-alerter-nczp7\" (UID: \"a2839c1e-60df-4132-aed5-549b23baa1fb\") " pod="openshift-network-operator/iptables-alerter-nczp7" Apr 17 14:07:38.827882 
ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.827707 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/79d360cf-60bc-4bbe-ab0a-2832dd974cde-system-cni-dir\") pod \"multus-additional-cni-plugins-778wr\" (UID: \"79d360cf-60bc-4bbe-ab0a-2832dd974cde\") " pod="openshift-multus/multus-additional-cni-plugins-778wr" Apr 17 14:07:38.827882 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.827732 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/79d360cf-60bc-4bbe-ab0a-2832dd974cde-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-778wr\" (UID: \"79d360cf-60bc-4bbe-ab0a-2832dd974cde\") " pod="openshift-multus/multus-additional-cni-plugins-778wr" Apr 17 14:07:38.827882 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.827757 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlgcj\" (UniqueName: \"kubernetes.io/projected/79d360cf-60bc-4bbe-ab0a-2832dd974cde-kube-api-access-qlgcj\") pod \"multus-additional-cni-plugins-778wr\" (UID: \"79d360cf-60bc-4bbe-ab0a-2832dd974cde\") " pod="openshift-multus/multus-additional-cni-plugins-778wr" Apr 17 14:07:38.827882 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.827806 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/786788f0-7365-4b9c-9628-78838c53bc50-cni-binary-copy\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.827882 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.827841 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-host-var-lib-cni-multus\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.827882 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.827869 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-multus-conf-dir\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.828147 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.827898 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dccd1bed-f8d5-4c16-968b-e828fa6150a1-ovnkube-config\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.828147 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.827924 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5762\" (UniqueName: \"kubernetes.io/projected/a2839c1e-60df-4132-aed5-549b23baa1fb-kube-api-access-f5762\") pod \"iptables-alerter-nczp7\" (UID: \"a2839c1e-60df-4132-aed5-549b23baa1fb\") " pod="openshift-network-operator/iptables-alerter-nczp7" Apr 17 14:07:38.828147 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.827956 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/79d360cf-60bc-4bbe-ab0a-2832dd974cde-tuning-conf-dir\") pod \"multus-additional-cni-plugins-778wr\" (UID: \"79d360cf-60bc-4bbe-ab0a-2832dd974cde\") " pod="openshift-multus/multus-additional-cni-plugins-778wr" Apr 17 14:07:38.828147 ip-10-0-138-158 kubenswrapper[2568]: I0417 
14:07:38.827988 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-cnibin\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.828147 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.828015 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-host-run-netns\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.828147 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.828037 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-host-var-lib-kubelet\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.828147 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.828061 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-hostroot\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.828147 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.828083 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-systemd-units\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.828147 ip-10-0-138-158 
kubenswrapper[2568]: I0417 14:07:38.828117 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-host-run-netns\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.828147 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.828142 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.828426 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.828177 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-os-release\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq"
Apr 17 14:07:38.828426 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.828199 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-host-slash\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.828426 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.828222 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-log-socket\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.828426 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.828244 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/79d360cf-60bc-4bbe-ab0a-2832dd974cde-cni-binary-copy\") pod \"multus-additional-cni-plugins-778wr\" (UID: \"79d360cf-60bc-4bbe-ab0a-2832dd974cde\") " pod="openshift-multus/multus-additional-cni-plugins-778wr"
Apr 17 14:07:38.828426 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.828271 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-system-cni-dir\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq"
Apr 17 14:07:38.828426 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.828309 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-run-openvswitch\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.828426 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.828332 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-node-log\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.828426 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.828355 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/79d360cf-60bc-4bbe-ab0a-2832dd974cde-cnibin\") pod \"multus-additional-cni-plugins-778wr\" (UID: \"79d360cf-60bc-4bbe-ab0a-2832dd974cde\") " pod="openshift-multus/multus-additional-cni-plugins-778wr"
Apr 17 14:07:38.828426 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.828395 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/79d360cf-60bc-4bbe-ab0a-2832dd974cde-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-778wr\" (UID: \"79d360cf-60bc-4bbe-ab0a-2832dd974cde\") " pod="openshift-multus/multus-additional-cni-plugins-778wr"
Apr 17 14:07:38.828822 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.828443 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-host-kubelet\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.828822 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.828462 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-run-ovn\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.828822 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.828477 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-host-run-ovn-kubernetes\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.828822 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.828491 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-host-cni-bin\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.828822 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.828531 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvglg\" (UniqueName: \"kubernetes.io/projected/dccd1bed-f8d5-4c16-968b-e828fa6150a1-kube-api-access-fvglg\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.828822 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.828549 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-5mtb9"
Apr 17 14:07:38.829076 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.829030 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gjdx8"
Apr 17 14:07:38.832550 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.832104 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 14:07:38.832550 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.832155 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-fmns5"
Apr 17 14:07:38.832550 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.832256 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 14:07:38.832550 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.832299 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-h5nks\""
Apr 17 14:07:38.832550 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.832308 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 14:07:38.832550 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.832262 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-7ccvg\""
Apr 17 14:07:38.832550 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.832444 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 14:07:38.834261 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.834243 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 14:07:38.834558 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.834540 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-vjnwm\""
Apr 17 14:07:38.834710 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.834691 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 14:07:38.834794 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.834738 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 14:07:38.850672 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.850644 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 14:02:37 +0000 UTC" deadline="2027-10-10 06:06:57.026582352 +0000 UTC"
Apr 17 14:07:38.850672 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.850671 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12975h59m18.175914731s"
Apr 17 14:07:38.918092 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.918069 2568 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 14:07:38.928725 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.928694 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/79d360cf-60bc-4bbe-ab0a-2832dd974cde-tuning-conf-dir\") pod \"multus-additional-cni-plugins-778wr\" (UID: \"79d360cf-60bc-4bbe-ab0a-2832dd974cde\") " pod="openshift-multus/multus-additional-cni-plugins-778wr"
Apr 17 14:07:38.928854 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.928737 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-cnibin\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq"
Apr 17 14:07:38.928854 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.928763 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-host-run-netns\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq"
Apr 17 14:07:38.928854 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.928798 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-host-var-lib-kubelet\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq"
Apr 17 14:07:38.928854 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.928841 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/54a01f80-7898-48d4-9c07-39869c129452-etc-tuned\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5"
Apr 17 14:07:38.928854 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.928844 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-cnibin\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq"
Apr 17 14:07:38.929103 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.928851 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-host-run-netns\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq"
Apr 17 14:07:38.929103 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.928887 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/79d360cf-60bc-4bbe-ab0a-2832dd974cde-tuning-conf-dir\") pod \"multus-additional-cni-plugins-778wr\" (UID: \"79d360cf-60bc-4bbe-ab0a-2832dd974cde\") " pod="openshift-multus/multus-additional-cni-plugins-778wr"
Apr 17 14:07:38.929103 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.928893 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-log-socket\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.929103 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.928889 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-host-var-lib-kubelet\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq"
Apr 17 14:07:38.929103 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.928928 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbmjn\" (UniqueName: \"kubernetes.io/projected/fa5741d3-e0ce-42fe-9791-6fce2bd6da17-kube-api-access-xbmjn\") pod \"aws-ebs-csi-driver-node-gjdx8\" (UID: \"fa5741d3-e0ce-42fe-9791-6fce2bd6da17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gjdx8"
Apr 17 14:07:38.929103 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.928958 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-etc-systemd\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5"
Apr 17 14:07:38.929103 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.928991 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/79d360cf-60bc-4bbe-ab0a-2832dd974cde-cni-binary-copy\") pod \"multus-additional-cni-plugins-778wr\" (UID: \"79d360cf-60bc-4bbe-ab0a-2832dd974cde\") " pod="openshift-multus/multus-additional-cni-plugins-778wr"
Apr 17 14:07:38.929103 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929016 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-system-cni-dir\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq"
Apr 17 14:07:38.929103 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929068 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-node-log\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.929103 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929070 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-log-socket\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.929103 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929096 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/44d13d23-0ead-4ceb-b841-467a36463db2-tmp-dir\") pod \"node-resolver-ztcfx\" (UID: \"44d13d23-0ead-4ceb-b841-467a36463db2\") " pod="openshift-dns/node-resolver-ztcfx"
Apr 17 14:07:38.929644 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929122 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fa5741d3-e0ce-42fe-9791-6fce2bd6da17-registration-dir\") pod \"aws-ebs-csi-driver-node-gjdx8\" (UID: \"fa5741d3-e0ce-42fe-9791-6fce2bd6da17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gjdx8"
Apr 17 14:07:38.929644 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929146 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-etc-sysctl-d\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5"
Apr 17 14:07:38.929644 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929173 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/79d360cf-60bc-4bbe-ab0a-2832dd974cde-cnibin\") pod \"multus-additional-cni-plugins-778wr\" (UID: \"79d360cf-60bc-4bbe-ab0a-2832dd974cde\") " pod="openshift-multus/multus-additional-cni-plugins-778wr"
Apr 17 14:07:38.929644 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929197 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-host-kubelet\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.929644 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929237 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fvglg\" (UniqueName: \"kubernetes.io/projected/dccd1bed-f8d5-4c16-968b-e828fa6150a1-kube-api-access-fvglg\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.929644 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929253 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fa5741d3-e0ce-42fe-9791-6fce2bd6da17-device-dir\") pod \"aws-ebs-csi-driver-node-gjdx8\" (UID: \"fa5741d3-e0ce-42fe-9791-6fce2bd6da17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gjdx8"
Apr 17 14:07:38.929644 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929267 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-sys\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5"
Apr 17 14:07:38.929644 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929286 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8pxq\" (UniqueName: \"kubernetes.io/projected/65df7e4a-6219-433f-b614-258be054188a-kube-api-access-r8pxq\") pod \"node-ca-mq44v\" (UID: \"65df7e4a-6219-433f-b614-258be054188a\") " pod="openshift-image-registry/node-ca-mq44v"
Apr 17 14:07:38.929644 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929310 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/79d360cf-60bc-4bbe-ab0a-2832dd974cde-os-release\") pod \"multus-additional-cni-plugins-778wr\" (UID: \"79d360cf-60bc-4bbe-ab0a-2832dd974cde\") " pod="openshift-multus/multus-additional-cni-plugins-778wr"
Apr 17 14:07:38.929644 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929326 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/786788f0-7365-4b9c-9628-78838c53bc50-multus-daemon-config\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq"
Apr 17 14:07:38.929644 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929351 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-run-systemd\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.929644 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929374 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/497bbc82-edab-4d97-bcc8-7d428e62da1e-metrics-certs\") pod \"network-metrics-daemon-f6d89\" (UID: \"497bbc82-edab-4d97-bcc8-7d428e62da1e\") " pod="openshift-multus/network-metrics-daemon-f6d89"
Apr 17 14:07:38.929644 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929393 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9fz28\" (UniqueName: \"kubernetes.io/projected/497bbc82-edab-4d97-bcc8-7d428e62da1e-kube-api-access-9fz28\") pod \"network-metrics-daemon-f6d89\" (UID: \"497bbc82-edab-4d97-bcc8-7d428e62da1e\") " pod="openshift-multus/network-metrics-daemon-f6d89"
Apr 17 14:07:38.929644 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929409 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/44d13d23-0ead-4ceb-b841-467a36463db2-hosts-file\") pod \"node-resolver-ztcfx\" (UID: \"44d13d23-0ead-4ceb-b841-467a36463db2\") " pod="openshift-dns/node-resolver-ztcfx"
Apr 17 14:07:38.929644 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929423 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fa5741d3-e0ce-42fe-9791-6fce2bd6da17-etc-selinux\") pod \"aws-ebs-csi-driver-node-gjdx8\" (UID: \"fa5741d3-e0ce-42fe-9791-6fce2bd6da17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gjdx8"
Apr 17 14:07:38.929644 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929437 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-lib-modules\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5"
Apr 17 14:07:38.929644 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929452 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-25rc5\" (UniqueName: \"kubernetes.io/projected/080d6200-63b7-4e65-8d68-ea319212caed-kube-api-access-25rc5\") pod \"network-check-target-mcl6c\" (UID: \"080d6200-63b7-4e65-8d68-ea319212caed\") " pod="openshift-network-diagnostics/network-check-target-mcl6c"
Apr 17 14:07:38.930410 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929487 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5762\" (UniqueName: \"kubernetes.io/projected/a2839c1e-60df-4132-aed5-549b23baa1fb-kube-api-access-f5762\") pod \"iptables-alerter-nczp7\" (UID: \"a2839c1e-60df-4132-aed5-549b23baa1fb\") " pod="openshift-network-operator/iptables-alerter-nczp7"
Apr 17 14:07:38.930410 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929518 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc8bm\" (UniqueName: \"kubernetes.io/projected/54a01f80-7898-48d4-9c07-39869c129452-kube-api-access-tc8bm\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5"
Apr 17 14:07:38.930410 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929557 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/79d360cf-60bc-4bbe-ab0a-2832dd974cde-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-778wr\" (UID: \"79d360cf-60bc-4bbe-ab0a-2832dd974cde\") " pod="openshift-multus/multus-additional-cni-plugins-778wr"
Apr 17 14:07:38.930410 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929584 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-multus-conf-dir\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq"
Apr 17 14:07:38.930410 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929606 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dccd1bed-f8d5-4c16-968b-e828fa6150a1-ovnkube-config\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.930410 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929625 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djskr\" (UniqueName: \"kubernetes.io/projected/44d13d23-0ead-4ceb-b841-467a36463db2-kube-api-access-djskr\") pod \"node-resolver-ztcfx\" (UID: \"44d13d23-0ead-4ceb-b841-467a36463db2\") " pod="openshift-dns/node-resolver-ztcfx"
Apr 17 14:07:38.930410 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929640 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-hostroot\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq"
Apr 17 14:07:38.930410 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929638 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/79d360cf-60bc-4bbe-ab0a-2832dd974cde-cni-binary-copy\") pod \"multus-additional-cni-plugins-778wr\" (UID: \"79d360cf-60bc-4bbe-ab0a-2832dd974cde\") " pod="openshift-multus/multus-additional-cni-plugins-778wr"
Apr 17 14:07:38.930410 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929655 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-systemd-units\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.930410 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929691 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-host-run-netns\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.930410 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929706 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.930410 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929723 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/65df7e4a-6219-433f-b614-258be054188a-serviceca\") pod \"node-ca-mq44v\" (UID: \"65df7e4a-6219-433f-b614-258be054188a\") " pod="openshift-image-registry/node-ca-mq44v"
Apr 17 14:07:38.930410 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929742 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa5741d3-e0ce-42fe-9791-6fce2bd6da17-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gjdx8\" (UID: \"fa5741d3-e0ce-42fe-9791-6fce2bd6da17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gjdx8"
Apr 17 14:07:38.930410 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929760 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/cb936c0a-ae1f-4ae8-825e-afab50630fa3-konnectivity-ca\") pod \"konnectivity-agent-5mtb9\" (UID: \"cb936c0a-ae1f-4ae8-825e-afab50630fa3\") " pod="kube-system/konnectivity-agent-5mtb9"
Apr 17 14:07:38.930410 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929774 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-etc-sysconfig\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5"
Apr 17 14:07:38.930410 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929794 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-os-release\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq"
Apr 17 14:07:38.930410 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:38.929800 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:07:38.931265 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929809 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-host-slash\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.931265 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929823 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/54a01f80-7898-48d4-9c07-39869c129452-tmp\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5"
Apr 17 14:07:38.931265 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929843 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-run-openvswitch\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.931265 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:38.929862 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/497bbc82-edab-4d97-bcc8-7d428e62da1e-metrics-certs podName:497bbc82-edab-4d97-bcc8-7d428e62da1e nodeName:}" failed. No retries permitted until 2026-04-17 14:07:39.429839559 +0000 UTC m=+3.066416948 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/497bbc82-edab-4d97-bcc8-7d428e62da1e-metrics-certs") pod "network-metrics-daemon-f6d89" (UID: "497bbc82-edab-4d97-bcc8-7d428e62da1e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:07:38.931265 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929879 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-run-openvswitch\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.931265 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929893 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fa5741d3-e0ce-42fe-9791-6fce2bd6da17-socket-dir\") pod \"aws-ebs-csi-driver-node-gjdx8\" (UID: \"fa5741d3-e0ce-42fe-9791-6fce2bd6da17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gjdx8"
Apr 17 14:07:38.931265 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929920 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fa5741d3-e0ce-42fe-9791-6fce2bd6da17-sys-fs\") pod \"aws-ebs-csi-driver-node-gjdx8\" (UID: \"fa5741d3-e0ce-42fe-9791-6fce2bd6da17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gjdx8"
Apr 17 14:07:38.931265 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929946 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/cb936c0a-ae1f-4ae8-825e-afab50630fa3-agent-certs\") pod \"konnectivity-agent-5mtb9\" (UID: \"cb936c0a-ae1f-4ae8-825e-afab50630fa3\") " pod="kube-system/konnectivity-agent-5mtb9"
Apr 17 14:07:38.931265 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.929971 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-etc-modprobe-d\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5"
Apr 17 14:07:38.931265 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930004 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/79d360cf-60bc-4bbe-ab0a-2832dd974cde-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-778wr\" (UID: \"79d360cf-60bc-4bbe-ab0a-2832dd974cde\") " pod="openshift-multus/multus-additional-cni-plugins-778wr"
Apr 17 14:07:38.931265 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930034 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-run-ovn\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.931265 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930059 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-host-run-ovn-kubernetes\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.931265 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930085 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-host-cni-bin\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.931265 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930208 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-host-cni-bin\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.931265 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930222 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/79d360cf-60bc-4bbe-ab0a-2832dd974cde-os-release\") pod \"multus-additional-cni-plugins-778wr\" (UID: \"79d360cf-60bc-4bbe-ab0a-2832dd974cde\") " pod="openshift-multus/multus-additional-cni-plugins-778wr"
Apr 17 14:07:38.931265 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930400 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a2839c1e-60df-4132-aed5-549b23baa1fb-iptables-alerter-script\") pod \"iptables-alerter-nczp7\" (UID: \"a2839c1e-60df-4132-aed5-549b23baa1fb\") " pod="openshift-network-operator/iptables-alerter-nczp7"
Apr 17 14:07:38.932025 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930432 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-host\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5"
Apr 17 14:07:38.932025 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930461 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-multus-socket-dir-parent\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq"
Apr 17 14:07:38.932025 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930451 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-multus-conf-dir\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq"
Apr 17 14:07:38.932025 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930492 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-var-lib-openvswitch\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.932025 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930536 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-etc-openvswitch\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.932025 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930563 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-host-cni-netd\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6"
Apr 17 14:07:38.932025 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930574 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName:
\"kubernetes.io/configmap/79d360cf-60bc-4bbe-ab0a-2832dd974cde-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-778wr\" (UID: \"79d360cf-60bc-4bbe-ab0a-2832dd974cde\") " pod="openshift-multus/multus-additional-cni-plugins-778wr" Apr 17 14:07:38.932025 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930591 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dccd1bed-f8d5-4c16-968b-e828fa6150a1-env-overrides\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.932025 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930617 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-etc-kubernetes\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:38.932025 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930624 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-os-release\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.932025 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930642 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-run\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:38.932025 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930649 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-host-slash\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.932025 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930671 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-var-lib-kubelet\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:38.932025 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930694 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/79d360cf-60bc-4bbe-ab0a-2832dd974cde-cnibin\") pod \"multus-additional-cni-plugins-778wr\" (UID: \"79d360cf-60bc-4bbe-ab0a-2832dd974cde\") " pod="openshift-multus/multus-additional-cni-plugins-778wr" Apr 17 14:07:38.932025 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930706 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-host-run-k8s-cni-cncf-io\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.932025 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930721 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-host-kubelet\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.932025 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930733 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-host-var-lib-cni-bin\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.932798 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930761 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w8x8d\" (UniqueName: \"kubernetes.io/projected/786788f0-7365-4b9c-9628-78838c53bc50-kube-api-access-w8x8d\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.932798 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930788 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-multus-cni-dir\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.932798 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930814 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-host-run-multus-certs\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.932798 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930845 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-etc-kubernetes\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.932798 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930873 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/dccd1bed-f8d5-4c16-968b-e828fa6150a1-ovn-node-metrics-cert\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.932798 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930880 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-run-ovn\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.932798 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930918 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dccd1bed-f8d5-4c16-968b-e828fa6150a1-ovnkube-script-lib\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.932798 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930952 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a2839c1e-60df-4132-aed5-549b23baa1fb-host-slash\") pod \"iptables-alerter-nczp7\" (UID: \"a2839c1e-60df-4132-aed5-549b23baa1fb\") " pod="openshift-network-operator/iptables-alerter-nczp7" Apr 17 14:07:38.932798 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930970 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-etc-sysctl-conf\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:38.932798 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930980 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dccd1bed-f8d5-4c16-968b-e828fa6150a1-ovnkube-config\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.932798 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.930987 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/79d360cf-60bc-4bbe-ab0a-2832dd974cde-system-cni-dir\") pod \"multus-additional-cni-plugins-778wr\" (UID: \"79d360cf-60bc-4bbe-ab0a-2832dd974cde\") " pod="openshift-multus/multus-additional-cni-plugins-778wr" Apr 17 14:07:38.932798 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.931007 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qlgcj\" (UniqueName: \"kubernetes.io/projected/79d360cf-60bc-4bbe-ab0a-2832dd974cde-kube-api-access-qlgcj\") pod \"multus-additional-cni-plugins-778wr\" (UID: \"79d360cf-60bc-4bbe-ab0a-2832dd974cde\") " pod="openshift-multus/multus-additional-cni-plugins-778wr" Apr 17 14:07:38.932798 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.931026 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/786788f0-7365-4b9c-9628-78838c53bc50-cni-binary-copy\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.932798 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.931042 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-host-var-lib-cni-multus\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.932798 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.931062 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/65df7e4a-6219-433f-b614-258be054188a-host\") pod \"node-ca-mq44v\" (UID: \"65df7e4a-6219-433f-b614-258be054188a\") " pod="openshift-image-registry/node-ca-mq44v" Apr 17 14:07:38.932798 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.931184 2568 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 14:07:38.932798 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.931481 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/786788f0-7365-4b9c-9628-78838c53bc50-multus-daemon-config\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.932798 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.931575 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dccd1bed-f8d5-4c16-968b-e828fa6150a1-ovnkube-script-lib\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.933641 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.931643 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a2839c1e-60df-4132-aed5-549b23baa1fb-host-slash\") pod \"iptables-alerter-nczp7\" (UID: \"a2839c1e-60df-4132-aed5-549b23baa1fb\") " pod="openshift-network-operator/iptables-alerter-nczp7" Apr 17 14:07:38.933641 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.931693 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/79d360cf-60bc-4bbe-ab0a-2832dd974cde-system-cni-dir\") pod \"multus-additional-cni-plugins-778wr\" (UID: \"79d360cf-60bc-4bbe-ab0a-2832dd974cde\") " pod="openshift-multus/multus-additional-cni-plugins-778wr" Apr 17 14:07:38.933641 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.932367 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/786788f0-7365-4b9c-9628-78838c53bc50-cni-binary-copy\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.933641 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.931025 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-run-systemd\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.933641 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.932435 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-host-var-lib-cni-multus\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.933641 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.932441 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a2839c1e-60df-4132-aed5-549b23baa1fb-iptables-alerter-script\") pod \"iptables-alerter-nczp7\" (UID: \"a2839c1e-60df-4132-aed5-549b23baa1fb\") " pod="openshift-network-operator/iptables-alerter-nczp7" Apr 17 14:07:38.933641 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.932468 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-hostroot\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.933641 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.932523 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-systemd-units\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.933641 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.932531 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-host-cni-netd\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.933641 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.932568 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-system-cni-dir\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.933641 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.932596 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-etc-openvswitch\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.933641 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.932627 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-node-log\") pod 
\"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.933641 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.932659 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-host-run-netns\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.933641 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.932691 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.933641 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.932753 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-multus-socket-dir-parent\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.933641 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.932784 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-var-lib-openvswitch\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.933641 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.932859 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/dccd1bed-f8d5-4c16-968b-e828fa6150a1-env-overrides\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.933641 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.932971 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-host-run-multus-certs\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.934306 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.932996 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dccd1bed-f8d5-4c16-968b-e828fa6150a1-host-run-ovn-kubernetes\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.934306 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.933007 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-etc-kubernetes\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.934306 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.933021 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-host-var-lib-cni-bin\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.934306 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.933048 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-multus-cni-dir\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.934306 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.933069 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/786788f0-7365-4b9c-9628-78838c53bc50-host-run-k8s-cni-cncf-io\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.934306 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.933254 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/79d360cf-60bc-4bbe-ab0a-2832dd974cde-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-778wr\" (UID: \"79d360cf-60bc-4bbe-ab0a-2832dd974cde\") " pod="openshift-multus/multus-additional-cni-plugins-778wr" Apr 17 14:07:38.934747 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.934726 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dccd1bed-f8d5-4c16-968b-e828fa6150a1-ovn-node-metrics-cert\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.939063 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:38.939043 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:07:38.939063 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:38.939065 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:07:38.939281 ip-10-0-138-158 
kubenswrapper[2568]: E0417 14:07:38.939078 2568 projected.go:194] Error preparing data for projected volume kube-api-access-25rc5 for pod openshift-network-diagnostics/network-check-target-mcl6c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:07:38.939281 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:38.939137 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/080d6200-63b7-4e65-8d68-ea319212caed-kube-api-access-25rc5 podName:080d6200-63b7-4e65-8d68-ea319212caed nodeName:}" failed. No retries permitted until 2026-04-17 14:07:39.43911622 +0000 UTC m=+3.075693593 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-25rc5" (UniqueName: "kubernetes.io/projected/080d6200-63b7-4e65-8d68-ea319212caed-kube-api-access-25rc5") pod "network-check-target-mcl6c" (UID: "080d6200-63b7-4e65-8d68-ea319212caed") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:07:38.948583 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.941607 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8x8d\" (UniqueName: \"kubernetes.io/projected/786788f0-7365-4b9c-9628-78838c53bc50-kube-api-access-w8x8d\") pod \"multus-cmjrq\" (UID: \"786788f0-7365-4b9c-9628-78838c53bc50\") " pod="openshift-multus/multus-cmjrq" Apr 17 14:07:38.948583 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.941609 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5762\" (UniqueName: \"kubernetes.io/projected/a2839c1e-60df-4132-aed5-549b23baa1fb-kube-api-access-f5762\") pod \"iptables-alerter-nczp7\" (UID: \"a2839c1e-60df-4132-aed5-549b23baa1fb\") " pod="openshift-network-operator/iptables-alerter-nczp7" Apr 17 14:07:38.948583 
ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.944475 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvglg\" (UniqueName: \"kubernetes.io/projected/dccd1bed-f8d5-4c16-968b-e828fa6150a1-kube-api-access-fvglg\") pod \"ovnkube-node-brxr6\" (UID: \"dccd1bed-f8d5-4c16-968b-e828fa6150a1\") " pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:38.948583 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.944495 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlgcj\" (UniqueName: \"kubernetes.io/projected/79d360cf-60bc-4bbe-ab0a-2832dd974cde-kube-api-access-qlgcj\") pod \"multus-additional-cni-plugins-778wr\" (UID: \"79d360cf-60bc-4bbe-ab0a-2832dd974cde\") " pod="openshift-multus/multus-additional-cni-plugins-778wr" Apr 17 14:07:38.948583 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:38.948555 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fz28\" (UniqueName: \"kubernetes.io/projected/497bbc82-edab-4d97-bcc8-7d428e62da1e-kube-api-access-9fz28\") pod \"network-metrics-daemon-f6d89\" (UID: \"497bbc82-edab-4d97-bcc8-7d428e62da1e\") " pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:07:39.031860 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.031830 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fa5741d3-e0ce-42fe-9791-6fce2bd6da17-device-dir\") pod \"aws-ebs-csi-driver-node-gjdx8\" (UID: \"fa5741d3-e0ce-42fe-9791-6fce2bd6da17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gjdx8" Apr 17 14:07:39.032025 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.031869 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-sys\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " 
pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:39.032025 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.031894 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r8pxq\" (UniqueName: \"kubernetes.io/projected/65df7e4a-6219-433f-b614-258be054188a-kube-api-access-r8pxq\") pod \"node-ca-mq44v\" (UID: \"65df7e4a-6219-433f-b614-258be054188a\") " pod="openshift-image-registry/node-ca-mq44v" Apr 17 14:07:39.032025 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.031933 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/44d13d23-0ead-4ceb-b841-467a36463db2-hosts-file\") pod \"node-resolver-ztcfx\" (UID: \"44d13d23-0ead-4ceb-b841-467a36463db2\") " pod="openshift-dns/node-resolver-ztcfx" Apr 17 14:07:39.032025 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.031954 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fa5741d3-e0ce-42fe-9791-6fce2bd6da17-etc-selinux\") pod \"aws-ebs-csi-driver-node-gjdx8\" (UID: \"fa5741d3-e0ce-42fe-9791-6fce2bd6da17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gjdx8" Apr 17 14:07:39.032025 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.031953 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fa5741d3-e0ce-42fe-9791-6fce2bd6da17-device-dir\") pod \"aws-ebs-csi-driver-node-gjdx8\" (UID: \"fa5741d3-e0ce-42fe-9791-6fce2bd6da17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gjdx8" Apr 17 14:07:39.032025 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.031959 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-sys\") pod \"tuned-fmns5\" (UID: 
\"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:39.032025 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.031975 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-lib-modules\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:39.032025 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032017 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/44d13d23-0ead-4ceb-b841-467a36463db2-hosts-file\") pod \"node-resolver-ztcfx\" (UID: \"44d13d23-0ead-4ceb-b841-467a36463db2\") " pod="openshift-dns/node-resolver-ztcfx" Apr 17 14:07:39.032025 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032025 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tc8bm\" (UniqueName: \"kubernetes.io/projected/54a01f80-7898-48d4-9c07-39869c129452-kube-api-access-tc8bm\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:39.032438 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032048 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-djskr\" (UniqueName: \"kubernetes.io/projected/44d13d23-0ead-4ceb-b841-467a36463db2-kube-api-access-djskr\") pod \"node-resolver-ztcfx\" (UID: \"44d13d23-0ead-4ceb-b841-467a36463db2\") " pod="openshift-dns/node-resolver-ztcfx" Apr 17 14:07:39.032438 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032074 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/65df7e4a-6219-433f-b614-258be054188a-serviceca\") pod \"node-ca-mq44v\" (UID: 
\"65df7e4a-6219-433f-b614-258be054188a\") " pod="openshift-image-registry/node-ca-mq44v" Apr 17 14:07:39.032438 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032091 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa5741d3-e0ce-42fe-9791-6fce2bd6da17-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gjdx8\" (UID: \"fa5741d3-e0ce-42fe-9791-6fce2bd6da17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gjdx8" Apr 17 14:07:39.032438 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032093 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-lib-modules\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:39.032438 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032113 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/cb936c0a-ae1f-4ae8-825e-afab50630fa3-konnectivity-ca\") pod \"konnectivity-agent-5mtb9\" (UID: \"cb936c0a-ae1f-4ae8-825e-afab50630fa3\") " pod="kube-system/konnectivity-agent-5mtb9" Apr 17 14:07:39.032438 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032138 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-etc-sysconfig\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:39.032438 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032163 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/54a01f80-7898-48d4-9c07-39869c129452-tmp\") pod \"tuned-fmns5\" (UID: 
\"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:39.032438 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032185 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fa5741d3-e0ce-42fe-9791-6fce2bd6da17-socket-dir\") pod \"aws-ebs-csi-driver-node-gjdx8\" (UID: \"fa5741d3-e0ce-42fe-9791-6fce2bd6da17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gjdx8" Apr 17 14:07:39.032438 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032200 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fa5741d3-e0ce-42fe-9791-6fce2bd6da17-sys-fs\") pod \"aws-ebs-csi-driver-node-gjdx8\" (UID: \"fa5741d3-e0ce-42fe-9791-6fce2bd6da17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gjdx8" Apr 17 14:07:39.032438 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032214 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/cb936c0a-ae1f-4ae8-825e-afab50630fa3-agent-certs\") pod \"konnectivity-agent-5mtb9\" (UID: \"cb936c0a-ae1f-4ae8-825e-afab50630fa3\") " pod="kube-system/konnectivity-agent-5mtb9" Apr 17 14:07:39.032438 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032218 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa5741d3-e0ce-42fe-9791-6fce2bd6da17-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gjdx8\" (UID: \"fa5741d3-e0ce-42fe-9791-6fce2bd6da17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gjdx8" Apr 17 14:07:39.032438 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032233 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-etc-modprobe-d\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:39.032438 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032255 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-host\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:39.032438 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032272 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-etc-kubernetes\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:39.032438 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032290 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-run\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:39.032438 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032307 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-var-lib-kubelet\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:39.032438 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032342 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-etc-sysctl-conf\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:39.033245 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032367 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fa5741d3-e0ce-42fe-9791-6fce2bd6da17-socket-dir\") pod \"aws-ebs-csi-driver-node-gjdx8\" (UID: \"fa5741d3-e0ce-42fe-9791-6fce2bd6da17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gjdx8" Apr 17 14:07:39.033245 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032370 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/65df7e4a-6219-433f-b614-258be054188a-host\") pod \"node-ca-mq44v\" (UID: \"65df7e4a-6219-433f-b614-258be054188a\") " pod="openshift-image-registry/node-ca-mq44v" Apr 17 14:07:39.033245 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032410 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/65df7e4a-6219-433f-b614-258be054188a-host\") pod \"node-ca-mq44v\" (UID: \"65df7e4a-6219-433f-b614-258be054188a\") " pod="openshift-image-registry/node-ca-mq44v" Apr 17 14:07:39.033245 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032419 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/54a01f80-7898-48d4-9c07-39869c129452-etc-tuned\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:39.033245 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032448 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xbmjn\" (UniqueName: 
\"kubernetes.io/projected/fa5741d3-e0ce-42fe-9791-6fce2bd6da17-kube-api-access-xbmjn\") pod \"aws-ebs-csi-driver-node-gjdx8\" (UID: \"fa5741d3-e0ce-42fe-9791-6fce2bd6da17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gjdx8" Apr 17 14:07:39.033245 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032474 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-etc-systemd\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:39.033245 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032499 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-etc-modprobe-d\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:39.033245 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032523 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/44d13d23-0ead-4ceb-b841-467a36463db2-tmp-dir\") pod \"node-resolver-ztcfx\" (UID: \"44d13d23-0ead-4ceb-b841-467a36463db2\") " pod="openshift-dns/node-resolver-ztcfx" Apr 17 14:07:39.033245 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032072 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fa5741d3-e0ce-42fe-9791-6fce2bd6da17-etc-selinux\") pod \"aws-ebs-csi-driver-node-gjdx8\" (UID: \"fa5741d3-e0ce-42fe-9791-6fce2bd6da17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gjdx8" Apr 17 14:07:39.033245 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032548 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fa5741d3-e0ce-42fe-9791-6fce2bd6da17-registration-dir\") pod \"aws-ebs-csi-driver-node-gjdx8\" (UID: \"fa5741d3-e0ce-42fe-9791-6fce2bd6da17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gjdx8" Apr 17 14:07:39.033245 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032571 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-etc-sysctl-d\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:39.033245 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032579 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-etc-kubernetes\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:39.033245 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032607 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/65df7e4a-6219-433f-b614-258be054188a-serviceca\") pod \"node-ca-mq44v\" (UID: \"65df7e4a-6219-433f-b614-258be054188a\") " pod="openshift-image-registry/node-ca-mq44v" Apr 17 14:07:39.033245 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032645 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-etc-sysconfig\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:39.033245 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032671 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-etc-systemd\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:39.033245 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032734 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fa5741d3-e0ce-42fe-9791-6fce2bd6da17-sys-fs\") pod \"aws-ebs-csi-driver-node-gjdx8\" (UID: \"fa5741d3-e0ce-42fe-9791-6fce2bd6da17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gjdx8" Apr 17 14:07:39.033245 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032734 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fa5741d3-e0ce-42fe-9791-6fce2bd6da17-registration-dir\") pod \"aws-ebs-csi-driver-node-gjdx8\" (UID: \"fa5741d3-e0ce-42fe-9791-6fce2bd6da17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gjdx8" Apr 17 14:07:39.033245 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032873 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-var-lib-kubelet\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:39.034223 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032926 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/44d13d23-0ead-4ceb-b841-467a36463db2-tmp-dir\") pod \"node-resolver-ztcfx\" (UID: \"44d13d23-0ead-4ceb-b841-467a36463db2\") " pod="openshift-dns/node-resolver-ztcfx" Apr 17 14:07:39.034223 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.032935 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" 
(UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-run\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:39.034223 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.033002 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-host\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:39.034223 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.033011 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/cb936c0a-ae1f-4ae8-825e-afab50630fa3-konnectivity-ca\") pod \"konnectivity-agent-5mtb9\" (UID: \"cb936c0a-ae1f-4ae8-825e-afab50630fa3\") " pod="kube-system/konnectivity-agent-5mtb9" Apr 17 14:07:39.034223 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.033038 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-etc-sysctl-d\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:39.034223 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.033085 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/54a01f80-7898-48d4-9c07-39869c129452-etc-sysctl-conf\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:39.035057 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.035030 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/54a01f80-7898-48d4-9c07-39869c129452-etc-tuned\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:39.035175 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.035090 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/54a01f80-7898-48d4-9c07-39869c129452-tmp\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:39.035175 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.035162 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/cb936c0a-ae1f-4ae8-825e-afab50630fa3-agent-certs\") pod \"konnectivity-agent-5mtb9\" (UID: \"cb936c0a-ae1f-4ae8-825e-afab50630fa3\") " pod="kube-system/konnectivity-agent-5mtb9" Apr 17 14:07:39.040236 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.040212 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8pxq\" (UniqueName: \"kubernetes.io/projected/65df7e4a-6219-433f-b614-258be054188a-kube-api-access-r8pxq\") pod \"node-ca-mq44v\" (UID: \"65df7e4a-6219-433f-b614-258be054188a\") " pod="openshift-image-registry/node-ca-mq44v" Apr 17 14:07:39.040325 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.040242 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbmjn\" (UniqueName: \"kubernetes.io/projected/fa5741d3-e0ce-42fe-9791-6fce2bd6da17-kube-api-access-xbmjn\") pod \"aws-ebs-csi-driver-node-gjdx8\" (UID: \"fa5741d3-e0ce-42fe-9791-6fce2bd6da17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gjdx8" Apr 17 14:07:39.040387 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.040372 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc8bm\" (UniqueName: 
\"kubernetes.io/projected/54a01f80-7898-48d4-9c07-39869c129452-kube-api-access-tc8bm\") pod \"tuned-fmns5\" (UID: \"54a01f80-7898-48d4-9c07-39869c129452\") " pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:39.040752 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.040731 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-djskr\" (UniqueName: \"kubernetes.io/projected/44d13d23-0ead-4ceb-b841-467a36463db2-kube-api-access-djskr\") pod \"node-resolver-ztcfx\" (UID: \"44d13d23-0ead-4ceb-b841-467a36463db2\") " pod="openshift-dns/node-resolver-ztcfx" Apr 17 14:07:39.124883 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.124800 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-nczp7" Apr 17 14:07:39.136662 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.136641 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cmjrq" Apr 17 14:07:39.144272 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.144253 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-778wr" Apr 17 14:07:39.150900 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.150880 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:07:39.157436 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.157406 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mq44v" Apr 17 14:07:39.164998 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.164976 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ztcfx" Apr 17 14:07:39.172588 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.172548 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-5mtb9" Apr 17 14:07:39.179185 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.179163 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gjdx8" Apr 17 14:07:39.184794 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.184769 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-fmns5" Apr 17 14:07:39.212406 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.212386 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:07:39.435201 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.435107 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/497bbc82-edab-4d97-bcc8-7d428e62da1e-metrics-certs\") pod \"network-metrics-daemon-f6d89\" (UID: \"497bbc82-edab-4d97-bcc8-7d428e62da1e\") " pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:07:39.435356 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:39.435257 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:07:39.435356 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:39.435329 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/497bbc82-edab-4d97-bcc8-7d428e62da1e-metrics-certs podName:497bbc82-edab-4d97-bcc8-7d428e62da1e nodeName:}" failed. No retries permitted until 2026-04-17 14:07:40.435309804 +0000 UTC m=+4.071887185 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/497bbc82-edab-4d97-bcc8-7d428e62da1e-metrics-certs") pod "network-metrics-daemon-f6d89" (UID: "497bbc82-edab-4d97-bcc8-7d428e62da1e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:07:39.536077 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.536042 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-25rc5\" (UniqueName: \"kubernetes.io/projected/080d6200-63b7-4e65-8d68-ea319212caed-kube-api-access-25rc5\") pod \"network-check-target-mcl6c\" (UID: \"080d6200-63b7-4e65-8d68-ea319212caed\") " pod="openshift-network-diagnostics/network-check-target-mcl6c" Apr 17 14:07:39.536254 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:39.536224 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:07:39.536324 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:39.536254 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:07:39.536324 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:39.536271 2568 projected.go:194] Error preparing data for projected volume kube-api-access-25rc5 for pod openshift-network-diagnostics/network-check-target-mcl6c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:07:39.536390 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:39.536326 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/080d6200-63b7-4e65-8d68-ea319212caed-kube-api-access-25rc5 podName:080d6200-63b7-4e65-8d68-ea319212caed nodeName:}" failed. 
No retries permitted until 2026-04-17 14:07:40.536311829 +0000 UTC m=+4.172889197 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-25rc5" (UniqueName: "kubernetes.io/projected/080d6200-63b7-4e65-8d68-ea319212caed-kube-api-access-25rc5") pod "network-check-target-mcl6c" (UID: "080d6200-63b7-4e65-8d68-ea319212caed") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:07:39.720779 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:39.720750 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79d360cf_60bc_4bbe_ab0a_2832dd974cde.slice/crio-219ac65cccd85bc3a072da9b7b866881431e010b703a3288e6bbde49b7334705 WatchSource:0}: Error finding container 219ac65cccd85bc3a072da9b7b866881431e010b703a3288e6bbde49b7334705: Status 404 returned error can't find the container with id 219ac65cccd85bc3a072da9b7b866881431e010b703a3288e6bbde49b7334705 Apr 17 14:07:39.721945 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:39.721416 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2839c1e_60df_4132_aed5_549b23baa1fb.slice/crio-93ac73281e07f5ce4c28b9a9530370be226b96c5126024b85ff2de668a265032 WatchSource:0}: Error finding container 93ac73281e07f5ce4c28b9a9530370be226b96c5126024b85ff2de668a265032: Status 404 returned error can't find the container with id 93ac73281e07f5ce4c28b9a9530370be226b96c5126024b85ff2de668a265032 Apr 17 14:07:39.723020 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:39.722995 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb936c0a_ae1f_4ae8_825e_afab50630fa3.slice/crio-3421290bebe3f5e2923de512932c61fd40e656010de83a0490b9c00beceaa5b2 WatchSource:0}: Error finding container 
3421290bebe3f5e2923de512932c61fd40e656010de83a0490b9c00beceaa5b2: Status 404 returned error can't find the container with id 3421290bebe3f5e2923de512932c61fd40e656010de83a0490b9c00beceaa5b2 Apr 17 14:07:39.723883 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:39.723849 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44d13d23_0ead_4ceb_b841_467a36463db2.slice/crio-668d9139087ee4ee6de6f26a54201f8230a76b51880d2cd3dc3c89856f118cdb WatchSource:0}: Error finding container 668d9139087ee4ee6de6f26a54201f8230a76b51880d2cd3dc3c89856f118cdb: Status 404 returned error can't find the container with id 668d9139087ee4ee6de6f26a54201f8230a76b51880d2cd3dc3c89856f118cdb Apr 17 14:07:39.724699 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:39.724665 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddccd1bed_f8d5_4c16_968b_e828fa6150a1.slice/crio-cb731c13ffe16cfd583f6cd6c32d042816de5d0b0f5c9f13056f204ac502c837 WatchSource:0}: Error finding container cb731c13ffe16cfd583f6cd6c32d042816de5d0b0f5c9f13056f204ac502c837: Status 404 returned error can't find the container with id cb731c13ffe16cfd583f6cd6c32d042816de5d0b0f5c9f13056f204ac502c837 Apr 17 14:07:39.726930 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:39.726802 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54a01f80_7898_48d4_9c07_39869c129452.slice/crio-c51275ef2fcd78c1f633da3f18f775f6b2cc6a11f08ad89dc305b49ce42c8eda WatchSource:0}: Error finding container c51275ef2fcd78c1f633da3f18f775f6b2cc6a11f08ad89dc305b49ce42c8eda: Status 404 returned error can't find the container with id c51275ef2fcd78c1f633da3f18f775f6b2cc6a11f08ad89dc305b49ce42c8eda Apr 17 14:07:39.748547 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:39.748523 2568 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65df7e4a_6219_433f_b614_258be054188a.slice/crio-a1276d2d5eca25b2bb6c9e9f2ab208ac3a2f84084af79fe512040989d93e70c5 WatchSource:0}: Error finding container a1276d2d5eca25b2bb6c9e9f2ab208ac3a2f84084af79fe512040989d93e70c5: Status 404 returned error can't find the container with id a1276d2d5eca25b2bb6c9e9f2ab208ac3a2f84084af79fe512040989d93e70c5 Apr 17 14:07:39.749490 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:39.749441 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa5741d3_e0ce_42fe_9791_6fce2bd6da17.slice/crio-1099d301926a4d2b3e5f164b6ddf2e6dd3114c1579294baaae2a316b2d7d0a24 WatchSource:0}: Error finding container 1099d301926a4d2b3e5f164b6ddf2e6dd3114c1579294baaae2a316b2d7d0a24: Status 404 returned error can't find the container with id 1099d301926a4d2b3e5f164b6ddf2e6dd3114c1579294baaae2a316b2d7d0a24 Apr 17 14:07:39.750427 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:07:39.750408 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod786788f0_7365_4b9c_9628_78838c53bc50.slice/crio-399603f14d286d530d12fb30d5c6dbb76ff02c1ce144fe9632dccae97d96ff94 WatchSource:0}: Error finding container 399603f14d286d530d12fb30d5c6dbb76ff02c1ce144fe9632dccae97d96ff94: Status 404 returned error can't find the container with id 399603f14d286d530d12fb30d5c6dbb76ff02c1ce144fe9632dccae97d96ff94 Apr 17 14:07:39.851568 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.851410 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 14:02:37 +0000 UTC" deadline="2027-11-22 04:05:31.964719109 +0000 UTC" Apr 17 14:07:39.851568 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.851563 2568 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="14005h57m52.113159738s" Apr 17 14:07:39.951706 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.951671 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmjrq" event={"ID":"786788f0-7365-4b9c-9628-78838c53bc50","Type":"ContainerStarted","Data":"399603f14d286d530d12fb30d5c6dbb76ff02c1ce144fe9632dccae97d96ff94"} Apr 17 14:07:39.952549 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.952525 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mq44v" event={"ID":"65df7e4a-6219-433f-b614-258be054188a","Type":"ContainerStarted","Data":"a1276d2d5eca25b2bb6c9e9f2ab208ac3a2f84084af79fe512040989d93e70c5"} Apr 17 14:07:39.953381 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.953356 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-nczp7" event={"ID":"a2839c1e-60df-4132-aed5-549b23baa1fb","Type":"ContainerStarted","Data":"93ac73281e07f5ce4c28b9a9530370be226b96c5126024b85ff2de668a265032"} Apr 17 14:07:39.956024 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.956000 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-158.ec2.internal" event={"ID":"68c692e25ca4376cc2fb31a74dcd7849","Type":"ContainerStarted","Data":"4808062dddbe918198bc9f8d1b9395c19a12c298c55a7690205e678fb76b75bc"} Apr 17 14:07:39.957056 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.957039 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gjdx8" event={"ID":"fa5741d3-e0ce-42fe-9791-6fce2bd6da17","Type":"ContainerStarted","Data":"1099d301926a4d2b3e5f164b6ddf2e6dd3114c1579294baaae2a316b2d7d0a24"} Apr 17 14:07:39.958033 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.957969 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-fmns5" 
event={"ID":"54a01f80-7898-48d4-9c07-39869c129452","Type":"ContainerStarted","Data":"c51275ef2fcd78c1f633da3f18f775f6b2cc6a11f08ad89dc305b49ce42c8eda"} Apr 17 14:07:39.958918 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.958889 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" event={"ID":"dccd1bed-f8d5-4c16-968b-e828fa6150a1","Type":"ContainerStarted","Data":"cb731c13ffe16cfd583f6cd6c32d042816de5d0b0f5c9f13056f204ac502c837"} Apr 17 14:07:39.960353 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.960337 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ztcfx" event={"ID":"44d13d23-0ead-4ceb-b841-467a36463db2","Type":"ContainerStarted","Data":"668d9139087ee4ee6de6f26a54201f8230a76b51880d2cd3dc3c89856f118cdb"} Apr 17 14:07:39.961631 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.961608 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-5mtb9" event={"ID":"cb936c0a-ae1f-4ae8-825e-afab50630fa3","Type":"ContainerStarted","Data":"3421290bebe3f5e2923de512932c61fd40e656010de83a0490b9c00beceaa5b2"} Apr 17 14:07:39.963304 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.963284 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-778wr" event={"ID":"79d360cf-60bc-4bbe-ab0a-2832dd974cde","Type":"ContainerStarted","Data":"219ac65cccd85bc3a072da9b7b866881431e010b703a3288e6bbde49b7334705"} Apr 17 14:07:39.969211 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:39.969166 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-158.ec2.internal" podStartSLOduration=1.969155775 podStartE2EDuration="1.969155775s" podCreationTimestamp="2026-04-17 14:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:07:39.967948495 
+0000 UTC m=+3.604525896" watchObservedRunningTime="2026-04-17 14:07:39.969155775 +0000 UTC m=+3.605733216" Apr 17 14:07:40.445694 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:40.445655 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/497bbc82-edab-4d97-bcc8-7d428e62da1e-metrics-certs\") pod \"network-metrics-daemon-f6d89\" (UID: \"497bbc82-edab-4d97-bcc8-7d428e62da1e\") " pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:07:40.445812 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:40.445782 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:07:40.445870 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:40.445856 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/497bbc82-edab-4d97-bcc8-7d428e62da1e-metrics-certs podName:497bbc82-edab-4d97-bcc8-7d428e62da1e nodeName:}" failed. No retries permitted until 2026-04-17 14:07:42.445834765 +0000 UTC m=+6.082412142 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/497bbc82-edab-4d97-bcc8-7d428e62da1e-metrics-certs") pod "network-metrics-daemon-f6d89" (UID: "497bbc82-edab-4d97-bcc8-7d428e62da1e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:07:40.546802 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:40.546766 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-25rc5\" (UniqueName: \"kubernetes.io/projected/080d6200-63b7-4e65-8d68-ea319212caed-kube-api-access-25rc5\") pod \"network-check-target-mcl6c\" (UID: \"080d6200-63b7-4e65-8d68-ea319212caed\") " pod="openshift-network-diagnostics/network-check-target-mcl6c" Apr 17 14:07:40.547030 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:40.546992 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:07:40.547030 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:40.547015 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:07:40.547030 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:40.547028 2568 projected.go:194] Error preparing data for projected volume kube-api-access-25rc5 for pod openshift-network-diagnostics/network-check-target-mcl6c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:07:40.547190 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:40.547079 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/080d6200-63b7-4e65-8d68-ea319212caed-kube-api-access-25rc5 podName:080d6200-63b7-4e65-8d68-ea319212caed nodeName:}" failed. 
No retries permitted until 2026-04-17 14:07:42.547061581 +0000 UTC m=+6.183638957 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-25rc5" (UniqueName: "kubernetes.io/projected/080d6200-63b7-4e65-8d68-ea319212caed-kube-api-access-25rc5") pod "network-check-target-mcl6c" (UID: "080d6200-63b7-4e65-8d68-ea319212caed") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:07:40.945869 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:40.945690 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mcl6c" Apr 17 14:07:40.945869 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:40.945739 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:07:40.945869 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:40.945815 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mcl6c" podUID="080d6200-63b7-4e65-8d68-ea319212caed" Apr 17 14:07:40.946392 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:40.946254 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f6d89" podUID="497bbc82-edab-4d97-bcc8-7d428e62da1e" Apr 17 14:07:40.994196 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:40.993364 2568 generic.go:358] "Generic (PLEG): container finished" podID="46e6b5815746df0ce219a3af5d4f28a8" containerID="87bb2da3e5c1163cc28eec58f96bbe7589fbe7768392f21e6b866760c31a34bf" exitCode=0 Apr 17 14:07:40.994196 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:40.993539 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-158.ec2.internal" event={"ID":"46e6b5815746df0ce219a3af5d4f28a8","Type":"ContainerDied","Data":"87bb2da3e5c1163cc28eec58f96bbe7589fbe7768392f21e6b866760c31a34bf"} Apr 17 14:07:42.016456 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:42.016414 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-158.ec2.internal" event={"ID":"46e6b5815746df0ce219a3af5d4f28a8","Type":"ContainerStarted","Data":"112fe5fa3100aef56911bc4a5dfe50b87ea047dc80b0fd7ea61a4e5c845e5b6b"} Apr 17 14:07:42.461523 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:42.461409 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/497bbc82-edab-4d97-bcc8-7d428e62da1e-metrics-certs\") pod \"network-metrics-daemon-f6d89\" (UID: \"497bbc82-edab-4d97-bcc8-7d428e62da1e\") " pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:07:42.461706 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:42.461592 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:07:42.461706 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:42.461665 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/497bbc82-edab-4d97-bcc8-7d428e62da1e-metrics-certs 
podName:497bbc82-edab-4d97-bcc8-7d428e62da1e nodeName:}" failed. No retries permitted until 2026-04-17 14:07:46.461646282 +0000 UTC m=+10.098223662 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/497bbc82-edab-4d97-bcc8-7d428e62da1e-metrics-certs") pod "network-metrics-daemon-f6d89" (UID: "497bbc82-edab-4d97-bcc8-7d428e62da1e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:07:42.562154 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:42.562116 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-25rc5\" (UniqueName: \"kubernetes.io/projected/080d6200-63b7-4e65-8d68-ea319212caed-kube-api-access-25rc5\") pod \"network-check-target-mcl6c\" (UID: \"080d6200-63b7-4e65-8d68-ea319212caed\") " pod="openshift-network-diagnostics/network-check-target-mcl6c" Apr 17 14:07:42.562332 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:42.562290 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:07:42.562332 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:42.562309 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:07:42.562332 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:42.562321 2568 projected.go:194] Error preparing data for projected volume kube-api-access-25rc5 for pod openshift-network-diagnostics/network-check-target-mcl6c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:07:42.562488 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:42.562381 2568 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/080d6200-63b7-4e65-8d68-ea319212caed-kube-api-access-25rc5 podName:080d6200-63b7-4e65-8d68-ea319212caed nodeName:}" failed. No retries permitted until 2026-04-17 14:07:46.562361095 +0000 UTC m=+10.198938468 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-25rc5" (UniqueName: "kubernetes.io/projected/080d6200-63b7-4e65-8d68-ea319212caed-kube-api-access-25rc5") pod "network-check-target-mcl6c" (UID: "080d6200-63b7-4e65-8d68-ea319212caed") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:07:42.945712 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:42.945633 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mcl6c" Apr 17 14:07:42.945880 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:42.945769 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mcl6c" podUID="080d6200-63b7-4e65-8d68-ea319212caed" Apr 17 14:07:42.946519 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:42.946320 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:07:42.946519 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:42.946444 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f6d89" podUID="497bbc82-edab-4d97-bcc8-7d428e62da1e" Apr 17 14:07:44.947422 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:44.946626 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mcl6c" Apr 17 14:07:44.947422 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:44.946748 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mcl6c" podUID="080d6200-63b7-4e65-8d68-ea319212caed" Apr 17 14:07:44.947422 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:44.947273 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:07:44.947422 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:44.947381 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f6d89" podUID="497bbc82-edab-4d97-bcc8-7d428e62da1e" Apr 17 14:07:46.495765 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:46.495729 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/497bbc82-edab-4d97-bcc8-7d428e62da1e-metrics-certs\") pod \"network-metrics-daemon-f6d89\" (UID: \"497bbc82-edab-4d97-bcc8-7d428e62da1e\") " pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:07:46.496290 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:46.495912 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:07:46.496290 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:46.495992 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/497bbc82-edab-4d97-bcc8-7d428e62da1e-metrics-certs podName:497bbc82-edab-4d97-bcc8-7d428e62da1e nodeName:}" failed. No retries permitted until 2026-04-17 14:07:54.495972052 +0000 UTC m=+18.132549425 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/497bbc82-edab-4d97-bcc8-7d428e62da1e-metrics-certs") pod "network-metrics-daemon-f6d89" (UID: "497bbc82-edab-4d97-bcc8-7d428e62da1e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:07:46.597096 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:46.597059 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-25rc5\" (UniqueName: \"kubernetes.io/projected/080d6200-63b7-4e65-8d68-ea319212caed-kube-api-access-25rc5\") pod \"network-check-target-mcl6c\" (UID: \"080d6200-63b7-4e65-8d68-ea319212caed\") " pod="openshift-network-diagnostics/network-check-target-mcl6c" Apr 17 14:07:46.597257 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:46.597232 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:07:46.597328 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:46.597258 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:07:46.597328 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:46.597274 2568 projected.go:194] Error preparing data for projected volume kube-api-access-25rc5 for pod openshift-network-diagnostics/network-check-target-mcl6c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:07:46.597427 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:46.597329 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/080d6200-63b7-4e65-8d68-ea319212caed-kube-api-access-25rc5 podName:080d6200-63b7-4e65-8d68-ea319212caed nodeName:}" failed. 
No retries permitted until 2026-04-17 14:07:54.597314532 +0000 UTC m=+18.233891899 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-25rc5" (UniqueName: "kubernetes.io/projected/080d6200-63b7-4e65-8d68-ea319212caed-kube-api-access-25rc5") pod "network-check-target-mcl6c" (UID: "080d6200-63b7-4e65-8d68-ea319212caed") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:07:46.946700 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:46.946621 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mcl6c" Apr 17 14:07:46.946852 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:46.946739 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mcl6c" podUID="080d6200-63b7-4e65-8d68-ea319212caed" Apr 17 14:07:46.946852 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:46.946799 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:07:46.946969 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:46.946882 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f6d89" podUID="497bbc82-edab-4d97-bcc8-7d428e62da1e" Apr 17 14:07:48.945604 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:48.945564 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mcl6c" Apr 17 14:07:48.946038 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:48.945704 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mcl6c" podUID="080d6200-63b7-4e65-8d68-ea319212caed" Apr 17 14:07:48.946038 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:48.945767 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:07:48.946038 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:48.945884 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6d89" podUID="497bbc82-edab-4d97-bcc8-7d428e62da1e" Apr 17 14:07:50.945655 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:50.945616 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mcl6c" Apr 17 14:07:50.946011 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:50.945734 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mcl6c" podUID="080d6200-63b7-4e65-8d68-ea319212caed" Apr 17 14:07:50.946069 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:50.946031 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:07:50.946152 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:50.946130 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6d89" podUID="497bbc82-edab-4d97-bcc8-7d428e62da1e" Apr 17 14:07:52.947416 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:52.947384 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mcl6c" Apr 17 14:07:52.947865 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:52.947393 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:07:52.947865 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:52.947521 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mcl6c" podUID="080d6200-63b7-4e65-8d68-ea319212caed" Apr 17 14:07:52.947865 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:52.947626 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6d89" podUID="497bbc82-edab-4d97-bcc8-7d428e62da1e" Apr 17 14:07:54.563250 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:54.563202 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/497bbc82-edab-4d97-bcc8-7d428e62da1e-metrics-certs\") pod \"network-metrics-daemon-f6d89\" (UID: \"497bbc82-edab-4d97-bcc8-7d428e62da1e\") " pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:07:54.563683 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:54.563318 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:07:54.563683 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:54.563391 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/497bbc82-edab-4d97-bcc8-7d428e62da1e-metrics-certs podName:497bbc82-edab-4d97-bcc8-7d428e62da1e nodeName:}" failed. 
No retries permitted until 2026-04-17 14:08:10.563376045 +0000 UTC m=+34.199953413 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/497bbc82-edab-4d97-bcc8-7d428e62da1e-metrics-certs") pod "network-metrics-daemon-f6d89" (UID: "497bbc82-edab-4d97-bcc8-7d428e62da1e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:07:54.664167 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:54.664132 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-25rc5\" (UniqueName: \"kubernetes.io/projected/080d6200-63b7-4e65-8d68-ea319212caed-kube-api-access-25rc5\") pod \"network-check-target-mcl6c\" (UID: \"080d6200-63b7-4e65-8d68-ea319212caed\") " pod="openshift-network-diagnostics/network-check-target-mcl6c" Apr 17 14:07:54.664343 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:54.664322 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:07:54.664399 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:54.664351 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:07:54.664399 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:54.664366 2568 projected.go:194] Error preparing data for projected volume kube-api-access-25rc5 for pod openshift-network-diagnostics/network-check-target-mcl6c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:07:54.664466 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:54.664429 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/080d6200-63b7-4e65-8d68-ea319212caed-kube-api-access-25rc5 
podName:080d6200-63b7-4e65-8d68-ea319212caed nodeName:}" failed. No retries permitted until 2026-04-17 14:08:10.664409754 +0000 UTC m=+34.300987131 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-25rc5" (UniqueName: "kubernetes.io/projected/080d6200-63b7-4e65-8d68-ea319212caed-kube-api-access-25rc5") pod "network-check-target-mcl6c" (UID: "080d6200-63b7-4e65-8d68-ea319212caed") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:07:54.945286 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:54.945213 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mcl6c" Apr 17 14:07:54.945429 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:54.945315 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mcl6c" podUID="080d6200-63b7-4e65-8d68-ea319212caed" Apr 17 14:07:54.945429 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:54.945370 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:07:54.945526 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:54.945454 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f6d89" podUID="497bbc82-edab-4d97-bcc8-7d428e62da1e" Apr 17 14:07:56.945824 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:56.945413 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:07:56.946581 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:56.945417 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mcl6c" Apr 17 14:07:56.946581 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:56.945934 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6d89" podUID="497bbc82-edab-4d97-bcc8-7d428e62da1e" Apr 17 14:07:56.946581 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:56.945966 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mcl6c" podUID="080d6200-63b7-4e65-8d68-ea319212caed" Apr 17 14:07:57.040948 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:57.040917 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gjdx8" event={"ID":"fa5741d3-e0ce-42fe-9791-6fce2bd6da17","Type":"ContainerStarted","Data":"f6ab7ba3cb14d60e25e1dad3f786d445b6c64f38c7e9f6b7016ffad1f7de0303"} Apr 17 14:07:57.042380 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:57.042347 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-fmns5" event={"ID":"54a01f80-7898-48d4-9c07-39869c129452","Type":"ContainerStarted","Data":"8f1cfa76d3bcffeade4ecc926935f61d7cf31cd38454351a2aeade13d41fbccd"} Apr 17 14:07:57.044843 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:57.044817 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" event={"ID":"dccd1bed-f8d5-4c16-968b-e828fa6150a1","Type":"ContainerStarted","Data":"3dc19ee3a0bc14f90bda7a6a6471cbb506fb83c95ab32a5d099d190de3d3cffe"} Apr 17 14:07:57.044914 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:57.044853 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" event={"ID":"dccd1bed-f8d5-4c16-968b-e828fa6150a1","Type":"ContainerStarted","Data":"588dac9fbd9e3507cb8c9aee18a0c95a06f518963490c3e7acb7ced2b3da8319"} Apr 17 14:07:57.044914 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:57.044867 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" event={"ID":"dccd1bed-f8d5-4c16-968b-e828fa6150a1","Type":"ContainerStarted","Data":"cd0ae163421a4de66a833d27a71b168877ee06489070c637abff80ca3511a08a"} Apr 17 14:07:57.044914 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:57.044881 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" event={"ID":"dccd1bed-f8d5-4c16-968b-e828fa6150a1","Type":"ContainerStarted","Data":"8e701c3d81f2206e9ebd0c717ef0f708b6fd6ec9052afac57dfb159e45dce1a0"} Apr 17 14:07:57.046129 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:57.046105 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ztcfx" event={"ID":"44d13d23-0ead-4ceb-b841-467a36463db2","Type":"ContainerStarted","Data":"618e19b071fbac0e250a4d853d4f7c501b9b7f40b09c10dfc07bd50c7757f8b6"} Apr 17 14:07:57.047484 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:57.047465 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-5mtb9" event={"ID":"cb936c0a-ae1f-4ae8-825e-afab50630fa3","Type":"ContainerStarted","Data":"557e8bb8dab0a8c6b2730297b35e1bf51a92e0c85b27fe37016ca86213b1ca69"} Apr 17 14:07:57.048927 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:57.048905 2568 generic.go:358] "Generic (PLEG): container finished" podID="79d360cf-60bc-4bbe-ab0a-2832dd974cde" containerID="85a3889a68d05164852c841e1caaa22c5fbf5dddc9dc321ca4a77edf54239f74" exitCode=0 Apr 17 14:07:57.049021 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:57.048972 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-778wr" event={"ID":"79d360cf-60bc-4bbe-ab0a-2832dd974cde","Type":"ContainerDied","Data":"85a3889a68d05164852c841e1caaa22c5fbf5dddc9dc321ca4a77edf54239f74"} Apr 17 14:07:57.050308 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:57.050225 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmjrq" event={"ID":"786788f0-7365-4b9c-9628-78838c53bc50","Type":"ContainerStarted","Data":"6e6bedea67cd8e11aefbba7babbcffaf1ef463cb5534fc7728b434fc092b8cd4"} Apr 17 14:07:57.051743 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:57.051711 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mq44v" 
event={"ID":"65df7e4a-6219-433f-b614-258be054188a","Type":"ContainerStarted","Data":"24d5485db2565dcb228f4c3a070ebf0d7ce56edac77784546afa5a3db1d183ae"} Apr 17 14:07:57.059068 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:57.059027 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-fmns5" podStartSLOduration=3.269806113 podStartE2EDuration="20.059013131s" podCreationTimestamp="2026-04-17 14:07:37 +0000 UTC" firstStartedPulling="2026-04-17 14:07:39.747478723 +0000 UTC m=+3.384056091" lastFinishedPulling="2026-04-17 14:07:56.536685726 +0000 UTC m=+20.173263109" observedRunningTime="2026-04-17 14:07:57.058792924 +0000 UTC m=+20.695370326" watchObservedRunningTime="2026-04-17 14:07:57.059013131 +0000 UTC m=+20.695590747" Apr 17 14:07:57.059785 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:57.059744 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-158.ec2.internal" podStartSLOduration=19.059733239 podStartE2EDuration="19.059733239s" podCreationTimestamp="2026-04-17 14:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:07:42.032595533 +0000 UTC m=+5.669172924" watchObservedRunningTime="2026-04-17 14:07:57.059733239 +0000 UTC m=+20.696310633" Apr 17 14:07:57.073307 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:57.073269 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-cmjrq" podStartSLOduration=3.254657589 podStartE2EDuration="20.073257723s" podCreationTimestamp="2026-04-17 14:07:37 +0000 UTC" firstStartedPulling="2026-04-17 14:07:39.752831944 +0000 UTC m=+3.389409312" lastFinishedPulling="2026-04-17 14:07:56.571432078 +0000 UTC m=+20.208009446" observedRunningTime="2026-04-17 14:07:57.07288167 +0000 UTC m=+20.709459061" 
watchObservedRunningTime="2026-04-17 14:07:57.073257723 +0000 UTC m=+20.709835112" Apr 17 14:07:57.106856 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:57.106811 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-mq44v" podStartSLOduration=3.325411194 podStartE2EDuration="20.106797325s" podCreationTimestamp="2026-04-17 14:07:37 +0000 UTC" firstStartedPulling="2026-04-17 14:07:39.752918708 +0000 UTC m=+3.389496077" lastFinishedPulling="2026-04-17 14:07:56.534304826 +0000 UTC m=+20.170882208" observedRunningTime="2026-04-17 14:07:57.106497785 +0000 UTC m=+20.743075176" watchObservedRunningTime="2026-04-17 14:07:57.106797325 +0000 UTC m=+20.743374718" Apr 17 14:07:57.121421 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:57.121381 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-ztcfx" podStartSLOduration=3.312900739 podStartE2EDuration="20.121366985s" podCreationTimestamp="2026-04-17 14:07:37 +0000 UTC" firstStartedPulling="2026-04-17 14:07:39.726121812 +0000 UTC m=+3.362699179" lastFinishedPulling="2026-04-17 14:07:56.534588057 +0000 UTC m=+20.171165425" observedRunningTime="2026-04-17 14:07:57.120946777 +0000 UTC m=+20.757524168" watchObservedRunningTime="2026-04-17 14:07:57.121366985 +0000 UTC m=+20.757944374" Apr 17 14:07:57.731097 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:57.731065 2568 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 14:07:57.897805 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:57.897681 2568 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T14:07:57.731080509Z","UUID":"0924b567-d97a-409c-ac31-51c728ee707d","Handler":null,"Name":"","Endpoint":""} Apr 17 14:07:57.900577 ip-10-0-138-158 
kubenswrapper[2568]: I0417 14:07:57.900558 2568 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 14:07:57.900693 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:57.900586 2568 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 14:07:58.055165 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:58.055131 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gjdx8" event={"ID":"fa5741d3-e0ce-42fe-9791-6fce2bd6da17","Type":"ContainerStarted","Data":"fa52e0ab9db7cab60db0a81dfb93121db59398595ae339601e0d42f45d1a7e21"} Apr 17 14:07:58.058195 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:58.058166 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" event={"ID":"dccd1bed-f8d5-4c16-968b-e828fa6150a1","Type":"ContainerStarted","Data":"3bc8ca52d1543f8a7ec20b655ef65ec8cab32f78005eb2995971c21621a1066e"} Apr 17 14:07:58.058331 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:58.058202 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" event={"ID":"dccd1bed-f8d5-4c16-968b-e828fa6150a1","Type":"ContainerStarted","Data":"b1be1b9bdb81a96be4a3196a850b6f835ef7c809240f16d765b65f537efdffbb"} Apr 17 14:07:58.059541 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:58.059493 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-nczp7" event={"ID":"a2839c1e-60df-4132-aed5-549b23baa1fb","Type":"ContainerStarted","Data":"ad0d0cbc2a309b1b6aef4bb901991449c15b970ad30e3afc43c30dcd4ac67deb"} Apr 17 14:07:58.073301 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:58.073257 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-operator/iptables-alerter-nczp7" podStartSLOduration=4.26206327 podStartE2EDuration="21.073245088s" podCreationTimestamp="2026-04-17 14:07:37 +0000 UTC" firstStartedPulling="2026-04-17 14:07:39.723103342 +0000 UTC m=+3.359680713" lastFinishedPulling="2026-04-17 14:07:56.534285149 +0000 UTC m=+20.170862531" observedRunningTime="2026-04-17 14:07:58.072988717 +0000 UTC m=+21.709566108" watchObservedRunningTime="2026-04-17 14:07:58.073245088 +0000 UTC m=+21.709822478" Apr 17 14:07:58.073414 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:58.073337 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-5mtb9" podStartSLOduration=4.264140061 podStartE2EDuration="21.073332561s" podCreationTimestamp="2026-04-17 14:07:37 +0000 UTC" firstStartedPulling="2026-04-17 14:07:39.725428228 +0000 UTC m=+3.362005597" lastFinishedPulling="2026-04-17 14:07:56.534620719 +0000 UTC m=+20.171198097" observedRunningTime="2026-04-17 14:07:57.135746048 +0000 UTC m=+20.772323439" watchObservedRunningTime="2026-04-17 14:07:58.073332561 +0000 UTC m=+21.709909953" Apr 17 14:07:58.891632 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:58.891555 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-5mtb9" Apr 17 14:07:58.892292 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:58.892271 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-5mtb9" Apr 17 14:07:58.944814 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:58.944787 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mcl6c" Apr 17 14:07:58.944944 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:58.944791 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:07:58.944944 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:58.944905 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mcl6c" podUID="080d6200-63b7-4e65-8d68-ea319212caed" Apr 17 14:07:58.945026 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:07:58.944954 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6d89" podUID="497bbc82-edab-4d97-bcc8-7d428e62da1e" Apr 17 14:07:59.063238 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:59.063202 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gjdx8" event={"ID":"fa5741d3-e0ce-42fe-9791-6fce2bd6da17","Type":"ContainerStarted","Data":"f0b7b38d4bd442f489c60b4cc26c27e0bde24728eee609d5c0b305bb8c47d3c2"} Apr 17 14:07:59.079353 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:07:59.079308 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gjdx8" podStartSLOduration=3.23330589 podStartE2EDuration="22.079295013s" podCreationTimestamp="2026-04-17 14:07:37 +0000 UTC" firstStartedPulling="2026-04-17 14:07:39.752955747 +0000 UTC m=+3.389533125" lastFinishedPulling="2026-04-17 14:07:58.598944819 +0000 UTC m=+22.235522248" observedRunningTime="2026-04-17 14:07:59.079179996 +0000 UTC m=+22.715757387" 
watchObservedRunningTime="2026-04-17 14:07:59.079295013 +0000 UTC m=+22.715872404" Apr 17 14:08:00.068978 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:00.068791 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" event={"ID":"dccd1bed-f8d5-4c16-968b-e828fa6150a1","Type":"ContainerStarted","Data":"ad174d5327c33fffd27cf7267886121be5a78443c11aed154a20fd7f3fbb3530"} Apr 17 14:08:00.068978 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:00.068841 2568 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 14:08:00.463149 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:00.463076 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-5mtb9" Apr 17 14:08:00.463692 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:00.463674 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-5mtb9" Apr 17 14:08:00.944668 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:00.944585 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mcl6c" Apr 17 14:08:00.944822 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:00.944597 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:08:00.944822 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:00.944712 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mcl6c" podUID="080d6200-63b7-4e65-8d68-ea319212caed" Apr 17 14:08:00.944822 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:00.944791 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6d89" podUID="497bbc82-edab-4d97-bcc8-7d428e62da1e" Apr 17 14:08:02.075727 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:02.075482 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" event={"ID":"dccd1bed-f8d5-4c16-968b-e828fa6150a1","Type":"ContainerStarted","Data":"dcf9f6ce2c5154a30946c170f97f739273c39d0f4b5d812786bc104b0796456e"} Apr 17 14:08:02.076546 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:02.075770 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:08:02.076546 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:02.075798 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:08:02.077176 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:02.077135 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-778wr" event={"ID":"79d360cf-60bc-4bbe-ab0a-2832dd974cde","Type":"ContainerStarted","Data":"f712a2a306dd89a8994099d48f6862514cc3e6c62abe3cc8755f341909f1631c"} Apr 17 14:08:02.090105 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:02.090028 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:08:02.100914 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:02.100871 2568 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" podStartSLOduration=8.142285774 podStartE2EDuration="25.100859118s" podCreationTimestamp="2026-04-17 14:07:37 +0000 UTC" firstStartedPulling="2026-04-17 14:07:39.726546279 +0000 UTC m=+3.363123660" lastFinishedPulling="2026-04-17 14:07:56.685119621 +0000 UTC m=+20.321697004" observedRunningTime="2026-04-17 14:08:02.099610583 +0000 UTC m=+25.736187972" watchObservedRunningTime="2026-04-17 14:08:02.100859118 +0000 UTC m=+25.737436508" Apr 17 14:08:02.945628 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:02.945598 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mcl6c" Apr 17 14:08:02.945628 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:02.945616 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:08:02.945854 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:02.945716 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mcl6c" podUID="080d6200-63b7-4e65-8d68-ea319212caed" Apr 17 14:08:02.945916 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:02.945891 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f6d89" podUID="497bbc82-edab-4d97-bcc8-7d428e62da1e" Apr 17 14:08:03.080634 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:03.080597 2568 generic.go:358] "Generic (PLEG): container finished" podID="79d360cf-60bc-4bbe-ab0a-2832dd974cde" containerID="f712a2a306dd89a8994099d48f6862514cc3e6c62abe3cc8755f341909f1631c" exitCode=0 Apr 17 14:08:03.081063 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:03.080678 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-778wr" event={"ID":"79d360cf-60bc-4bbe-ab0a-2832dd974cde","Type":"ContainerDied","Data":"f712a2a306dd89a8994099d48f6862514cc3e6c62abe3cc8755f341909f1631c"} Apr 17 14:08:03.081136 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:03.081115 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:08:03.094944 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:03.094924 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:08:03.558166 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:03.557567 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-f6d89"] Apr 17 14:08:03.558166 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:03.557694 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:08:03.558166 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:03.557814 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f6d89" podUID="497bbc82-edab-4d97-bcc8-7d428e62da1e" Apr 17 14:08:03.558995 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:03.558883 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mcl6c"] Apr 17 14:08:03.559146 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:03.559019 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mcl6c" Apr 17 14:08:03.559146 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:03.559110 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mcl6c" podUID="080d6200-63b7-4e65-8d68-ea319212caed" Apr 17 14:08:04.945501 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:04.945461 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:08:04.945935 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:04.945522 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mcl6c" Apr 17 14:08:04.945935 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:04.945594 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f6d89" podUID="497bbc82-edab-4d97-bcc8-7d428e62da1e" Apr 17 14:08:04.945935 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:04.945734 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mcl6c" podUID="080d6200-63b7-4e65-8d68-ea319212caed" Apr 17 14:08:05.087540 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:05.087358 2568 generic.go:358] "Generic (PLEG): container finished" podID="79d360cf-60bc-4bbe-ab0a-2832dd974cde" containerID="cd2514e05b91d480ae13e96299b05081979997bbd938418567a355fd884de633" exitCode=0 Apr 17 14:08:05.087680 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:05.087441 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-778wr" event={"ID":"79d360cf-60bc-4bbe-ab0a-2832dd974cde","Type":"ContainerDied","Data":"cd2514e05b91d480ae13e96299b05081979997bbd938418567a355fd884de633"} Apr 17 14:08:06.945934 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:06.945904 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mcl6c" Apr 17 14:08:06.946316 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:06.945994 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mcl6c" podUID="080d6200-63b7-4e65-8d68-ea319212caed" Apr 17 14:08:06.946316 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:06.946071 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:08:06.946316 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:06.946159 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6d89" podUID="497bbc82-edab-4d97-bcc8-7d428e62da1e" Apr 17 14:08:07.092192 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:07.092162 2568 generic.go:358] "Generic (PLEG): container finished" podID="79d360cf-60bc-4bbe-ab0a-2832dd974cde" containerID="5afa5ec09cae2923bd7628933bd60374b2b623545674e138eaa87a3606f523b6" exitCode=0 Apr 17 14:08:07.092330 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:07.092207 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-778wr" event={"ID":"79d360cf-60bc-4bbe-ab0a-2832dd974cde","Type":"ContainerDied","Data":"5afa5ec09cae2923bd7628933bd60374b2b623545674e138eaa87a3606f523b6"} Apr 17 14:08:08.945204 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:08.945171 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mcl6c" Apr 17 14:08:08.945204 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:08.945190 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:08:08.945679 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:08.945281 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mcl6c" podUID="080d6200-63b7-4e65-8d68-ea319212caed" Apr 17 14:08:08.945679 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:08.945435 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6d89" podUID="497bbc82-edab-4d97-bcc8-7d428e62da1e" Apr 17 14:08:09.734573 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.734541 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-158.ec2.internal" event="NodeReady" Apr 17 14:08:09.734789 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.734687 2568 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 14:08:09.775764 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.775697 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jscrk"] Apr 17 14:08:09.793281 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.793251 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qf25s"] Apr 17 14:08:09.793428 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.793406 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-jscrk" Apr 17 14:08:09.795941 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.795714 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 14:08:09.795941 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.795738 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 14:08:09.795941 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.795795 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fnmsv\"" Apr 17 14:08:09.812733 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.812709 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jscrk"] Apr 17 14:08:09.812825 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.812738 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qf25s"] Apr 17 14:08:09.812874 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.812848 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qf25s" Apr 17 14:08:09.815120 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.815100 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 14:08:09.815220 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.815143 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 14:08:09.815220 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.815158 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 14:08:09.815220 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.815105 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-nsnr8\"" Apr 17 14:08:09.873884 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.873862 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-metrics-tls\") pod \"dns-default-jscrk\" (UID: \"2742c3d7-a7f7-4525-b1be-30a78e5cec2f\") " pod="openshift-dns/dns-default-jscrk" Apr 17 14:08:09.874021 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.873889 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-tmp-dir\") pod \"dns-default-jscrk\" (UID: \"2742c3d7-a7f7-4525-b1be-30a78e5cec2f\") " pod="openshift-dns/dns-default-jscrk" Apr 17 14:08:09.874021 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.873914 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6x8q\" (UniqueName: 
\"kubernetes.io/projected/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-kube-api-access-p6x8q\") pod \"dns-default-jscrk\" (UID: \"2742c3d7-a7f7-4525-b1be-30a78e5cec2f\") " pod="openshift-dns/dns-default-jscrk" Apr 17 14:08:09.874021 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.873978 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bk72\" (UniqueName: \"kubernetes.io/projected/b7c15494-fd82-415f-967e-b8bf2220ef8a-kube-api-access-8bk72\") pod \"ingress-canary-qf25s\" (UID: \"b7c15494-fd82-415f-967e-b8bf2220ef8a\") " pod="openshift-ingress-canary/ingress-canary-qf25s" Apr 17 14:08:09.874165 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.874035 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7c15494-fd82-415f-967e-b8bf2220ef8a-cert\") pod \"ingress-canary-qf25s\" (UID: \"b7c15494-fd82-415f-967e-b8bf2220ef8a\") " pod="openshift-ingress-canary/ingress-canary-qf25s" Apr 17 14:08:09.874165 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.874060 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-config-volume\") pod \"dns-default-jscrk\" (UID: \"2742c3d7-a7f7-4525-b1be-30a78e5cec2f\") " pod="openshift-dns/dns-default-jscrk" Apr 17 14:08:09.975167 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.975131 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-metrics-tls\") pod \"dns-default-jscrk\" (UID: \"2742c3d7-a7f7-4525-b1be-30a78e5cec2f\") " pod="openshift-dns/dns-default-jscrk" Apr 17 14:08:09.975671 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.975173 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-tmp-dir\") pod \"dns-default-jscrk\" (UID: \"2742c3d7-a7f7-4525-b1be-30a78e5cec2f\") " pod="openshift-dns/dns-default-jscrk" Apr 17 14:08:09.975671 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.975201 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6x8q\" (UniqueName: \"kubernetes.io/projected/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-kube-api-access-p6x8q\") pod \"dns-default-jscrk\" (UID: \"2742c3d7-a7f7-4525-b1be-30a78e5cec2f\") " pod="openshift-dns/dns-default-jscrk" Apr 17 14:08:09.975671 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.975237 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bk72\" (UniqueName: \"kubernetes.io/projected/b7c15494-fd82-415f-967e-b8bf2220ef8a-kube-api-access-8bk72\") pod \"ingress-canary-qf25s\" (UID: \"b7c15494-fd82-415f-967e-b8bf2220ef8a\") " pod="openshift-ingress-canary/ingress-canary-qf25s" Apr 17 14:08:09.975671 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.975269 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7c15494-fd82-415f-967e-b8bf2220ef8a-cert\") pod \"ingress-canary-qf25s\" (UID: \"b7c15494-fd82-415f-967e-b8bf2220ef8a\") " pod="openshift-ingress-canary/ingress-canary-qf25s" Apr 17 14:08:09.975671 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.975291 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-config-volume\") pod \"dns-default-jscrk\" (UID: \"2742c3d7-a7f7-4525-b1be-30a78e5cec2f\") " pod="openshift-dns/dns-default-jscrk" Apr 17 14:08:09.975671 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:09.975290 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret 
"dns-default-metrics-tls" not found Apr 17 14:08:09.975671 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:09.975378 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-metrics-tls podName:2742c3d7-a7f7-4525-b1be-30a78e5cec2f nodeName:}" failed. No retries permitted until 2026-04-17 14:08:10.47535726 +0000 UTC m=+34.111934629 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-metrics-tls") pod "dns-default-jscrk" (UID: "2742c3d7-a7f7-4525-b1be-30a78e5cec2f") : secret "dns-default-metrics-tls" not found Apr 17 14:08:09.975671 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.975525 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-tmp-dir\") pod \"dns-default-jscrk\" (UID: \"2742c3d7-a7f7-4525-b1be-30a78e5cec2f\") " pod="openshift-dns/dns-default-jscrk" Apr 17 14:08:09.975671 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:09.975619 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:08:09.976020 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:09.975688 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7c15494-fd82-415f-967e-b8bf2220ef8a-cert podName:b7c15494-fd82-415f-967e-b8bf2220ef8a nodeName:}" failed. No retries permitted until 2026-04-17 14:08:10.47567133 +0000 UTC m=+34.112248702 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7c15494-fd82-415f-967e-b8bf2220ef8a-cert") pod "ingress-canary-qf25s" (UID: "b7c15494-fd82-415f-967e-b8bf2220ef8a") : secret "canary-serving-cert" not found Apr 17 14:08:09.976020 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.975804 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-config-volume\") pod \"dns-default-jscrk\" (UID: \"2742c3d7-a7f7-4525-b1be-30a78e5cec2f\") " pod="openshift-dns/dns-default-jscrk" Apr 17 14:08:09.985845 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.985793 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6x8q\" (UniqueName: \"kubernetes.io/projected/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-kube-api-access-p6x8q\") pod \"dns-default-jscrk\" (UID: \"2742c3d7-a7f7-4525-b1be-30a78e5cec2f\") " pod="openshift-dns/dns-default-jscrk" Apr 17 14:08:09.985945 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:09.985866 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bk72\" (UniqueName: \"kubernetes.io/projected/b7c15494-fd82-415f-967e-b8bf2220ef8a-kube-api-access-8bk72\") pod \"ingress-canary-qf25s\" (UID: \"b7c15494-fd82-415f-967e-b8bf2220ef8a\") " pod="openshift-ingress-canary/ingress-canary-qf25s" Apr 17 14:08:10.477792 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:10.477752 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-metrics-tls\") pod \"dns-default-jscrk\" (UID: \"2742c3d7-a7f7-4525-b1be-30a78e5cec2f\") " pod="openshift-dns/dns-default-jscrk" Apr 17 14:08:10.477994 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:10.477808 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/b7c15494-fd82-415f-967e-b8bf2220ef8a-cert\") pod \"ingress-canary-qf25s\" (UID: \"b7c15494-fd82-415f-967e-b8bf2220ef8a\") " pod="openshift-ingress-canary/ingress-canary-qf25s" Apr 17 14:08:10.477994 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:10.477921 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:08:10.478093 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:10.478001 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-metrics-tls podName:2742c3d7-a7f7-4525-b1be-30a78e5cec2f nodeName:}" failed. No retries permitted until 2026-04-17 14:08:11.47798063 +0000 UTC m=+35.114558007 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-metrics-tls") pod "dns-default-jscrk" (UID: "2742c3d7-a7f7-4525-b1be-30a78e5cec2f") : secret "dns-default-metrics-tls" not found Apr 17 14:08:10.478093 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:10.477929 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:08:10.478093 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:10.478064 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7c15494-fd82-415f-967e-b8bf2220ef8a-cert podName:b7c15494-fd82-415f-967e-b8bf2220ef8a nodeName:}" failed. No retries permitted until 2026-04-17 14:08:11.478047264 +0000 UTC m=+35.114624633 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7c15494-fd82-415f-967e-b8bf2220ef8a-cert") pod "ingress-canary-qf25s" (UID: "b7c15494-fd82-415f-967e-b8bf2220ef8a") : secret "canary-serving-cert" not found Apr 17 14:08:10.578260 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:10.578216 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/497bbc82-edab-4d97-bcc8-7d428e62da1e-metrics-certs\") pod \"network-metrics-daemon-f6d89\" (UID: \"497bbc82-edab-4d97-bcc8-7d428e62da1e\") " pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:08:10.578431 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:10.578369 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:08:10.578476 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:10.578439 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/497bbc82-edab-4d97-bcc8-7d428e62da1e-metrics-certs podName:497bbc82-edab-4d97-bcc8-7d428e62da1e nodeName:}" failed. No retries permitted until 2026-04-17 14:08:42.578418616 +0000 UTC m=+66.214995989 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/497bbc82-edab-4d97-bcc8-7d428e62da1e-metrics-certs") pod "network-metrics-daemon-f6d89" (UID: "497bbc82-edab-4d97-bcc8-7d428e62da1e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:08:10.679012 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:10.678977 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-25rc5\" (UniqueName: \"kubernetes.io/projected/080d6200-63b7-4e65-8d68-ea319212caed-kube-api-access-25rc5\") pod \"network-check-target-mcl6c\" (UID: \"080d6200-63b7-4e65-8d68-ea319212caed\") " pod="openshift-network-diagnostics/network-check-target-mcl6c" Apr 17 14:08:10.679170 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:10.679136 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:08:10.679170 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:10.679154 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:08:10.679170 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:10.679165 2568 projected.go:194] Error preparing data for projected volume kube-api-access-25rc5 for pod openshift-network-diagnostics/network-check-target-mcl6c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:08:10.679266 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:10.679221 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/080d6200-63b7-4e65-8d68-ea319212caed-kube-api-access-25rc5 podName:080d6200-63b7-4e65-8d68-ea319212caed nodeName:}" failed. 
No retries permitted until 2026-04-17 14:08:42.679204041 +0000 UTC m=+66.315781424 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-25rc5" (UniqueName: "kubernetes.io/projected/080d6200-63b7-4e65-8d68-ea319212caed-kube-api-access-25rc5") pod "network-check-target-mcl6c" (UID: "080d6200-63b7-4e65-8d68-ea319212caed") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:08:10.944889 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:10.944854 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:08:10.945076 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:10.944854 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mcl6c" Apr 17 14:08:10.947405 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:10.947385 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 14:08:10.948268 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:10.948250 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x9vkl\"" Apr 17 14:08:10.948381 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:10.948252 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 14:08:10.948583 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:10.948567 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bfspf\"" Apr 17 14:08:10.948656 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:10.948641 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 14:08:11.485814 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:11.485775 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-metrics-tls\") pod \"dns-default-jscrk\" (UID: \"2742c3d7-a7f7-4525-b1be-30a78e5cec2f\") " pod="openshift-dns/dns-default-jscrk" Apr 17 14:08:11.486416 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:11.485847 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7c15494-fd82-415f-967e-b8bf2220ef8a-cert\") pod \"ingress-canary-qf25s\" (UID: \"b7c15494-fd82-415f-967e-b8bf2220ef8a\") " pod="openshift-ingress-canary/ingress-canary-qf25s" Apr 17 14:08:11.486416 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:11.485951 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:08:11.486416 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:11.486019 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7c15494-fd82-415f-967e-b8bf2220ef8a-cert podName:b7c15494-fd82-415f-967e-b8bf2220ef8a nodeName:}" failed. No retries permitted until 2026-04-17 14:08:13.486004495 +0000 UTC m=+37.122581876 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7c15494-fd82-415f-967e-b8bf2220ef8a-cert") pod "ingress-canary-qf25s" (UID: "b7c15494-fd82-415f-967e-b8bf2220ef8a") : secret "canary-serving-cert" not found Apr 17 14:08:11.486416 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:11.485952 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:08:11.486416 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:11.486093 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-metrics-tls podName:2742c3d7-a7f7-4525-b1be-30a78e5cec2f nodeName:}" failed. No retries permitted until 2026-04-17 14:08:13.486079218 +0000 UTC m=+37.122656610 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-metrics-tls") pod "dns-default-jscrk" (UID: "2742c3d7-a7f7-4525-b1be-30a78e5cec2f") : secret "dns-default-metrics-tls" not found Apr 17 14:08:13.501391 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:13.501226 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7c15494-fd82-415f-967e-b8bf2220ef8a-cert\") pod \"ingress-canary-qf25s\" (UID: \"b7c15494-fd82-415f-967e-b8bf2220ef8a\") " pod="openshift-ingress-canary/ingress-canary-qf25s" Apr 17 14:08:13.501752 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:13.501431 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-metrics-tls\") pod \"dns-default-jscrk\" (UID: \"2742c3d7-a7f7-4525-b1be-30a78e5cec2f\") " pod="openshift-dns/dns-default-jscrk" Apr 17 14:08:13.501752 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:13.501367 2568 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:08:13.501752 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:13.501524 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7c15494-fd82-415f-967e-b8bf2220ef8a-cert podName:b7c15494-fd82-415f-967e-b8bf2220ef8a nodeName:}" failed. No retries permitted until 2026-04-17 14:08:17.501493538 +0000 UTC m=+41.138070911 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7c15494-fd82-415f-967e-b8bf2220ef8a-cert") pod "ingress-canary-qf25s" (UID: "b7c15494-fd82-415f-967e-b8bf2220ef8a") : secret "canary-serving-cert" not found Apr 17 14:08:13.501752 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:13.501527 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:08:13.501752 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:13.501574 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-metrics-tls podName:2742c3d7-a7f7-4525-b1be-30a78e5cec2f nodeName:}" failed. No retries permitted until 2026-04-17 14:08:17.501563061 +0000 UTC m=+41.138140430 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-metrics-tls") pod "dns-default-jscrk" (UID: "2742c3d7-a7f7-4525-b1be-30a78e5cec2f") : secret "dns-default-metrics-tls" not found Apr 17 14:08:14.106900 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:14.106863 2568 generic.go:358] "Generic (PLEG): container finished" podID="79d360cf-60bc-4bbe-ab0a-2832dd974cde" containerID="4a09180e4a5ba49336e69ee6f76cadaac75d734f2ec751cee5072734263f785b" exitCode=0 Apr 17 14:08:14.107050 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:14.106927 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-778wr" event={"ID":"79d360cf-60bc-4bbe-ab0a-2832dd974cde","Type":"ContainerDied","Data":"4a09180e4a5ba49336e69ee6f76cadaac75d734f2ec751cee5072734263f785b"} Apr 17 14:08:15.110952 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:15.110916 2568 generic.go:358] "Generic (PLEG): container finished" podID="79d360cf-60bc-4bbe-ab0a-2832dd974cde" containerID="75401b3845e526387eb29c465054e153910f073e0695b5b73763d09706128415" exitCode=0 Apr 17 14:08:15.111323 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:15.110984 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-778wr" event={"ID":"79d360cf-60bc-4bbe-ab0a-2832dd974cde","Type":"ContainerDied","Data":"75401b3845e526387eb29c465054e153910f073e0695b5b73763d09706128415"} Apr 17 14:08:16.115722 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:16.115682 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-778wr" event={"ID":"79d360cf-60bc-4bbe-ab0a-2832dd974cde","Type":"ContainerStarted","Data":"7dec3dcc7ba091b8445e17a7c7cc293d8a67b8bc61ab1df4b0226ebc34d8b0f3"} Apr 17 14:08:16.135857 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:16.135808 2568 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/multus-additional-cni-plugins-778wr" podStartSLOduration=5.703806396 podStartE2EDuration="39.135790833s" podCreationTimestamp="2026-04-17 14:07:37 +0000 UTC" firstStartedPulling="2026-04-17 14:07:39.722886327 +0000 UTC m=+3.359463698" lastFinishedPulling="2026-04-17 14:08:13.154870766 +0000 UTC m=+36.791448135" observedRunningTime="2026-04-17 14:08:16.134368476 +0000 UTC m=+39.770945866" watchObservedRunningTime="2026-04-17 14:08:16.135790833 +0000 UTC m=+39.772368224" Apr 17 14:08:17.533188 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:17.533145 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7c15494-fd82-415f-967e-b8bf2220ef8a-cert\") pod \"ingress-canary-qf25s\" (UID: \"b7c15494-fd82-415f-967e-b8bf2220ef8a\") " pod="openshift-ingress-canary/ingress-canary-qf25s" Apr 17 14:08:17.533712 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:17.533211 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-metrics-tls\") pod \"dns-default-jscrk\" (UID: \"2742c3d7-a7f7-4525-b1be-30a78e5cec2f\") " pod="openshift-dns/dns-default-jscrk" Apr 17 14:08:17.533712 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:17.533298 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:08:17.533712 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:17.533308 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:08:17.533712 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:17.533361 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-metrics-tls podName:2742c3d7-a7f7-4525-b1be-30a78e5cec2f nodeName:}" failed. 
No retries permitted until 2026-04-17 14:08:25.533346627 +0000 UTC m=+49.169923995 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-metrics-tls") pod "dns-default-jscrk" (UID: "2742c3d7-a7f7-4525-b1be-30a78e5cec2f") : secret "dns-default-metrics-tls" not found Apr 17 14:08:17.533712 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:17.533376 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7c15494-fd82-415f-967e-b8bf2220ef8a-cert podName:b7c15494-fd82-415f-967e-b8bf2220ef8a nodeName:}" failed. No retries permitted until 2026-04-17 14:08:25.533369538 +0000 UTC m=+49.169946905 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7c15494-fd82-415f-967e-b8bf2220ef8a-cert") pod "ingress-canary-qf25s" (UID: "b7c15494-fd82-415f-967e-b8bf2220ef8a") : secret "canary-serving-cert" not found Apr 17 14:08:25.587818 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:25.587771 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-metrics-tls\") pod \"dns-default-jscrk\" (UID: \"2742c3d7-a7f7-4525-b1be-30a78e5cec2f\") " pod="openshift-dns/dns-default-jscrk" Apr 17 14:08:25.588238 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:25.587834 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7c15494-fd82-415f-967e-b8bf2220ef8a-cert\") pod \"ingress-canary-qf25s\" (UID: \"b7c15494-fd82-415f-967e-b8bf2220ef8a\") " pod="openshift-ingress-canary/ingress-canary-qf25s" Apr 17 14:08:25.588238 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:25.587933 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:08:25.588238 
ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:25.588005 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7c15494-fd82-415f-967e-b8bf2220ef8a-cert podName:b7c15494-fd82-415f-967e-b8bf2220ef8a nodeName:}" failed. No retries permitted until 2026-04-17 14:08:41.587989806 +0000 UTC m=+65.224567174 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7c15494-fd82-415f-967e-b8bf2220ef8a-cert") pod "ingress-canary-qf25s" (UID: "b7c15494-fd82-415f-967e-b8bf2220ef8a") : secret "canary-serving-cert" not found Apr 17 14:08:25.588238 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:25.587935 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:08:25.588238 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:25.588101 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-metrics-tls podName:2742c3d7-a7f7-4525-b1be-30a78e5cec2f nodeName:}" failed. No retries permitted until 2026-04-17 14:08:41.58808145 +0000 UTC m=+65.224658836 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-metrics-tls") pod "dns-default-jscrk" (UID: "2742c3d7-a7f7-4525-b1be-30a78e5cec2f") : secret "dns-default-metrics-tls" not found Apr 17 14:08:35.101200 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:35.101172 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-brxr6" Apr 17 14:08:41.598170 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:41.598122 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7c15494-fd82-415f-967e-b8bf2220ef8a-cert\") pod \"ingress-canary-qf25s\" (UID: \"b7c15494-fd82-415f-967e-b8bf2220ef8a\") " pod="openshift-ingress-canary/ingress-canary-qf25s" Apr 17 14:08:41.598568 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:41.598216 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-metrics-tls\") pod \"dns-default-jscrk\" (UID: \"2742c3d7-a7f7-4525-b1be-30a78e5cec2f\") " pod="openshift-dns/dns-default-jscrk" Apr 17 14:08:41.598568 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:41.598287 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:08:41.598568 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:41.598304 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:08:41.598568 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:41.598353 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7c15494-fd82-415f-967e-b8bf2220ef8a-cert podName:b7c15494-fd82-415f-967e-b8bf2220ef8a nodeName:}" failed. 
No retries permitted until 2026-04-17 14:09:13.598338797 +0000 UTC m=+97.234916168 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7c15494-fd82-415f-967e-b8bf2220ef8a-cert") pod "ingress-canary-qf25s" (UID: "b7c15494-fd82-415f-967e-b8bf2220ef8a") : secret "canary-serving-cert" not found Apr 17 14:08:41.598568 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:41.598367 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-metrics-tls podName:2742c3d7-a7f7-4525-b1be-30a78e5cec2f nodeName:}" failed. No retries permitted until 2026-04-17 14:09:13.598361873 +0000 UTC m=+97.234939240 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-metrics-tls") pod "dns-default-jscrk" (UID: "2742c3d7-a7f7-4525-b1be-30a78e5cec2f") : secret "dns-default-metrics-tls" not found Apr 17 14:08:42.605501 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:42.605453 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/497bbc82-edab-4d97-bcc8-7d428e62da1e-metrics-certs\") pod \"network-metrics-daemon-f6d89\" (UID: \"497bbc82-edab-4d97-bcc8-7d428e62da1e\") " pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:08:42.607906 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:42.607887 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 14:08:42.615835 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:42.615817 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 14:08:42.615902 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:08:42.615881 2568 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/497bbc82-edab-4d97-bcc8-7d428e62da1e-metrics-certs podName:497bbc82-edab-4d97-bcc8-7d428e62da1e nodeName:}" failed. No retries permitted until 2026-04-17 14:09:46.6158665 +0000 UTC m=+130.252443868 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/497bbc82-edab-4d97-bcc8-7d428e62da1e-metrics-certs") pod "network-metrics-daemon-f6d89" (UID: "497bbc82-edab-4d97-bcc8-7d428e62da1e") : secret "metrics-daemon-secret" not found Apr 17 14:08:42.706265 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:42.706219 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-25rc5\" (UniqueName: \"kubernetes.io/projected/080d6200-63b7-4e65-8d68-ea319212caed-kube-api-access-25rc5\") pod \"network-check-target-mcl6c\" (UID: \"080d6200-63b7-4e65-8d68-ea319212caed\") " pod="openshift-network-diagnostics/network-check-target-mcl6c" Apr 17 14:08:42.708696 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:42.708677 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 14:08:42.719311 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:42.719291 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 14:08:42.730842 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:42.730824 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-25rc5\" (UniqueName: \"kubernetes.io/projected/080d6200-63b7-4e65-8d68-ea319212caed-kube-api-access-25rc5\") pod \"network-check-target-mcl6c\" (UID: \"080d6200-63b7-4e65-8d68-ea319212caed\") " pod="openshift-network-diagnostics/network-check-target-mcl6c" Apr 17 14:08:42.764133 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:42.764110 2568 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x9vkl\"" Apr 17 14:08:42.772807 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:42.772791 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mcl6c" Apr 17 14:08:42.895295 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:42.895108 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mcl6c"] Apr 17 14:08:42.898241 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:08:42.898213 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod080d6200_63b7_4e65_8d68_ea319212caed.slice/crio-fb402b7f6d7ed82b2d825a3980a5e3a08623cff1c0eca43363497c4281dfa267 WatchSource:0}: Error finding container fb402b7f6d7ed82b2d825a3980a5e3a08623cff1c0eca43363497c4281dfa267: Status 404 returned error can't find the container with id fb402b7f6d7ed82b2d825a3980a5e3a08623cff1c0eca43363497c4281dfa267 Apr 17 14:08:43.164568 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:43.164533 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mcl6c" event={"ID":"080d6200-63b7-4e65-8d68-ea319212caed","Type":"ContainerStarted","Data":"fb402b7f6d7ed82b2d825a3980a5e3a08623cff1c0eca43363497c4281dfa267"} Apr 17 14:08:46.172051 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:46.172022 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mcl6c" event={"ID":"080d6200-63b7-4e65-8d68-ea319212caed","Type":"ContainerStarted","Data":"714974d62212fd6f7d29b4516d6d13b452a6bbeb0f3e903bed49a897bcdb0ccc"} Apr 17 14:08:46.172490 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:46.172163 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-mcl6c" Apr 17 
14:08:46.187395 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:08:46.187351 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-mcl6c" podStartSLOduration=67.529615747 podStartE2EDuration="1m10.187338738s" podCreationTimestamp="2026-04-17 14:07:36 +0000 UTC" firstStartedPulling="2026-04-17 14:08:42.900533944 +0000 UTC m=+66.537111312" lastFinishedPulling="2026-04-17 14:08:45.558256936 +0000 UTC m=+69.194834303" observedRunningTime="2026-04-17 14:08:46.18629177 +0000 UTC m=+69.822869161" watchObservedRunningTime="2026-04-17 14:08:46.187338738 +0000 UTC m=+69.823916161" Apr 17 14:09:12.171299 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.171266 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-684d7b5578-8dvb5"] Apr 17 14:09:12.175353 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.175337 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-684d7b5578-8dvb5" Apr 17 14:09:12.177805 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.177771 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 17 14:09:12.177805 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.177792 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 14:09:12.177805 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.177805 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 17 14:09:12.178008 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.177863 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 14:09:12.178008 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.177792 2568 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-4crr7\"" Apr 17 14:09:12.178008 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.177965 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 17 14:09:12.178008 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.178007 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 17 14:09:12.185914 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.185824 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-684d7b5578-8dvb5"] Apr 17 14:09:12.206534 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.206501 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/99c53420-8717-4539-b474-bdf6d9f5615a-default-certificate\") pod \"router-default-684d7b5578-8dvb5\" (UID: \"99c53420-8717-4539-b474-bdf6d9f5615a\") " pod="openshift-ingress/router-default-684d7b5578-8dvb5" Apr 17 14:09:12.206644 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.206547 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/99c53420-8717-4539-b474-bdf6d9f5615a-stats-auth\") pod \"router-default-684d7b5578-8dvb5\" (UID: \"99c53420-8717-4539-b474-bdf6d9f5615a\") " pod="openshift-ingress/router-default-684d7b5578-8dvb5" Apr 17 14:09:12.206644 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.206564 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99c53420-8717-4539-b474-bdf6d9f5615a-metrics-certs\") pod \"router-default-684d7b5578-8dvb5\" (UID: \"99c53420-8717-4539-b474-bdf6d9f5615a\") " 
pod="openshift-ingress/router-default-684d7b5578-8dvb5"
Apr 17 14:09:12.206736 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.206654 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5c88\" (UniqueName: \"kubernetes.io/projected/99c53420-8717-4539-b474-bdf6d9f5615a-kube-api-access-c5c88\") pod \"router-default-684d7b5578-8dvb5\" (UID: \"99c53420-8717-4539-b474-bdf6d9f5615a\") " pod="openshift-ingress/router-default-684d7b5578-8dvb5"
Apr 17 14:09:12.206736 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.206695 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99c53420-8717-4539-b474-bdf6d9f5615a-service-ca-bundle\") pod \"router-default-684d7b5578-8dvb5\" (UID: \"99c53420-8717-4539-b474-bdf6d9f5615a\") " pod="openshift-ingress/router-default-684d7b5578-8dvb5"
Apr 17 14:09:12.272911 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.272887 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-rktb8"]
Apr 17 14:09:12.275582 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.275568 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-rktb8"
Apr 17 14:09:12.277778 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.277758 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 17 14:09:12.277778 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.277774 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 17 14:09:12.277901 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.277766 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 14:09:12.278103 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.278086 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 14:09:12.278170 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.278155 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-v5nhj\""
Apr 17 14:09:12.282076 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.281918 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 17 14:09:12.283209 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.283190 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-rktb8"]
Apr 17 14:09:12.307805 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.307777 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99c53420-8717-4539-b474-bdf6d9f5615a-service-ca-bundle\") pod \"router-default-684d7b5578-8dvb5\" (UID: \"99c53420-8717-4539-b474-bdf6d9f5615a\") " pod="openshift-ingress/router-default-684d7b5578-8dvb5"
Apr 17 14:09:12.307887 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.307809 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/99c53420-8717-4539-b474-bdf6d9f5615a-default-certificate\") pod \"router-default-684d7b5578-8dvb5\" (UID: \"99c53420-8717-4539-b474-bdf6d9f5615a\") " pod="openshift-ingress/router-default-684d7b5578-8dvb5"
Apr 17 14:09:12.307887 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.307833 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61e203fc-fb18-4d79-ace2-a4218ed4e9d9-serving-cert\") pod \"insights-operator-585dfdc468-rktb8\" (UID: \"61e203fc-fb18-4d79-ace2-a4218ed4e9d9\") " pod="openshift-insights/insights-operator-585dfdc468-rktb8"
Apr 17 14:09:12.307887 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.307857 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/99c53420-8717-4539-b474-bdf6d9f5615a-stats-auth\") pod \"router-default-684d7b5578-8dvb5\" (UID: \"99c53420-8717-4539-b474-bdf6d9f5615a\") " pod="openshift-ingress/router-default-684d7b5578-8dvb5"
Apr 17 14:09:12.307989 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:09:12.307942 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/99c53420-8717-4539-b474-bdf6d9f5615a-service-ca-bundle podName:99c53420-8717-4539-b474-bdf6d9f5615a nodeName:}" failed. No retries permitted until 2026-04-17 14:09:12.807924957 +0000 UTC m=+96.444502351 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/99c53420-8717-4539-b474-bdf6d9f5615a-service-ca-bundle") pod "router-default-684d7b5578-8dvb5" (UID: "99c53420-8717-4539-b474-bdf6d9f5615a") : configmap references non-existent config key: service-ca.crt
Apr 17 14:09:12.307989 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.307970 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99c53420-8717-4539-b474-bdf6d9f5615a-metrics-certs\") pod \"router-default-684d7b5578-8dvb5\" (UID: \"99c53420-8717-4539-b474-bdf6d9f5615a\") " pod="openshift-ingress/router-default-684d7b5578-8dvb5"
Apr 17 14:09:12.308057 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.307999 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61e203fc-fb18-4d79-ace2-a4218ed4e9d9-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-rktb8\" (UID: \"61e203fc-fb18-4d79-ace2-a4218ed4e9d9\") " pod="openshift-insights/insights-operator-585dfdc468-rktb8"
Apr 17 14:09:12.308057 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.308018 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/61e203fc-fb18-4d79-ace2-a4218ed4e9d9-tmp\") pod \"insights-operator-585dfdc468-rktb8\" (UID: \"61e203fc-fb18-4d79-ace2-a4218ed4e9d9\") " pod="openshift-insights/insights-operator-585dfdc468-rktb8"
Apr 17 14:09:12.308057 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.308036 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5cvd\" (UniqueName: \"kubernetes.io/projected/61e203fc-fb18-4d79-ace2-a4218ed4e9d9-kube-api-access-f5cvd\") pod \"insights-operator-585dfdc468-rktb8\" (UID: \"61e203fc-fb18-4d79-ace2-a4218ed4e9d9\") " pod="openshift-insights/insights-operator-585dfdc468-rktb8"
Apr 17 14:09:12.308159 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.308080 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61e203fc-fb18-4d79-ace2-a4218ed4e9d9-service-ca-bundle\") pod \"insights-operator-585dfdc468-rktb8\" (UID: \"61e203fc-fb18-4d79-ace2-a4218ed4e9d9\") " pod="openshift-insights/insights-operator-585dfdc468-rktb8"
Apr 17 14:09:12.308159 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.308106 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5c88\" (UniqueName: \"kubernetes.io/projected/99c53420-8717-4539-b474-bdf6d9f5615a-kube-api-access-c5c88\") pod \"router-default-684d7b5578-8dvb5\" (UID: \"99c53420-8717-4539-b474-bdf6d9f5615a\") " pod="openshift-ingress/router-default-684d7b5578-8dvb5"
Apr 17 14:09:12.308159 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:09:12.308114 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 14:09:12.308159 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.308129 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/61e203fc-fb18-4d79-ace2-a4218ed4e9d9-snapshots\") pod \"insights-operator-585dfdc468-rktb8\" (UID: \"61e203fc-fb18-4d79-ace2-a4218ed4e9d9\") " pod="openshift-insights/insights-operator-585dfdc468-rktb8"
Apr 17 14:09:12.308307 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:09:12.308176 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99c53420-8717-4539-b474-bdf6d9f5615a-metrics-certs podName:99c53420-8717-4539-b474-bdf6d9f5615a nodeName:}" failed. No retries permitted until 2026-04-17 14:09:12.808159959 +0000 UTC m=+96.444737330 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99c53420-8717-4539-b474-bdf6d9f5615a-metrics-certs") pod "router-default-684d7b5578-8dvb5" (UID: "99c53420-8717-4539-b474-bdf6d9f5615a") : secret "router-metrics-certs-default" not found
Apr 17 14:09:12.310205 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.310185 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/99c53420-8717-4539-b474-bdf6d9f5615a-stats-auth\") pod \"router-default-684d7b5578-8dvb5\" (UID: \"99c53420-8717-4539-b474-bdf6d9f5615a\") " pod="openshift-ingress/router-default-684d7b5578-8dvb5"
Apr 17 14:09:12.310301 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.310211 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/99c53420-8717-4539-b474-bdf6d9f5615a-default-certificate\") pod \"router-default-684d7b5578-8dvb5\" (UID: \"99c53420-8717-4539-b474-bdf6d9f5615a\") " pod="openshift-ingress/router-default-684d7b5578-8dvb5"
Apr 17 14:09:12.315195 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.315175 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5c88\" (UniqueName: \"kubernetes.io/projected/99c53420-8717-4539-b474-bdf6d9f5615a-kube-api-access-c5c88\") pod \"router-default-684d7b5578-8dvb5\" (UID: \"99c53420-8717-4539-b474-bdf6d9f5615a\") " pod="openshift-ingress/router-default-684d7b5578-8dvb5"
Apr 17 14:09:12.408975 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.408938 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/61e203fc-fb18-4d79-ace2-a4218ed4e9d9-snapshots\") pod \"insights-operator-585dfdc468-rktb8\" (UID: \"61e203fc-fb18-4d79-ace2-a4218ed4e9d9\") " pod="openshift-insights/insights-operator-585dfdc468-rktb8"
Apr 17 14:09:12.409160 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.409018 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61e203fc-fb18-4d79-ace2-a4218ed4e9d9-serving-cert\") pod \"insights-operator-585dfdc468-rktb8\" (UID: \"61e203fc-fb18-4d79-ace2-a4218ed4e9d9\") " pod="openshift-insights/insights-operator-585dfdc468-rktb8"
Apr 17 14:09:12.409160 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.409051 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61e203fc-fb18-4d79-ace2-a4218ed4e9d9-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-rktb8\" (UID: \"61e203fc-fb18-4d79-ace2-a4218ed4e9d9\") " pod="openshift-insights/insights-operator-585dfdc468-rktb8"
Apr 17 14:09:12.409160 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.409069 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/61e203fc-fb18-4d79-ace2-a4218ed4e9d9-tmp\") pod \"insights-operator-585dfdc468-rktb8\" (UID: \"61e203fc-fb18-4d79-ace2-a4218ed4e9d9\") " pod="openshift-insights/insights-operator-585dfdc468-rktb8"
Apr 17 14:09:12.409160 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.409086 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5cvd\" (UniqueName: \"kubernetes.io/projected/61e203fc-fb18-4d79-ace2-a4218ed4e9d9-kube-api-access-f5cvd\") pod \"insights-operator-585dfdc468-rktb8\" (UID: \"61e203fc-fb18-4d79-ace2-a4218ed4e9d9\") " pod="openshift-insights/insights-operator-585dfdc468-rktb8"
Apr 17 14:09:12.409160 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.409138 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61e203fc-fb18-4d79-ace2-a4218ed4e9d9-service-ca-bundle\") pod \"insights-operator-585dfdc468-rktb8\" (UID: \"61e203fc-fb18-4d79-ace2-a4218ed4e9d9\") " pod="openshift-insights/insights-operator-585dfdc468-rktb8"
Apr 17 14:09:12.409659 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.409638 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/61e203fc-fb18-4d79-ace2-a4218ed4e9d9-tmp\") pod \"insights-operator-585dfdc468-rktb8\" (UID: \"61e203fc-fb18-4d79-ace2-a4218ed4e9d9\") " pod="openshift-insights/insights-operator-585dfdc468-rktb8"
Apr 17 14:09:12.409762 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.409704 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61e203fc-fb18-4d79-ace2-a4218ed4e9d9-service-ca-bundle\") pod \"insights-operator-585dfdc468-rktb8\" (UID: \"61e203fc-fb18-4d79-ace2-a4218ed4e9d9\") " pod="openshift-insights/insights-operator-585dfdc468-rktb8"
Apr 17 14:09:12.409856 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.409834 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/61e203fc-fb18-4d79-ace2-a4218ed4e9d9-snapshots\") pod \"insights-operator-585dfdc468-rktb8\" (UID: \"61e203fc-fb18-4d79-ace2-a4218ed4e9d9\") " pod="openshift-insights/insights-operator-585dfdc468-rktb8"
Apr 17 14:09:12.410222 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.410205 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61e203fc-fb18-4d79-ace2-a4218ed4e9d9-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-rktb8\" (UID: \"61e203fc-fb18-4d79-ace2-a4218ed4e9d9\") " pod="openshift-insights/insights-operator-585dfdc468-rktb8"
Apr 17 14:09:12.411310 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.411292 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61e203fc-fb18-4d79-ace2-a4218ed4e9d9-serving-cert\") pod \"insights-operator-585dfdc468-rktb8\" (UID: \"61e203fc-fb18-4d79-ace2-a4218ed4e9d9\") " pod="openshift-insights/insights-operator-585dfdc468-rktb8"
Apr 17 14:09:12.417639 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.417608 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5cvd\" (UniqueName: \"kubernetes.io/projected/61e203fc-fb18-4d79-ace2-a4218ed4e9d9-kube-api-access-f5cvd\") pod \"insights-operator-585dfdc468-rktb8\" (UID: \"61e203fc-fb18-4d79-ace2-a4218ed4e9d9\") " pod="openshift-insights/insights-operator-585dfdc468-rktb8"
Apr 17 14:09:12.585120 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.585025 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-rktb8"
Apr 17 14:09:12.709155 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.709124 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-rktb8"]
Apr 17 14:09:12.712255 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:09:12.712229 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61e203fc_fb18_4d79_ace2_a4218ed4e9d9.slice/crio-bb88f3f47b19dbb2ece6ded813c1b2a0bdb40e21fcfa908dd3fa2827f662fc19 WatchSource:0}: Error finding container bb88f3f47b19dbb2ece6ded813c1b2a0bdb40e21fcfa908dd3fa2827f662fc19: Status 404 returned error can't find the container with id bb88f3f47b19dbb2ece6ded813c1b2a0bdb40e21fcfa908dd3fa2827f662fc19
Apr 17 14:09:12.812567 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.812533 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99c53420-8717-4539-b474-bdf6d9f5615a-metrics-certs\") pod \"router-default-684d7b5578-8dvb5\" (UID: \"99c53420-8717-4539-b474-bdf6d9f5615a\") " pod="openshift-ingress/router-default-684d7b5578-8dvb5"
Apr 17 14:09:12.812724 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:12.812631 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99c53420-8717-4539-b474-bdf6d9f5615a-service-ca-bundle\") pod \"router-default-684d7b5578-8dvb5\" (UID: \"99c53420-8717-4539-b474-bdf6d9f5615a\") " pod="openshift-ingress/router-default-684d7b5578-8dvb5"
Apr 17 14:09:12.812724 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:09:12.812684 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 14:09:12.812797 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:09:12.812753 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/99c53420-8717-4539-b474-bdf6d9f5615a-service-ca-bundle podName:99c53420-8717-4539-b474-bdf6d9f5615a nodeName:}" failed. No retries permitted until 2026-04-17 14:09:13.81273782 +0000 UTC m=+97.449315189 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/99c53420-8717-4539-b474-bdf6d9f5615a-service-ca-bundle") pod "router-default-684d7b5578-8dvb5" (UID: "99c53420-8717-4539-b474-bdf6d9f5615a") : configmap references non-existent config key: service-ca.crt
Apr 17 14:09:12.812797 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:09:12.812769 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99c53420-8717-4539-b474-bdf6d9f5615a-metrics-certs podName:99c53420-8717-4539-b474-bdf6d9f5615a nodeName:}" failed. No retries permitted until 2026-04-17 14:09:13.812763165 +0000 UTC m=+97.449340533 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99c53420-8717-4539-b474-bdf6d9f5615a-metrics-certs") pod "router-default-684d7b5578-8dvb5" (UID: "99c53420-8717-4539-b474-bdf6d9f5615a") : secret "router-metrics-certs-default" not found
Apr 17 14:09:13.223482 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:13.223449 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-rktb8" event={"ID":"61e203fc-fb18-4d79-ace2-a4218ed4e9d9","Type":"ContainerStarted","Data":"bb88f3f47b19dbb2ece6ded813c1b2a0bdb40e21fcfa908dd3fa2827f662fc19"}
Apr 17 14:09:13.617562 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:13.617525 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-metrics-tls\") pod \"dns-default-jscrk\" (UID: \"2742c3d7-a7f7-4525-b1be-30a78e5cec2f\") " pod="openshift-dns/dns-default-jscrk"
Apr 17 14:09:13.617754 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:13.617604 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7c15494-fd82-415f-967e-b8bf2220ef8a-cert\") pod \"ingress-canary-qf25s\" (UID: \"b7c15494-fd82-415f-967e-b8bf2220ef8a\") " pod="openshift-ingress-canary/ingress-canary-qf25s"
Apr 17 14:09:13.617754 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:09:13.617674 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 14:09:13.617754 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:09:13.617723 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 14:09:13.617754 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:09:13.617747 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-metrics-tls podName:2742c3d7-a7f7-4525-b1be-30a78e5cec2f nodeName:}" failed. No retries permitted until 2026-04-17 14:10:17.617728342 +0000 UTC m=+161.254305719 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-metrics-tls") pod "dns-default-jscrk" (UID: "2742c3d7-a7f7-4525-b1be-30a78e5cec2f") : secret "dns-default-metrics-tls" not found
Apr 17 14:09:13.617909 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:09:13.617766 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7c15494-fd82-415f-967e-b8bf2220ef8a-cert podName:b7c15494-fd82-415f-967e-b8bf2220ef8a nodeName:}" failed. No retries permitted until 2026-04-17 14:10:17.617756422 +0000 UTC m=+161.254333807 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7c15494-fd82-415f-967e-b8bf2220ef8a-cert") pod "ingress-canary-qf25s" (UID: "b7c15494-fd82-415f-967e-b8bf2220ef8a") : secret "canary-serving-cert" not found
Apr 17 14:09:13.819536 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:13.819492 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99c53420-8717-4539-b474-bdf6d9f5615a-service-ca-bundle\") pod \"router-default-684d7b5578-8dvb5\" (UID: \"99c53420-8717-4539-b474-bdf6d9f5615a\") " pod="openshift-ingress/router-default-684d7b5578-8dvb5"
Apr 17 14:09:13.819708 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:13.819543 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99c53420-8717-4539-b474-bdf6d9f5615a-metrics-certs\") pod \"router-default-684d7b5578-8dvb5\" (UID: \"99c53420-8717-4539-b474-bdf6d9f5615a\") " pod="openshift-ingress/router-default-684d7b5578-8dvb5"
Apr 17 14:09:13.819708 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:09:13.819640 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 14:09:13.819708 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:09:13.819671 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/99c53420-8717-4539-b474-bdf6d9f5615a-service-ca-bundle podName:99c53420-8717-4539-b474-bdf6d9f5615a nodeName:}" failed. No retries permitted until 2026-04-17 14:09:15.819649894 +0000 UTC m=+99.456227263 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/99c53420-8717-4539-b474-bdf6d9f5615a-service-ca-bundle") pod "router-default-684d7b5578-8dvb5" (UID: "99c53420-8717-4539-b474-bdf6d9f5615a") : configmap references non-existent config key: service-ca.crt
Apr 17 14:09:13.819708 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:09:13.819692 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99c53420-8717-4539-b474-bdf6d9f5615a-metrics-certs podName:99c53420-8717-4539-b474-bdf6d9f5615a nodeName:}" failed. No retries permitted until 2026-04-17 14:09:15.819685815 +0000 UTC m=+99.456263183 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99c53420-8717-4539-b474-bdf6d9f5615a-metrics-certs") pod "router-default-684d7b5578-8dvb5" (UID: "99c53420-8717-4539-b474-bdf6d9f5615a") : secret "router-metrics-certs-default" not found
Apr 17 14:09:15.228925 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:15.228837 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-rktb8" event={"ID":"61e203fc-fb18-4d79-ace2-a4218ed4e9d9","Type":"ContainerStarted","Data":"6bece5bfe96943af3e9047154ba7946736d5c52697a1d35b6c4bcb645b51b500"}
Apr 17 14:09:15.245464 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:15.245410 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-rktb8" podStartSLOduration=1.006183478 podStartE2EDuration="3.245396115s" podCreationTimestamp="2026-04-17 14:09:12 +0000 UTC" firstStartedPulling="2026-04-17 14:09:12.713898079 +0000 UTC m=+96.350475447" lastFinishedPulling="2026-04-17 14:09:14.953110716 +0000 UTC m=+98.589688084" observedRunningTime="2026-04-17 14:09:15.244105505 +0000 UTC m=+98.880682913" watchObservedRunningTime="2026-04-17 14:09:15.245396115 +0000 UTC m=+98.881973531"
Apr 17 14:09:15.834683 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:15.834644 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99c53420-8717-4539-b474-bdf6d9f5615a-service-ca-bundle\") pod \"router-default-684d7b5578-8dvb5\" (UID: \"99c53420-8717-4539-b474-bdf6d9f5615a\") " pod="openshift-ingress/router-default-684d7b5578-8dvb5"
Apr 17 14:09:15.834894 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:15.834694 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99c53420-8717-4539-b474-bdf6d9f5615a-metrics-certs\") pod \"router-default-684d7b5578-8dvb5\" (UID: \"99c53420-8717-4539-b474-bdf6d9f5615a\") " pod="openshift-ingress/router-default-684d7b5578-8dvb5"
Apr 17 14:09:15.834894 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:09:15.834846 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/99c53420-8717-4539-b474-bdf6d9f5615a-service-ca-bundle podName:99c53420-8717-4539-b474-bdf6d9f5615a nodeName:}" failed. No retries permitted until 2026-04-17 14:09:19.834822438 +0000 UTC m=+103.471399826 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/99c53420-8717-4539-b474-bdf6d9f5615a-service-ca-bundle") pod "router-default-684d7b5578-8dvb5" (UID: "99c53420-8717-4539-b474-bdf6d9f5615a") : configmap references non-existent config key: service-ca.crt
Apr 17 14:09:15.834894 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:09:15.834865 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 14:09:15.835055 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:09:15.834926 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99c53420-8717-4539-b474-bdf6d9f5615a-metrics-certs podName:99c53420-8717-4539-b474-bdf6d9f5615a nodeName:}" failed. No retries permitted until 2026-04-17 14:09:19.83491242 +0000 UTC m=+103.471489787 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99c53420-8717-4539-b474-bdf6d9f5615a-metrics-certs") pod "router-default-684d7b5578-8dvb5" (UID: "99c53420-8717-4539-b474-bdf6d9f5615a") : secret "router-metrics-certs-default" not found
Apr 17 14:09:17.176834 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:17.176799 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-mcl6c"
Apr 17 14:09:18.951176 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:18.951145 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ztcfx_44d13d23-0ead-4ceb-b841-467a36463db2/dns-node-resolver/0.log"
Apr 17 14:09:19.752025 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:19.751997 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mq44v_65df7e4a-6219-433f-b614-258be054188a/node-ca/0.log"
Apr 17 14:09:19.861330 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:19.861287 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99c53420-8717-4539-b474-bdf6d9f5615a-metrics-certs\") pod \"router-default-684d7b5578-8dvb5\" (UID: \"99c53420-8717-4539-b474-bdf6d9f5615a\") " pod="openshift-ingress/router-default-684d7b5578-8dvb5"
Apr 17 14:09:19.861471 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:19.861399 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99c53420-8717-4539-b474-bdf6d9f5615a-service-ca-bundle\") pod \"router-default-684d7b5578-8dvb5\" (UID: \"99c53420-8717-4539-b474-bdf6d9f5615a\") " pod="openshift-ingress/router-default-684d7b5578-8dvb5"
Apr 17 14:09:19.861471 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:09:19.861435 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 14:09:19.861600 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:09:19.861542 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99c53420-8717-4539-b474-bdf6d9f5615a-metrics-certs podName:99c53420-8717-4539-b474-bdf6d9f5615a nodeName:}" failed. No retries permitted until 2026-04-17 14:09:27.861502091 +0000 UTC m=+111.498079478 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99c53420-8717-4539-b474-bdf6d9f5615a-metrics-certs") pod "router-default-684d7b5578-8dvb5" (UID: "99c53420-8717-4539-b474-bdf6d9f5615a") : secret "router-metrics-certs-default" not found
Apr 17 14:09:19.861600 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:09:19.861559 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/99c53420-8717-4539-b474-bdf6d9f5615a-service-ca-bundle podName:99c53420-8717-4539-b474-bdf6d9f5615a nodeName:}" failed. No retries permitted until 2026-04-17 14:09:27.861551578 +0000 UTC m=+111.498128946 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/99c53420-8717-4539-b474-bdf6d9f5615a-service-ca-bundle") pod "router-default-684d7b5578-8dvb5" (UID: "99c53420-8717-4539-b474-bdf6d9f5615a") : configmap references non-existent config key: service-ca.crt
Apr 17 14:09:21.222659 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:21.222623 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67kpd"]
Apr 17 14:09:21.227725 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:21.227708 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67kpd"
Apr 17 14:09:21.229829 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:21.229802 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 17 14:09:21.229965 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:21.229880 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 17 14:09:21.230675 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:21.230659 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-hw846\""
Apr 17 14:09:21.230756 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:21.230685 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 17 14:09:21.230815 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:21.230769 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 17 14:09:21.235017 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:21.234997 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67kpd"]
Apr 17 14:09:21.272918 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:21.272883 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mp9b\" (UniqueName: \"kubernetes.io/projected/07693eac-becc-4d74-9e6b-24a018ef1f41-kube-api-access-2mp9b\") pod \"kube-storage-version-migrator-operator-6769c5d45-67kpd\" (UID: \"07693eac-becc-4d74-9e6b-24a018ef1f41\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67kpd"
Apr 17 14:09:21.273096 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:21.272945 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07693eac-becc-4d74-9e6b-24a018ef1f41-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-67kpd\" (UID: \"07693eac-becc-4d74-9e6b-24a018ef1f41\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67kpd"
Apr 17 14:09:21.273096 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:21.272985 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07693eac-becc-4d74-9e6b-24a018ef1f41-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-67kpd\" (UID: \"07693eac-becc-4d74-9e6b-24a018ef1f41\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67kpd"
Apr 17 14:09:21.373325 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:21.373286 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07693eac-becc-4d74-9e6b-24a018ef1f41-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-67kpd\" (UID: \"07693eac-becc-4d74-9e6b-24a018ef1f41\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67kpd"
Apr 17 14:09:21.373500 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:21.373388 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mp9b\" (UniqueName: \"kubernetes.io/projected/07693eac-becc-4d74-9e6b-24a018ef1f41-kube-api-access-2mp9b\") pod \"kube-storage-version-migrator-operator-6769c5d45-67kpd\" (UID: \"07693eac-becc-4d74-9e6b-24a018ef1f41\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67kpd"
Apr 17 14:09:21.373500 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:21.373431 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07693eac-becc-4d74-9e6b-24a018ef1f41-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-67kpd\" (UID: \"07693eac-becc-4d74-9e6b-24a018ef1f41\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67kpd"
Apr 17 14:09:21.373967 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:21.373946 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07693eac-becc-4d74-9e6b-24a018ef1f41-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-67kpd\" (UID: \"07693eac-becc-4d74-9e6b-24a018ef1f41\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67kpd"
Apr 17 14:09:21.375623 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:21.375602 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07693eac-becc-4d74-9e6b-24a018ef1f41-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-67kpd\" (UID: \"07693eac-becc-4d74-9e6b-24a018ef1f41\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67kpd"
Apr 17 14:09:21.380914 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:21.380888 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mp9b\" (UniqueName: \"kubernetes.io/projected/07693eac-becc-4d74-9e6b-24a018ef1f41-kube-api-access-2mp9b\") pod \"kube-storage-version-migrator-operator-6769c5d45-67kpd\" (UID: \"07693eac-becc-4d74-9e6b-24a018ef1f41\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67kpd"
Apr 17 14:09:21.537266 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:21.537190 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67kpd"
Apr 17 14:09:21.650238 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:21.650198 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67kpd"]
Apr 17 14:09:21.654680 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:09:21.654649 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07693eac_becc_4d74_9e6b_24a018ef1f41.slice/crio-7bdc2ef6e325237f89dcab160647bd1fc21ec0f81327e9a5052879d0d5f9ccfd WatchSource:0}: Error finding container 7bdc2ef6e325237f89dcab160647bd1fc21ec0f81327e9a5052879d0d5f9ccfd: Status 404 returned error can't find the container with id 7bdc2ef6e325237f89dcab160647bd1fc21ec0f81327e9a5052879d0d5f9ccfd
Apr 17 14:09:22.242257 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:22.242221 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67kpd" event={"ID":"07693eac-becc-4d74-9e6b-24a018ef1f41","Type":"ContainerStarted","Data":"7bdc2ef6e325237f89dcab160647bd1fc21ec0f81327e9a5052879d0d5f9ccfd"}
Apr 17 14:09:24.248024 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:24.247988 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67kpd" event={"ID":"07693eac-becc-4d74-9e6b-24a018ef1f41","Type":"ContainerStarted","Data":"a7975e60b1e2c3f02a8e2b2d28e6726cec71df94f194c8ddb786f87d05d66291"}
Apr 17 14:09:24.261954 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:24.261908 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67kpd" podStartSLOduration=1.2547586800000001 podStartE2EDuration="3.261895167s" podCreationTimestamp="2026-04-17 14:09:21 +0000 UTC" firstStartedPulling="2026-04-17 14:09:21.656236851 +0000 UTC m=+105.292814223" lastFinishedPulling="2026-04-17 14:09:23.663373341 +0000 UTC m=+107.299950710" observedRunningTime="2026-04-17 14:09:24.260900955 +0000 UTC m=+107.897478357" watchObservedRunningTime="2026-04-17 14:09:24.261895167 +0000 UTC m=+107.898472554"
Apr 17 14:09:24.962753 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:24.962717 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-rcf78"]
Apr 17 14:09:24.965580 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:24.965564 2568 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rcf78" Apr 17 14:09:24.967714 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:24.967692 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-9q9p8\"" Apr 17 14:09:24.971529 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:24.971491 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-rcf78"] Apr 17 14:09:24.999627 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:24.999605 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fw4q\" (UniqueName: \"kubernetes.io/projected/f907888f-168a-435b-9326-465634e93710-kube-api-access-9fw4q\") pod \"network-check-source-8894fc9bd-rcf78\" (UID: \"f907888f-168a-435b-9326-465634e93710\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rcf78" Apr 17 14:09:25.100275 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:25.100230 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9fw4q\" (UniqueName: \"kubernetes.io/projected/f907888f-168a-435b-9326-465634e93710-kube-api-access-9fw4q\") pod \"network-check-source-8894fc9bd-rcf78\" (UID: \"f907888f-168a-435b-9326-465634e93710\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rcf78" Apr 17 14:09:25.107745 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:25.107720 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fw4q\" (UniqueName: \"kubernetes.io/projected/f907888f-168a-435b-9326-465634e93710-kube-api-access-9fw4q\") pod \"network-check-source-8894fc9bd-rcf78\" (UID: \"f907888f-168a-435b-9326-465634e93710\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rcf78" Apr 17 14:09:25.274167 ip-10-0-138-158 kubenswrapper[2568]: I0417 
14:09:25.274080 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rcf78" Apr 17 14:09:25.378732 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:25.378703 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-rcf78"] Apr 17 14:09:25.382417 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:09:25.382387 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf907888f_168a_435b_9326_465634e93710.slice/crio-83786d5d6eef996ec390956831e7038ed5744099e99b3343583fa188c2dd0439 WatchSource:0}: Error finding container 83786d5d6eef996ec390956831e7038ed5744099e99b3343583fa188c2dd0439: Status 404 returned error can't find the container with id 83786d5d6eef996ec390956831e7038ed5744099e99b3343583fa188c2dd0439 Apr 17 14:09:26.253216 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:26.253180 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rcf78" event={"ID":"f907888f-168a-435b-9326-465634e93710","Type":"ContainerStarted","Data":"20b61c00daf0fda2b0e8b927b113696a876a658b834f194b4f16839a8ceec3cc"} Apr 17 14:09:26.253216 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:26.253218 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rcf78" event={"ID":"f907888f-168a-435b-9326-465634e93710","Type":"ContainerStarted","Data":"83786d5d6eef996ec390956831e7038ed5744099e99b3343583fa188c2dd0439"} Apr 17 14:09:26.274277 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:26.274223 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rcf78" podStartSLOduration=2.274206341 podStartE2EDuration="2.274206341s" podCreationTimestamp="2026-04-17 14:09:24 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:09:26.272993361 +0000 UTC m=+109.909570750" watchObservedRunningTime="2026-04-17 14:09:26.274206341 +0000 UTC m=+109.910783732" Apr 17 14:09:27.878028 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:27.877998 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-2zqlc"] Apr 17 14:09:27.880885 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:27.880869 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-2zqlc" Apr 17 14:09:27.883007 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:27.882982 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 17 14:09:27.883123 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:27.882987 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 17 14:09:27.884008 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:27.883983 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 17 14:09:27.884382 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:27.884012 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 17 14:09:27.884658 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:27.884077 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-zbgbs\"" Apr 17 14:09:27.889404 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:27.889386 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-2zqlc"] Apr 17 14:09:27.922025 ip-10-0-138-158 kubenswrapper[2568]: I0417 
14:09:27.921998 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99c53420-8717-4539-b474-bdf6d9f5615a-service-ca-bundle\") pod \"router-default-684d7b5578-8dvb5\" (UID: \"99c53420-8717-4539-b474-bdf6d9f5615a\") " pod="openshift-ingress/router-default-684d7b5578-8dvb5" Apr 17 14:09:27.922125 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:27.922036 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/40d81942-12f4-4f8a-8842-de8e2a878a0a-signing-key\") pod \"service-ca-865cb79987-2zqlc\" (UID: \"40d81942-12f4-4f8a-8842-de8e2a878a0a\") " pod="openshift-service-ca/service-ca-865cb79987-2zqlc" Apr 17 14:09:27.922125 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:27.922054 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbg8z\" (UniqueName: \"kubernetes.io/projected/40d81942-12f4-4f8a-8842-de8e2a878a0a-kube-api-access-bbg8z\") pod \"service-ca-865cb79987-2zqlc\" (UID: \"40d81942-12f4-4f8a-8842-de8e2a878a0a\") " pod="openshift-service-ca/service-ca-865cb79987-2zqlc" Apr 17 14:09:27.922125 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:27.922078 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99c53420-8717-4539-b474-bdf6d9f5615a-metrics-certs\") pod \"router-default-684d7b5578-8dvb5\" (UID: \"99c53420-8717-4539-b474-bdf6d9f5615a\") " pod="openshift-ingress/router-default-684d7b5578-8dvb5" Apr 17 14:09:27.922125 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:27.922101 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/40d81942-12f4-4f8a-8842-de8e2a878a0a-signing-cabundle\") pod \"service-ca-865cb79987-2zqlc\" (UID: 
\"40d81942-12f4-4f8a-8842-de8e2a878a0a\") " pod="openshift-service-ca/service-ca-865cb79987-2zqlc" Apr 17 14:09:27.922252 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:09:27.922163 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 14:09:27.922252 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:09:27.922214 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99c53420-8717-4539-b474-bdf6d9f5615a-metrics-certs podName:99c53420-8717-4539-b474-bdf6d9f5615a nodeName:}" failed. No retries permitted until 2026-04-17 14:09:43.922199767 +0000 UTC m=+127.558777136 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99c53420-8717-4539-b474-bdf6d9f5615a-metrics-certs") pod "router-default-684d7b5578-8dvb5" (UID: "99c53420-8717-4539-b474-bdf6d9f5615a") : secret "router-metrics-certs-default" not found Apr 17 14:09:27.922252 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:09:27.922228 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/99c53420-8717-4539-b474-bdf6d9f5615a-service-ca-bundle podName:99c53420-8717-4539-b474-bdf6d9f5615a nodeName:}" failed. No retries permitted until 2026-04-17 14:09:43.922220633 +0000 UTC m=+127.558798001 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/99c53420-8717-4539-b474-bdf6d9f5615a-service-ca-bundle") pod "router-default-684d7b5578-8dvb5" (UID: "99c53420-8717-4539-b474-bdf6d9f5615a") : configmap references non-existent config key: service-ca.crt Apr 17 14:09:28.023101 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:28.023058 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/40d81942-12f4-4f8a-8842-de8e2a878a0a-signing-cabundle\") pod \"service-ca-865cb79987-2zqlc\" (UID: \"40d81942-12f4-4f8a-8842-de8e2a878a0a\") " pod="openshift-service-ca/service-ca-865cb79987-2zqlc" Apr 17 14:09:28.023277 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:28.023157 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/40d81942-12f4-4f8a-8842-de8e2a878a0a-signing-key\") pod \"service-ca-865cb79987-2zqlc\" (UID: \"40d81942-12f4-4f8a-8842-de8e2a878a0a\") " pod="openshift-service-ca/service-ca-865cb79987-2zqlc" Apr 17 14:09:28.023277 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:28.023175 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bbg8z\" (UniqueName: \"kubernetes.io/projected/40d81942-12f4-4f8a-8842-de8e2a878a0a-kube-api-access-bbg8z\") pod \"service-ca-865cb79987-2zqlc\" (UID: \"40d81942-12f4-4f8a-8842-de8e2a878a0a\") " pod="openshift-service-ca/service-ca-865cb79987-2zqlc" Apr 17 14:09:28.023724 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:28.023689 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/40d81942-12f4-4f8a-8842-de8e2a878a0a-signing-cabundle\") pod \"service-ca-865cb79987-2zqlc\" (UID: \"40d81942-12f4-4f8a-8842-de8e2a878a0a\") " pod="openshift-service-ca/service-ca-865cb79987-2zqlc" Apr 17 
14:09:28.025540 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:28.025517 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/40d81942-12f4-4f8a-8842-de8e2a878a0a-signing-key\") pod \"service-ca-865cb79987-2zqlc\" (UID: \"40d81942-12f4-4f8a-8842-de8e2a878a0a\") " pod="openshift-service-ca/service-ca-865cb79987-2zqlc" Apr 17 14:09:28.033897 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:28.033866 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbg8z\" (UniqueName: \"kubernetes.io/projected/40d81942-12f4-4f8a-8842-de8e2a878a0a-kube-api-access-bbg8z\") pod \"service-ca-865cb79987-2zqlc\" (UID: \"40d81942-12f4-4f8a-8842-de8e2a878a0a\") " pod="openshift-service-ca/service-ca-865cb79987-2zqlc" Apr 17 14:09:28.191226 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:28.191194 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-2zqlc" Apr 17 14:09:28.300392 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:28.300358 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-2zqlc"] Apr 17 14:09:28.304037 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:09:28.304007 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40d81942_12f4_4f8a_8842_de8e2a878a0a.slice/crio-30c21506589f9388a1b28829fd7f4322ad0b5cdeca9ccfe761c1dc0bd3f79670 WatchSource:0}: Error finding container 30c21506589f9388a1b28829fd7f4322ad0b5cdeca9ccfe761c1dc0bd3f79670: Status 404 returned error can't find the container with id 30c21506589f9388a1b28829fd7f4322ad0b5cdeca9ccfe761c1dc0bd3f79670 Apr 17 14:09:29.259765 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:29.259728 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-2zqlc" 
event={"ID":"40d81942-12f4-4f8a-8842-de8e2a878a0a","Type":"ContainerStarted","Data":"30c21506589f9388a1b28829fd7f4322ad0b5cdeca9ccfe761c1dc0bd3f79670"} Apr 17 14:09:30.263415 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:30.263330 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-2zqlc" event={"ID":"40d81942-12f4-4f8a-8842-de8e2a878a0a","Type":"ContainerStarted","Data":"8047be743f12d9d6104ab278ab16a4b397f58fc69ff232516f0a1ff3d77ba712"} Apr 17 14:09:30.277004 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:30.276941 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-2zqlc" podStartSLOduration=1.645527094 podStartE2EDuration="3.27692366s" podCreationTimestamp="2026-04-17 14:09:27 +0000 UTC" firstStartedPulling="2026-04-17 14:09:28.305638834 +0000 UTC m=+111.942216203" lastFinishedPulling="2026-04-17 14:09:29.937035401 +0000 UTC m=+113.573612769" observedRunningTime="2026-04-17 14:09:30.27625888 +0000 UTC m=+113.912836271" watchObservedRunningTime="2026-04-17 14:09:30.27692366 +0000 UTC m=+113.913501052" Apr 17 14:09:43.949694 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:43.949636 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99c53420-8717-4539-b474-bdf6d9f5615a-metrics-certs\") pod \"router-default-684d7b5578-8dvb5\" (UID: \"99c53420-8717-4539-b474-bdf6d9f5615a\") " pod="openshift-ingress/router-default-684d7b5578-8dvb5" Apr 17 14:09:43.950108 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:43.949737 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99c53420-8717-4539-b474-bdf6d9f5615a-service-ca-bundle\") pod \"router-default-684d7b5578-8dvb5\" (UID: \"99c53420-8717-4539-b474-bdf6d9f5615a\") " pod="openshift-ingress/router-default-684d7b5578-8dvb5" Apr 17 
14:09:43.950789 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:43.950770 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99c53420-8717-4539-b474-bdf6d9f5615a-service-ca-bundle\") pod \"router-default-684d7b5578-8dvb5\" (UID: \"99c53420-8717-4539-b474-bdf6d9f5615a\") " pod="openshift-ingress/router-default-684d7b5578-8dvb5" Apr 17 14:09:43.951986 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:43.951967 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99c53420-8717-4539-b474-bdf6d9f5615a-metrics-certs\") pod \"router-default-684d7b5578-8dvb5\" (UID: \"99c53420-8717-4539-b474-bdf6d9f5615a\") " pod="openshift-ingress/router-default-684d7b5578-8dvb5" Apr 17 14:09:43.986209 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:43.986185 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-4crr7\"" Apr 17 14:09:43.994885 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:43.994866 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-684d7b5578-8dvb5" Apr 17 14:09:44.104253 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:44.104222 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-684d7b5578-8dvb5"] Apr 17 14:09:44.107342 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:09:44.107311 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99c53420_8717_4539_b474_bdf6d9f5615a.slice/crio-53c06d8be788ff62e491b4bab123b7052f0ed4d34e4cc19598c3ca7642da046b WatchSource:0}: Error finding container 53c06d8be788ff62e491b4bab123b7052f0ed4d34e4cc19598c3ca7642da046b: Status 404 returned error can't find the container with id 53c06d8be788ff62e491b4bab123b7052f0ed4d34e4cc19598c3ca7642da046b Apr 17 14:09:44.293060 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:44.292974 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-684d7b5578-8dvb5" event={"ID":"99c53420-8717-4539-b474-bdf6d9f5615a","Type":"ContainerStarted","Data":"7efd6ebda3483b1bbe3c6ae1e8ee631c098d787d3fb19f56a337815ce78150f7"} Apr 17 14:09:44.293060 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:44.293021 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-684d7b5578-8dvb5" event={"ID":"99c53420-8717-4539-b474-bdf6d9f5615a","Type":"ContainerStarted","Data":"53c06d8be788ff62e491b4bab123b7052f0ed4d34e4cc19598c3ca7642da046b"} Apr 17 14:09:44.313402 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:44.310370 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-684d7b5578-8dvb5" podStartSLOduration=32.310354037 podStartE2EDuration="32.310354037s" podCreationTimestamp="2026-04-17 14:09:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 
14:09:44.309308934 +0000 UTC m=+127.945886323" watchObservedRunningTime="2026-04-17 14:09:44.310354037 +0000 UTC m=+127.946931428" Apr 17 14:09:44.995180 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:44.995146 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-684d7b5578-8dvb5" Apr 17 14:09:44.997349 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:44.997323 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-684d7b5578-8dvb5" Apr 17 14:09:45.296152 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:45.296066 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-684d7b5578-8dvb5" Apr 17 14:09:45.297269 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:45.297249 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-684d7b5578-8dvb5" Apr 17 14:09:46.670157 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:46.670105 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/497bbc82-edab-4d97-bcc8-7d428e62da1e-metrics-certs\") pod \"network-metrics-daemon-f6d89\" (UID: \"497bbc82-edab-4d97-bcc8-7d428e62da1e\") " pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:09:46.672545 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:46.672519 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/497bbc82-edab-4d97-bcc8-7d428e62da1e-metrics-certs\") pod \"network-metrics-daemon-f6d89\" (UID: \"497bbc82-edab-4d97-bcc8-7d428e62da1e\") " pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:09:46.958663 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:46.958588 2568 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bfspf\"" Apr 17 14:09:46.967263 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:46.967246 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6d89" Apr 17 14:09:47.078672 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:47.078635 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-f6d89"] Apr 17 14:09:47.081871 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:09:47.081841 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod497bbc82_edab_4d97_bcc8_7d428e62da1e.slice/crio-f6e3418be3335df5ab25ec76b192edcb30ff143c8de0f9c4deb622cae48410bc WatchSource:0}: Error finding container f6e3418be3335df5ab25ec76b192edcb30ff143c8de0f9c4deb622cae48410bc: Status 404 returned error can't find the container with id f6e3418be3335df5ab25ec76b192edcb30ff143c8de0f9c4deb622cae48410bc Apr 17 14:09:47.302333 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:47.302247 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f6d89" event={"ID":"497bbc82-edab-4d97-bcc8-7d428e62da1e","Type":"ContainerStarted","Data":"f6e3418be3335df5ab25ec76b192edcb30ff143c8de0f9c4deb622cae48410bc"} Apr 17 14:09:48.306837 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.306795 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f6d89" event={"ID":"497bbc82-edab-4d97-bcc8-7d428e62da1e","Type":"ContainerStarted","Data":"e04af1782aaf3760048b8ecc79b8ac526f734bee4e81cb795d363ab98446cee1"} Apr 17 14:09:48.306837 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.306835 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f6d89" 
event={"ID":"497bbc82-edab-4d97-bcc8-7d428e62da1e","Type":"ContainerStarted","Data":"3d83b4340b49c501c0e7a2db2a92e8394aba53b878e4cf4613787621e987d544"} Apr 17 14:09:48.347916 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.347868 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-f6d89" podStartSLOduration=130.429519438 podStartE2EDuration="2m11.34784987s" podCreationTimestamp="2026-04-17 14:07:37 +0000 UTC" firstStartedPulling="2026-04-17 14:09:47.08358767 +0000 UTC m=+130.720165038" lastFinishedPulling="2026-04-17 14:09:48.001918089 +0000 UTC m=+131.638495470" observedRunningTime="2026-04-17 14:09:48.32302598 +0000 UTC m=+131.959603371" watchObservedRunningTime="2026-04-17 14:09:48.34784987 +0000 UTC m=+131.984427312" Apr 17 14:09:48.348129 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.348109 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-nh4td"] Apr 17 14:09:48.351048 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.351032 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-jb426"] Apr 17 14:09:48.351176 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.351163 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-nh4td" Apr 17 14:09:48.353313 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.353290 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 17 14:09:48.353464 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.353322 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-97gbr\"" Apr 17 14:09:48.353995 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.353969 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-jb426" Apr 17 14:09:48.355943 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.355923 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 14:09:48.356042 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.355968 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 14:09:48.356042 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.356008 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-26szn\"" Apr 17 14:09:48.358123 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.358105 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-nh4td"] Apr 17 14:09:48.361219 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.361201 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-jb426"] Apr 17 14:09:48.462269 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.462242 2568 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/image-registry-76bbf49c4d-nw9lh"] Apr 17 14:09:48.465028 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.465012 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" Apr 17 14:09:48.467621 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.467601 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-cs8cq\"" Apr 17 14:09:48.467728 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.467650 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 14:09:48.467789 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.467745 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-gtrf5"] Apr 17 14:09:48.467830 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.467798 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 14:09:48.467875 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.467854 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 14:09:48.471177 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.471159 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gtrf5" Apr 17 14:09:48.474821 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.474804 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-cl66l\"" Apr 17 14:09:48.475621 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.475604 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 14:09:48.477167 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.477148 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 14:09:48.480747 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.480728 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 14:09:48.481029 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.481014 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e7f4ed3e-b7f5-441d-8c8a-22a8c99ecedf-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-nh4td\" (UID: \"e7f4ed3e-b7f5-441d-8c8a-22a8c99ecedf\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-nh4td" Apr 17 14:09:48.481083 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.481049 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmrq6\" (UniqueName: \"kubernetes.io/projected/2303c9e7-fe82-4eab-9edd-fd7c86291690-kube-api-access-kmrq6\") pod \"downloads-6bcc868b7-jb426\" (UID: \"2303c9e7-fe82-4eab-9edd-fd7c86291690\") " pod="openshift-console/downloads-6bcc868b7-jb426" Apr 17 14:09:48.489876 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.489853 2568 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["openshift-image-registry/image-registry-76bbf49c4d-nw9lh"] Apr 17 14:09:48.490591 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.490575 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gtrf5"] Apr 17 14:09:48.581572 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.581478 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e7f4ed3e-b7f5-441d-8c8a-22a8c99ecedf-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-nh4td\" (UID: \"e7f4ed3e-b7f5-441d-8c8a-22a8c99ecedf\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-nh4td" Apr 17 14:09:48.581572 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.581533 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f7c2a2bb-53b7-4cf4-bec4-e515ccc57619-bound-sa-token\") pod \"image-registry-76bbf49c4d-nw9lh\" (UID: \"f7c2a2bb-53b7-4cf4-bec4-e515ccc57619\") " pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" Apr 17 14:09:48.581572 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.581557 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgrg8\" (UniqueName: \"kubernetes.io/projected/f7c2a2bb-53b7-4cf4-bec4-e515ccc57619-kube-api-access-vgrg8\") pod \"image-registry-76bbf49c4d-nw9lh\" (UID: \"f7c2a2bb-53b7-4cf4-bec4-e515ccc57619\") " pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" Apr 17 14:09:48.581809 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.581575 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/f7c2a2bb-53b7-4cf4-bec4-e515ccc57619-image-registry-private-configuration\") pod \"image-registry-76bbf49c4d-nw9lh\" (UID: \"f7c2a2bb-53b7-4cf4-bec4-e515ccc57619\") " pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" Apr 17 14:09:48.581809 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.581607 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kmrq6\" (UniqueName: \"kubernetes.io/projected/2303c9e7-fe82-4eab-9edd-fd7c86291690-kube-api-access-kmrq6\") pod \"downloads-6bcc868b7-jb426\" (UID: \"2303c9e7-fe82-4eab-9edd-fd7c86291690\") " pod="openshift-console/downloads-6bcc868b7-jb426" Apr 17 14:09:48.581809 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.581631 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2e8ddfff-998b-421c-9ea9-12e88bb8506b-data-volume\") pod \"insights-runtime-extractor-gtrf5\" (UID: \"2e8ddfff-998b-421c-9ea9-12e88bb8506b\") " pod="openshift-insights/insights-runtime-extractor-gtrf5" Apr 17 14:09:48.581809 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.581675 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2e8ddfff-998b-421c-9ea9-12e88bb8506b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gtrf5\" (UID: \"2e8ddfff-998b-421c-9ea9-12e88bb8506b\") " pod="openshift-insights/insights-runtime-extractor-gtrf5" Apr 17 14:09:48.581809 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.581778 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f7c2a2bb-53b7-4cf4-bec4-e515ccc57619-registry-tls\") pod \"image-registry-76bbf49c4d-nw9lh\" (UID: \"f7c2a2bb-53b7-4cf4-bec4-e515ccc57619\") " 
pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" Apr 17 14:09:48.581809 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.581807 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f7c2a2bb-53b7-4cf4-bec4-e515ccc57619-ca-trust-extracted\") pod \"image-registry-76bbf49c4d-nw9lh\" (UID: \"f7c2a2bb-53b7-4cf4-bec4-e515ccc57619\") " pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" Apr 17 14:09:48.582089 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.581834 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2e8ddfff-998b-421c-9ea9-12e88bb8506b-crio-socket\") pod \"insights-runtime-extractor-gtrf5\" (UID: \"2e8ddfff-998b-421c-9ea9-12e88bb8506b\") " pod="openshift-insights/insights-runtime-extractor-gtrf5" Apr 17 14:09:48.582089 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.581860 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxrzd\" (UniqueName: \"kubernetes.io/projected/2e8ddfff-998b-421c-9ea9-12e88bb8506b-kube-api-access-jxrzd\") pod \"insights-runtime-extractor-gtrf5\" (UID: \"2e8ddfff-998b-421c-9ea9-12e88bb8506b\") " pod="openshift-insights/insights-runtime-extractor-gtrf5" Apr 17 14:09:48.582089 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.581880 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2e8ddfff-998b-421c-9ea9-12e88bb8506b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gtrf5\" (UID: \"2e8ddfff-998b-421c-9ea9-12e88bb8506b\") " pod="openshift-insights/insights-runtime-extractor-gtrf5" Apr 17 14:09:48.582089 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.581901 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f7c2a2bb-53b7-4cf4-bec4-e515ccc57619-registry-certificates\") pod \"image-registry-76bbf49c4d-nw9lh\" (UID: \"f7c2a2bb-53b7-4cf4-bec4-e515ccc57619\") " pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" Apr 17 14:09:48.582089 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.581926 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f7c2a2bb-53b7-4cf4-bec4-e515ccc57619-trusted-ca\") pod \"image-registry-76bbf49c4d-nw9lh\" (UID: \"f7c2a2bb-53b7-4cf4-bec4-e515ccc57619\") " pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" Apr 17 14:09:48.582089 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.582000 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f7c2a2bb-53b7-4cf4-bec4-e515ccc57619-installation-pull-secrets\") pod \"image-registry-76bbf49c4d-nw9lh\" (UID: \"f7c2a2bb-53b7-4cf4-bec4-e515ccc57619\") " pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" Apr 17 14:09:48.583935 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.583913 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e7f4ed3e-b7f5-441d-8c8a-22a8c99ecedf-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-nh4td\" (UID: \"e7f4ed3e-b7f5-441d-8c8a-22a8c99ecedf\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-nh4td" Apr 17 14:09:48.589432 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.589412 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmrq6\" (UniqueName: 
\"kubernetes.io/projected/2303c9e7-fe82-4eab-9edd-fd7c86291690-kube-api-access-kmrq6\") pod \"downloads-6bcc868b7-jb426\" (UID: \"2303c9e7-fe82-4eab-9edd-fd7c86291690\") " pod="openshift-console/downloads-6bcc868b7-jb426" Apr 17 14:09:48.663140 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.663115 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-nh4td" Apr 17 14:09:48.667837 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.667813 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-jb426" Apr 17 14:09:48.683396 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.683372 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxrzd\" (UniqueName: \"kubernetes.io/projected/2e8ddfff-998b-421c-9ea9-12e88bb8506b-kube-api-access-jxrzd\") pod \"insights-runtime-extractor-gtrf5\" (UID: \"2e8ddfff-998b-421c-9ea9-12e88bb8506b\") " pod="openshift-insights/insights-runtime-extractor-gtrf5" Apr 17 14:09:48.683553 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.683406 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2e8ddfff-998b-421c-9ea9-12e88bb8506b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gtrf5\" (UID: \"2e8ddfff-998b-421c-9ea9-12e88bb8506b\") " pod="openshift-insights/insights-runtime-extractor-gtrf5" Apr 17 14:09:48.683553 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.683426 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f7c2a2bb-53b7-4cf4-bec4-e515ccc57619-registry-certificates\") pod \"image-registry-76bbf49c4d-nw9lh\" (UID: \"f7c2a2bb-53b7-4cf4-bec4-e515ccc57619\") " 
pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" Apr 17 14:09:48.683553 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.683452 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f7c2a2bb-53b7-4cf4-bec4-e515ccc57619-trusted-ca\") pod \"image-registry-76bbf49c4d-nw9lh\" (UID: \"f7c2a2bb-53b7-4cf4-bec4-e515ccc57619\") " pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" Apr 17 14:09:48.683553 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.683496 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f7c2a2bb-53b7-4cf4-bec4-e515ccc57619-installation-pull-secrets\") pod \"image-registry-76bbf49c4d-nw9lh\" (UID: \"f7c2a2bb-53b7-4cf4-bec4-e515ccc57619\") " pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" Apr 17 14:09:48.683774 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.683566 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f7c2a2bb-53b7-4cf4-bec4-e515ccc57619-bound-sa-token\") pod \"image-registry-76bbf49c4d-nw9lh\" (UID: \"f7c2a2bb-53b7-4cf4-bec4-e515ccc57619\") " pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" Apr 17 14:09:48.683774 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.683595 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgrg8\" (UniqueName: \"kubernetes.io/projected/f7c2a2bb-53b7-4cf4-bec4-e515ccc57619-kube-api-access-vgrg8\") pod \"image-registry-76bbf49c4d-nw9lh\" (UID: \"f7c2a2bb-53b7-4cf4-bec4-e515ccc57619\") " pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" Apr 17 14:09:48.683774 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.683625 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f7c2a2bb-53b7-4cf4-bec4-e515ccc57619-image-registry-private-configuration\") pod \"image-registry-76bbf49c4d-nw9lh\" (UID: \"f7c2a2bb-53b7-4cf4-bec4-e515ccc57619\") " pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" Apr 17 14:09:48.683774 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.683663 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2e8ddfff-998b-421c-9ea9-12e88bb8506b-data-volume\") pod \"insights-runtime-extractor-gtrf5\" (UID: \"2e8ddfff-998b-421c-9ea9-12e88bb8506b\") " pod="openshift-insights/insights-runtime-extractor-gtrf5" Apr 17 14:09:48.683774 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.683713 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2e8ddfff-998b-421c-9ea9-12e88bb8506b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gtrf5\" (UID: \"2e8ddfff-998b-421c-9ea9-12e88bb8506b\") " pod="openshift-insights/insights-runtime-extractor-gtrf5" Apr 17 14:09:48.684023 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.683758 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f7c2a2bb-53b7-4cf4-bec4-e515ccc57619-registry-tls\") pod \"image-registry-76bbf49c4d-nw9lh\" (UID: \"f7c2a2bb-53b7-4cf4-bec4-e515ccc57619\") " pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" Apr 17 14:09:48.684023 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.683896 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f7c2a2bb-53b7-4cf4-bec4-e515ccc57619-ca-trust-extracted\") pod \"image-registry-76bbf49c4d-nw9lh\" (UID: \"f7c2a2bb-53b7-4cf4-bec4-e515ccc57619\") " 
pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" Apr 17 14:09:48.684023 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.683921 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2e8ddfff-998b-421c-9ea9-12e88bb8506b-crio-socket\") pod \"insights-runtime-extractor-gtrf5\" (UID: \"2e8ddfff-998b-421c-9ea9-12e88bb8506b\") " pod="openshift-insights/insights-runtime-extractor-gtrf5" Apr 17 14:09:48.684166 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.684022 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2e8ddfff-998b-421c-9ea9-12e88bb8506b-crio-socket\") pod \"insights-runtime-extractor-gtrf5\" (UID: \"2e8ddfff-998b-421c-9ea9-12e88bb8506b\") " pod="openshift-insights/insights-runtime-extractor-gtrf5" Apr 17 14:09:48.684166 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.684067 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2e8ddfff-998b-421c-9ea9-12e88bb8506b-data-volume\") pod \"insights-runtime-extractor-gtrf5\" (UID: \"2e8ddfff-998b-421c-9ea9-12e88bb8506b\") " pod="openshift-insights/insights-runtime-extractor-gtrf5" Apr 17 14:09:48.684911 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.684483 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2e8ddfff-998b-421c-9ea9-12e88bb8506b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gtrf5\" (UID: \"2e8ddfff-998b-421c-9ea9-12e88bb8506b\") " pod="openshift-insights/insights-runtime-extractor-gtrf5" Apr 17 14:09:48.684911 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.684549 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/f7c2a2bb-53b7-4cf4-bec4-e515ccc57619-ca-trust-extracted\") pod \"image-registry-76bbf49c4d-nw9lh\" (UID: \"f7c2a2bb-53b7-4cf4-bec4-e515ccc57619\") " pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" Apr 17 14:09:48.685452 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.685414 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f7c2a2bb-53b7-4cf4-bec4-e515ccc57619-trusted-ca\") pod \"image-registry-76bbf49c4d-nw9lh\" (UID: \"f7c2a2bb-53b7-4cf4-bec4-e515ccc57619\") " pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" Apr 17 14:09:48.685830 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.685774 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f7c2a2bb-53b7-4cf4-bec4-e515ccc57619-registry-certificates\") pod \"image-registry-76bbf49c4d-nw9lh\" (UID: \"f7c2a2bb-53b7-4cf4-bec4-e515ccc57619\") " pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" Apr 17 14:09:48.686659 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.686641 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f7c2a2bb-53b7-4cf4-bec4-e515ccc57619-installation-pull-secrets\") pod \"image-registry-76bbf49c4d-nw9lh\" (UID: \"f7c2a2bb-53b7-4cf4-bec4-e515ccc57619\") " pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" Apr 17 14:09:48.687264 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.687202 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2e8ddfff-998b-421c-9ea9-12e88bb8506b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gtrf5\" (UID: \"2e8ddfff-998b-421c-9ea9-12e88bb8506b\") " pod="openshift-insights/insights-runtime-extractor-gtrf5" Apr 17 14:09:48.687440 
ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.687424 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f7c2a2bb-53b7-4cf4-bec4-e515ccc57619-registry-tls\") pod \"image-registry-76bbf49c4d-nw9lh\" (UID: \"f7c2a2bb-53b7-4cf4-bec4-e515ccc57619\") " pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" Apr 17 14:09:48.687489 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.687430 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f7c2a2bb-53b7-4cf4-bec4-e515ccc57619-image-registry-private-configuration\") pod \"image-registry-76bbf49c4d-nw9lh\" (UID: \"f7c2a2bb-53b7-4cf4-bec4-e515ccc57619\") " pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" Apr 17 14:09:48.691829 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.691806 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f7c2a2bb-53b7-4cf4-bec4-e515ccc57619-bound-sa-token\") pod \"image-registry-76bbf49c4d-nw9lh\" (UID: \"f7c2a2bb-53b7-4cf4-bec4-e515ccc57619\") " pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" Apr 17 14:09:48.692466 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.692446 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgrg8\" (UniqueName: \"kubernetes.io/projected/f7c2a2bb-53b7-4cf4-bec4-e515ccc57619-kube-api-access-vgrg8\") pod \"image-registry-76bbf49c4d-nw9lh\" (UID: \"f7c2a2bb-53b7-4cf4-bec4-e515ccc57619\") " pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" Apr 17 14:09:48.692649 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.692635 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxrzd\" (UniqueName: \"kubernetes.io/projected/2e8ddfff-998b-421c-9ea9-12e88bb8506b-kube-api-access-jxrzd\") pod 
\"insights-runtime-extractor-gtrf5\" (UID: \"2e8ddfff-998b-421c-9ea9-12e88bb8506b\") " pod="openshift-insights/insights-runtime-extractor-gtrf5" Apr 17 14:09:48.774877 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.774848 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" Apr 17 14:09:48.781688 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.781263 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gtrf5" Apr 17 14:09:48.781875 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.781832 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-nh4td"] Apr 17 14:09:48.788367 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:09:48.788339 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7f4ed3e_b7f5_441d_8c8a_22a8c99ecedf.slice/crio-c5097167c81dc8cc108c1605e68cd5d8236f0a16fe0804d44e6526bbdad56e01 WatchSource:0}: Error finding container c5097167c81dc8cc108c1605e68cd5d8236f0a16fe0804d44e6526bbdad56e01: Status 404 returned error can't find the container with id c5097167c81dc8cc108c1605e68cd5d8236f0a16fe0804d44e6526bbdad56e01 Apr 17 14:09:48.806438 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.806412 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-jb426"] Apr 17 14:09:48.810010 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:09:48.809983 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2303c9e7_fe82_4eab_9edd_fd7c86291690.slice/crio-31f9d6f9bae926d0c70084b6e6f249d6183aaf651c65694ff6fdbc767f6906ec WatchSource:0}: Error finding container 31f9d6f9bae926d0c70084b6e6f249d6183aaf651c65694ff6fdbc767f6906ec: Status 404 
returned error can't find the container with id 31f9d6f9bae926d0c70084b6e6f249d6183aaf651c65694ff6fdbc767f6906ec Apr 17 14:09:48.902578 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.902496 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-76bbf49c4d-nw9lh"] Apr 17 14:09:48.904970 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:09:48.904943 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7c2a2bb_53b7_4cf4_bec4_e515ccc57619.slice/crio-4164b250c378c705a1f95e92894e85019112c28299a531a5aa522e17ca6cf105 WatchSource:0}: Error finding container 4164b250c378c705a1f95e92894e85019112c28299a531a5aa522e17ca6cf105: Status 404 returned error can't find the container with id 4164b250c378c705a1f95e92894e85019112c28299a531a5aa522e17ca6cf105 Apr 17 14:09:48.919866 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:48.919841 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gtrf5"] Apr 17 14:09:48.922647 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:09:48.922623 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e8ddfff_998b_421c_9ea9_12e88bb8506b.slice/crio-61e35bec391ed62528467a563067158b8d261c5a7f90e0500526ffc271ac494e WatchSource:0}: Error finding container 61e35bec391ed62528467a563067158b8d261c5a7f90e0500526ffc271ac494e: Status 404 returned error can't find the container with id 61e35bec391ed62528467a563067158b8d261c5a7f90e0500526ffc271ac494e Apr 17 14:09:49.312132 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:49.312090 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gtrf5" event={"ID":"2e8ddfff-998b-421c-9ea9-12e88bb8506b","Type":"ContainerStarted","Data":"268b19469e5667be8f49ac2829b400b160f71c12abc067d6b328ddcb6d7fdf32"} Apr 17 14:09:49.312549 
ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:49.312142 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gtrf5" event={"ID":"2e8ddfff-998b-421c-9ea9-12e88bb8506b","Type":"ContainerStarted","Data":"61e35bec391ed62528467a563067158b8d261c5a7f90e0500526ffc271ac494e"} Apr 17 14:09:49.313698 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:49.313645 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" event={"ID":"f7c2a2bb-53b7-4cf4-bec4-e515ccc57619","Type":"ContainerStarted","Data":"848fe789c1c3980581b42f0295532c65ad3c620c1e4d8708b20784592f8c8935"} Apr 17 14:09:49.313698 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:49.313688 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" event={"ID":"f7c2a2bb-53b7-4cf4-bec4-e515ccc57619","Type":"ContainerStarted","Data":"4164b250c378c705a1f95e92894e85019112c28299a531a5aa522e17ca6cf105"} Apr 17 14:09:49.313885 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:49.313788 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" Apr 17 14:09:49.315429 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:49.315400 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-jb426" event={"ID":"2303c9e7-fe82-4eab-9edd-fd7c86291690","Type":"ContainerStarted","Data":"31f9d6f9bae926d0c70084b6e6f249d6183aaf651c65694ff6fdbc767f6906ec"} Apr 17 14:09:49.317102 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:49.317076 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-nh4td" event={"ID":"e7f4ed3e-b7f5-441d-8c8a-22a8c99ecedf","Type":"ContainerStarted","Data":"c5097167c81dc8cc108c1605e68cd5d8236f0a16fe0804d44e6526bbdad56e01"} Apr 17 14:09:49.331257 ip-10-0-138-158 
kubenswrapper[2568]: I0417 14:09:49.331167 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" podStartSLOduration=1.331149705 podStartE2EDuration="1.331149705s" podCreationTimestamp="2026-04-17 14:09:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:09:49.330437601 +0000 UTC m=+132.967014992" watchObservedRunningTime="2026-04-17 14:09:49.331149705 +0000 UTC m=+132.967727096" Apr 17 14:09:50.321645 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:50.321548 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-nh4td" event={"ID":"e7f4ed3e-b7f5-441d-8c8a-22a8c99ecedf","Type":"ContainerStarted","Data":"7ef89efcd5f655d80725a915ff7c5aae30e87aaa7b4009a1a40db4f17b0c8461"} Apr 17 14:09:50.322097 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:50.321744 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-nh4td" Apr 17 14:09:50.323910 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:50.323882 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gtrf5" event={"ID":"2e8ddfff-998b-421c-9ea9-12e88bb8506b","Type":"ContainerStarted","Data":"16d733a85e83bd2663d3651371b02dd76b5a32dd8388136542047784df3a3cca"} Apr 17 14:09:50.327188 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:50.327162 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-nh4td" Apr 17 14:09:50.350104 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:50.350050 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-nh4td" podStartSLOduration=1.191037156 podStartE2EDuration="2.350033129s" podCreationTimestamp="2026-04-17 14:09:48 +0000 UTC" firstStartedPulling="2026-04-17 14:09:48.790760137 +0000 UTC m=+132.427337520" lastFinishedPulling="2026-04-17 14:09:49.949756111 +0000 UTC m=+133.586333493" observedRunningTime="2026-04-17 14:09:50.335821789 +0000 UTC m=+133.972399179" watchObservedRunningTime="2026-04-17 14:09:50.350033129 +0000 UTC m=+133.986610520" Apr 17 14:09:53.333790 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:53.333757 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gtrf5" event={"ID":"2e8ddfff-998b-421c-9ea9-12e88bb8506b","Type":"ContainerStarted","Data":"e970c15f3337aa9057fb016c897e1348e22a69885b0dfffd70f406fed15718b2"} Apr 17 14:09:53.349794 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:53.349739 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-gtrf5" podStartSLOduration=2.065780132 podStartE2EDuration="5.349721369s" podCreationTimestamp="2026-04-17 14:09:48 +0000 UTC" firstStartedPulling="2026-04-17 14:09:48.979443995 +0000 UTC m=+132.616021367" lastFinishedPulling="2026-04-17 14:09:52.263385237 +0000 UTC m=+135.899962604" observedRunningTime="2026-04-17 14:09:53.348129708 +0000 UTC m=+136.984707098" watchObservedRunningTime="2026-04-17 14:09:53.349721369 +0000 UTC m=+136.986298757" Apr 17 14:09:56.278891 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.278857 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5c67f685cd-pxwq5"] Apr 17 14:09:56.283722 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.283695 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c67f685cd-pxwq5" Apr 17 14:09:56.287018 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.286974 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 14:09:56.287230 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.287214 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 14:09:56.288147 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.288122 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-gwdpx\"" Apr 17 14:09:56.288265 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.288133 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 14:09:56.288265 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.288251 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 14:09:56.288387 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.288364 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 14:09:56.290497 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.290472 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c67f685cd-pxwq5"] Apr 17 14:09:56.293600 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.293578 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 14:09:56.450111 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.450072 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/39bceda9-a654-4e42-97bd-0e49222f2307-console-oauth-config\") pod \"console-5c67f685cd-pxwq5\" (UID: \"39bceda9-a654-4e42-97bd-0e49222f2307\") " pod="openshift-console/console-5c67f685cd-pxwq5"
Apr 17 14:09:56.450282 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.450137 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/39bceda9-a654-4e42-97bd-0e49222f2307-service-ca\") pod \"console-5c67f685cd-pxwq5\" (UID: \"39bceda9-a654-4e42-97bd-0e49222f2307\") " pod="openshift-console/console-5c67f685cd-pxwq5"
Apr 17 14:09:56.450282 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.450183 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/39bceda9-a654-4e42-97bd-0e49222f2307-oauth-serving-cert\") pod \"console-5c67f685cd-pxwq5\" (UID: \"39bceda9-a654-4e42-97bd-0e49222f2307\") " pod="openshift-console/console-5c67f685cd-pxwq5"
Apr 17 14:09:56.450282 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.450220 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/39bceda9-a654-4e42-97bd-0e49222f2307-console-serving-cert\") pod \"console-5c67f685cd-pxwq5\" (UID: \"39bceda9-a654-4e42-97bd-0e49222f2307\") " pod="openshift-console/console-5c67f685cd-pxwq5"
Apr 17 14:09:56.450282 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.450275 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/39bceda9-a654-4e42-97bd-0e49222f2307-console-config\") pod \"console-5c67f685cd-pxwq5\" (UID: \"39bceda9-a654-4e42-97bd-0e49222f2307\") " pod="openshift-console/console-5c67f685cd-pxwq5"
Apr 17 14:09:56.450452 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.450309 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kchsk\" (UniqueName: \"kubernetes.io/projected/39bceda9-a654-4e42-97bd-0e49222f2307-kube-api-access-kchsk\") pod \"console-5c67f685cd-pxwq5\" (UID: \"39bceda9-a654-4e42-97bd-0e49222f2307\") " pod="openshift-console/console-5c67f685cd-pxwq5"
Apr 17 14:09:56.450452 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.450341 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39bceda9-a654-4e42-97bd-0e49222f2307-trusted-ca-bundle\") pod \"console-5c67f685cd-pxwq5\" (UID: \"39bceda9-a654-4e42-97bd-0e49222f2307\") " pod="openshift-console/console-5c67f685cd-pxwq5"
Apr 17 14:09:56.551423 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.551347 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kchsk\" (UniqueName: \"kubernetes.io/projected/39bceda9-a654-4e42-97bd-0e49222f2307-kube-api-access-kchsk\") pod \"console-5c67f685cd-pxwq5\" (UID: \"39bceda9-a654-4e42-97bd-0e49222f2307\") " pod="openshift-console/console-5c67f685cd-pxwq5"
Apr 17 14:09:56.551423 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.551400 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39bceda9-a654-4e42-97bd-0e49222f2307-trusted-ca-bundle\") pod \"console-5c67f685cd-pxwq5\" (UID: \"39bceda9-a654-4e42-97bd-0e49222f2307\") " pod="openshift-console/console-5c67f685cd-pxwq5"
Apr 17 14:09:56.551653 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.551473 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/39bceda9-a654-4e42-97bd-0e49222f2307-console-oauth-config\") pod \"console-5c67f685cd-pxwq5\" (UID: \"39bceda9-a654-4e42-97bd-0e49222f2307\") " pod="openshift-console/console-5c67f685cd-pxwq5"
Apr 17 14:09:56.551653 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.551528 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/39bceda9-a654-4e42-97bd-0e49222f2307-service-ca\") pod \"console-5c67f685cd-pxwq5\" (UID: \"39bceda9-a654-4e42-97bd-0e49222f2307\") " pod="openshift-console/console-5c67f685cd-pxwq5"
Apr 17 14:09:56.551653 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.551572 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/39bceda9-a654-4e42-97bd-0e49222f2307-oauth-serving-cert\") pod \"console-5c67f685cd-pxwq5\" (UID: \"39bceda9-a654-4e42-97bd-0e49222f2307\") " pod="openshift-console/console-5c67f685cd-pxwq5"
Apr 17 14:09:56.551653 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.551611 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/39bceda9-a654-4e42-97bd-0e49222f2307-console-serving-cert\") pod \"console-5c67f685cd-pxwq5\" (UID: \"39bceda9-a654-4e42-97bd-0e49222f2307\") " pod="openshift-console/console-5c67f685cd-pxwq5"
Apr 17 14:09:56.551653 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.551636 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/39bceda9-a654-4e42-97bd-0e49222f2307-console-config\") pod \"console-5c67f685cd-pxwq5\" (UID: \"39bceda9-a654-4e42-97bd-0e49222f2307\") " pod="openshift-console/console-5c67f685cd-pxwq5"
Apr 17 14:09:56.552331 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.552275 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/39bceda9-a654-4e42-97bd-0e49222f2307-oauth-serving-cert\") pod \"console-5c67f685cd-pxwq5\" (UID: \"39bceda9-a654-4e42-97bd-0e49222f2307\") " pod="openshift-console/console-5c67f685cd-pxwq5"
Apr 17 14:09:56.552458 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.552373 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/39bceda9-a654-4e42-97bd-0e49222f2307-console-config\") pod \"console-5c67f685cd-pxwq5\" (UID: \"39bceda9-a654-4e42-97bd-0e49222f2307\") " pod="openshift-console/console-5c67f685cd-pxwq5"
Apr 17 14:09:56.552458 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.552422 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/39bceda9-a654-4e42-97bd-0e49222f2307-service-ca\") pod \"console-5c67f685cd-pxwq5\" (UID: \"39bceda9-a654-4e42-97bd-0e49222f2307\") " pod="openshift-console/console-5c67f685cd-pxwq5"
Apr 17 14:09:56.552645 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.552523 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39bceda9-a654-4e42-97bd-0e49222f2307-trusted-ca-bundle\") pod \"console-5c67f685cd-pxwq5\" (UID: \"39bceda9-a654-4e42-97bd-0e49222f2307\") " pod="openshift-console/console-5c67f685cd-pxwq5"
Apr 17 14:09:56.554175 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.554155 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/39bceda9-a654-4e42-97bd-0e49222f2307-console-oauth-config\") pod \"console-5c67f685cd-pxwq5\" (UID: \"39bceda9-a654-4e42-97bd-0e49222f2307\") " pod="openshift-console/console-5c67f685cd-pxwq5"
Apr 17 14:09:56.554790 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.554765 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/39bceda9-a654-4e42-97bd-0e49222f2307-console-serving-cert\") pod \"console-5c67f685cd-pxwq5\" (UID: \"39bceda9-a654-4e42-97bd-0e49222f2307\") " pod="openshift-console/console-5c67f685cd-pxwq5"
Apr 17 14:09:56.559380 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.559359 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kchsk\" (UniqueName: \"kubernetes.io/projected/39bceda9-a654-4e42-97bd-0e49222f2307-kube-api-access-kchsk\") pod \"console-5c67f685cd-pxwq5\" (UID: \"39bceda9-a654-4e42-97bd-0e49222f2307\") " pod="openshift-console/console-5c67f685cd-pxwq5"
Apr 17 14:09:56.595053 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.595024 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c67f685cd-pxwq5"
Apr 17 14:09:56.723498 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:56.723465 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c67f685cd-pxwq5"]
Apr 17 14:09:56.726659 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:09:56.726621 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39bceda9_a654_4e42_97bd_0e49222f2307.slice/crio-483adec973ac8e4489ba0e02d8ec3523ce6d06cc5487c927381b3a6392ded084 WatchSource:0}: Error finding container 483adec973ac8e4489ba0e02d8ec3523ce6d06cc5487c927381b3a6392ded084: Status 404 returned error can't find the container with id 483adec973ac8e4489ba0e02d8ec3523ce6d06cc5487c927381b3a6392ded084
Apr 17 14:09:57.345374 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.345334 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c67f685cd-pxwq5" event={"ID":"39bceda9-a654-4e42-97bd-0e49222f2307","Type":"ContainerStarted","Data":"483adec973ac8e4489ba0e02d8ec3523ce6d06cc5487c927381b3a6392ded084"}
Apr 17 14:09:57.773891 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.773676 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-pfd9b"]
Apr 17 14:09:57.776188 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.776167 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-pfd9b"
Apr 17 14:09:57.778902 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.778863 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 17 14:09:57.781274 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.779867 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-snrdv\""
Apr 17 14:09:57.781274 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.780083 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 14:09:57.781274 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.780277 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 17 14:09:57.781274 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.780468 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 14:09:57.781274 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.780646 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 17 14:09:57.781274 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.780846 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 14:09:57.793269 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.793247 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-pfd9b"]
Apr 17 14:09:57.799113 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.797764 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-hmsjj"]
Apr 17 14:09:57.800192 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.800175 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-hmsjj"
Apr 17 14:09:57.803555 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.803537 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 14:09:57.803778 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.803754 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-t554z\""
Apr 17 14:09:57.803867 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.803842 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 14:09:57.804062 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.803642 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 14:09:57.864976 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.864922 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/51b1465e-8a5a-410d-8ece-f7d239a13616-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-pfd9b\" (UID: \"51b1465e-8a5a-410d-8ece-f7d239a13616\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pfd9b"
Apr 17 14:09:57.865155 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.865020 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/51b1465e-8a5a-410d-8ece-f7d239a13616-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-pfd9b\" (UID: \"51b1465e-8a5a-410d-8ece-f7d239a13616\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pfd9b"
Apr 17 14:09:57.865155 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.865067 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/51b1465e-8a5a-410d-8ece-f7d239a13616-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-pfd9b\" (UID: \"51b1465e-8a5a-410d-8ece-f7d239a13616\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pfd9b"
Apr 17 14:09:57.865155 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.865108 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/51b1465e-8a5a-410d-8ece-f7d239a13616-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-pfd9b\" (UID: \"51b1465e-8a5a-410d-8ece-f7d239a13616\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pfd9b"
Apr 17 14:09:57.865155 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.865146 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjjvb\" (UniqueName: \"kubernetes.io/projected/51b1465e-8a5a-410d-8ece-f7d239a13616-kube-api-access-qjjvb\") pod \"kube-state-metrics-69db897b98-pfd9b\" (UID: \"51b1465e-8a5a-410d-8ece-f7d239a13616\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pfd9b"
Apr 17 14:09:57.865356 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.865172 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/51b1465e-8a5a-410d-8ece-f7d239a13616-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-pfd9b\" (UID: \"51b1465e-8a5a-410d-8ece-f7d239a13616\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pfd9b"
Apr 17 14:09:57.966836 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.966803 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/51b1465e-8a5a-410d-8ece-f7d239a13616-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-pfd9b\" (UID: \"51b1465e-8a5a-410d-8ece-f7d239a13616\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pfd9b"
Apr 17 14:09:57.967009 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.966865 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00-node-exporter-wtmp\") pod \"node-exporter-hmsjj\" (UID: \"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00\") " pod="openshift-monitoring/node-exporter-hmsjj"
Apr 17 14:09:57.967009 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.966899 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/51b1465e-8a5a-410d-8ece-f7d239a13616-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-pfd9b\" (UID: \"51b1465e-8a5a-410d-8ece-f7d239a13616\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pfd9b"
Apr 17 14:09:57.967009 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.966931 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9cb2\" (UniqueName: \"kubernetes.io/projected/af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00-kube-api-access-x9cb2\") pod \"node-exporter-hmsjj\" (UID: \"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00\") " pod="openshift-monitoring/node-exporter-hmsjj"
Apr 17 14:09:57.967009 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.966955 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00-sys\") pod \"node-exporter-hmsjj\" (UID: \"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00\") " pod="openshift-monitoring/node-exporter-hmsjj"
Apr 17 14:09:57.967009 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.966987 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/51b1465e-8a5a-410d-8ece-f7d239a13616-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-pfd9b\" (UID: \"51b1465e-8a5a-410d-8ece-f7d239a13616\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pfd9b"
Apr 17 14:09:57.967267 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.967013 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hmsjj\" (UID: \"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00\") " pod="openshift-monitoring/node-exporter-hmsjj"
Apr 17 14:09:57.967267 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.967049 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00-node-exporter-textfile\") pod \"node-exporter-hmsjj\" (UID: \"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00\") " pod="openshift-monitoring/node-exporter-hmsjj"
Apr 17 14:09:57.967267 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.967075 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00-root\") pod \"node-exporter-hmsjj\" (UID: \"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00\") " pod="openshift-monitoring/node-exporter-hmsjj"
Apr 17 14:09:57.967267 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.967100 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00-metrics-client-ca\") pod \"node-exporter-hmsjj\" (UID: \"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00\") " pod="openshift-monitoring/node-exporter-hmsjj"
Apr 17 14:09:57.967267 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.967130 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/51b1465e-8a5a-410d-8ece-f7d239a13616-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-pfd9b\" (UID: \"51b1465e-8a5a-410d-8ece-f7d239a13616\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pfd9b"
Apr 17 14:09:57.967267 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.967173 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjjvb\" (UniqueName: \"kubernetes.io/projected/51b1465e-8a5a-410d-8ece-f7d239a13616-kube-api-access-qjjvb\") pod \"kube-state-metrics-69db897b98-pfd9b\" (UID: \"51b1465e-8a5a-410d-8ece-f7d239a13616\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pfd9b"
Apr 17 14:09:57.967267 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.967201 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00-node-exporter-tls\") pod \"node-exporter-hmsjj\" (UID: \"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00\") " pod="openshift-monitoring/node-exporter-hmsjj"
Apr 17 14:09:57.967267 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.967229 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/51b1465e-8a5a-410d-8ece-f7d239a13616-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-pfd9b\" (UID: \"51b1465e-8a5a-410d-8ece-f7d239a13616\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pfd9b"
Apr 17 14:09:57.967661 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.967267 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00-node-exporter-accelerators-collector-config\") pod \"node-exporter-hmsjj\" (UID: \"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00\") " pod="openshift-monitoring/node-exporter-hmsjj"
Apr 17 14:09:57.967779 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.967747 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/51b1465e-8a5a-410d-8ece-f7d239a13616-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-pfd9b\" (UID: \"51b1465e-8a5a-410d-8ece-f7d239a13616\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pfd9b"
Apr 17 14:09:57.968108 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:09:57.968087 2568 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 17 14:09:57.968180 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:09:57.968155 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51b1465e-8a5a-410d-8ece-f7d239a13616-kube-state-metrics-tls podName:51b1465e-8a5a-410d-8ece-f7d239a13616 nodeName:}" failed. No retries permitted until 2026-04-17 14:09:58.468136254 +0000 UTC m=+142.104713635 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/51b1465e-8a5a-410d-8ece-f7d239a13616-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-pfd9b" (UID: "51b1465e-8a5a-410d-8ece-f7d239a13616") : secret "kube-state-metrics-tls" not found
Apr 17 14:09:57.968841 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.968437 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/51b1465e-8a5a-410d-8ece-f7d239a13616-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-pfd9b\" (UID: \"51b1465e-8a5a-410d-8ece-f7d239a13616\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pfd9b"
Apr 17 14:09:57.969281 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.969233 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/51b1465e-8a5a-410d-8ece-f7d239a13616-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-pfd9b\" (UID: \"51b1465e-8a5a-410d-8ece-f7d239a13616\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pfd9b"
Apr 17 14:09:57.971356 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.971308 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/51b1465e-8a5a-410d-8ece-f7d239a13616-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-pfd9b\" (UID: \"51b1465e-8a5a-410d-8ece-f7d239a13616\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pfd9b"
Apr 17 14:09:57.979337 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:57.979297 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjjvb\" (UniqueName: \"kubernetes.io/projected/51b1465e-8a5a-410d-8ece-f7d239a13616-kube-api-access-qjjvb\") pod \"kube-state-metrics-69db897b98-pfd9b\" (UID: \"51b1465e-8a5a-410d-8ece-f7d239a13616\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pfd9b"
Apr 17 14:09:58.068970 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:58.068154 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9cb2\" (UniqueName: \"kubernetes.io/projected/af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00-kube-api-access-x9cb2\") pod \"node-exporter-hmsjj\" (UID: \"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00\") " pod="openshift-monitoring/node-exporter-hmsjj"
Apr 17 14:09:58.068970 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:58.068199 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00-sys\") pod \"node-exporter-hmsjj\" (UID: \"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00\") " pod="openshift-monitoring/node-exporter-hmsjj"
Apr 17 14:09:58.068970 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:58.068323 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hmsjj\" (UID: \"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00\") " pod="openshift-monitoring/node-exporter-hmsjj"
Apr 17 14:09:58.068970 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:58.068365 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00-node-exporter-textfile\") pod \"node-exporter-hmsjj\" (UID: \"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00\") " pod="openshift-monitoring/node-exporter-hmsjj"
Apr 17 14:09:58.068970 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:58.068391 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00-root\") pod \"node-exporter-hmsjj\" (UID: \"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00\") " pod="openshift-monitoring/node-exporter-hmsjj"
Apr 17 14:09:58.068970 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:58.068415 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00-metrics-client-ca\") pod \"node-exporter-hmsjj\" (UID: \"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00\") " pod="openshift-monitoring/node-exporter-hmsjj"
Apr 17 14:09:58.068970 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:58.068460 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00-node-exporter-tls\") pod \"node-exporter-hmsjj\" (UID: \"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00\") " pod="openshift-monitoring/node-exporter-hmsjj"
Apr 17 14:09:58.068970 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:58.068498 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00-node-exporter-accelerators-collector-config\") pod \"node-exporter-hmsjj\" (UID: \"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00\") " pod="openshift-monitoring/node-exporter-hmsjj"
Apr 17 14:09:58.068970 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:58.068595 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00-node-exporter-wtmp\") pod \"node-exporter-hmsjj\" (UID: \"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00\") " pod="openshift-monitoring/node-exporter-hmsjj"
Apr 17 14:09:58.068970 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:58.068770 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00-node-exporter-wtmp\") pod \"node-exporter-hmsjj\" (UID: \"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00\") " pod="openshift-monitoring/node-exporter-hmsjj"
Apr 17 14:09:58.068970 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:58.068819 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00-root\") pod \"node-exporter-hmsjj\" (UID: \"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00\") " pod="openshift-monitoring/node-exporter-hmsjj"
Apr 17 14:09:58.069684 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:58.069111 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00-sys\") pod \"node-exporter-hmsjj\" (UID: \"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00\") " pod="openshift-monitoring/node-exporter-hmsjj"
Apr 17 14:09:58.070324 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:58.070200 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00-metrics-client-ca\") pod \"node-exporter-hmsjj\" (UID: \"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00\") " pod="openshift-monitoring/node-exporter-hmsjj"
Apr 17 14:09:58.070324 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:58.070209 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00-node-exporter-accelerators-collector-config\") pod \"node-exporter-hmsjj\" (UID: \"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00\") " pod="openshift-monitoring/node-exporter-hmsjj"
Apr 17 14:09:58.070581 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:58.070458 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00-node-exporter-textfile\") pod \"node-exporter-hmsjj\" (UID: \"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00\") " pod="openshift-monitoring/node-exporter-hmsjj"
Apr 17 14:09:58.071916 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:58.071844 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hmsjj\" (UID: \"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00\") " pod="openshift-monitoring/node-exporter-hmsjj"
Apr 17 14:09:58.072027 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:58.071989 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00-node-exporter-tls\") pod \"node-exporter-hmsjj\" (UID: \"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00\") " pod="openshift-monitoring/node-exporter-hmsjj"
Apr 17 14:09:58.076483 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:58.076441 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9cb2\" (UniqueName: \"kubernetes.io/projected/af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00-kube-api-access-x9cb2\") pod \"node-exporter-hmsjj\" (UID: \"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00\") " pod="openshift-monitoring/node-exporter-hmsjj"
Apr 17 14:09:58.118677 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:58.118646 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-hmsjj"
Apr 17 14:09:58.349973 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:58.349875 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hmsjj" event={"ID":"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00","Type":"ContainerStarted","Data":"c2860d366964e1d3fab5b96db9697d97af8645292585c69e089a422582cb7fa5"}
Apr 17 14:09:58.472410 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:58.472313 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/51b1465e-8a5a-410d-8ece-f7d239a13616-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-pfd9b\" (UID: \"51b1465e-8a5a-410d-8ece-f7d239a13616\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pfd9b"
Apr 17 14:09:58.475171 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:58.475134 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/51b1465e-8a5a-410d-8ece-f7d239a13616-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-pfd9b\" (UID: \"51b1465e-8a5a-410d-8ece-f7d239a13616\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pfd9b"
Apr 17 14:09:58.696684 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:09:58.696647 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-pfd9b"
Apr 17 14:10:02.969933 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:02.969902 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-58d99675d9-h77tc"]
Apr 17 14:10:02.974676 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:02.974656 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc"
Apr 17 14:10:02.977746 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:02.977624 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 17 14:10:02.977905 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:02.977800 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-9vqgr\""
Apr 17 14:10:02.977905 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:02.977882 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 17 14:10:02.977905 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:02.977886 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 17 14:10:02.978278 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:02.978256 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 17 14:10:02.978382 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:02.978293 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 17 14:10:02.984714 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:02.984690 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-58d99675d9-h77tc"]
Apr 17 14:10:02.985208 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:02.985187 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 17 14:10:03.011484 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:03.011454 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume
started for volume \"kube-api-access-5xk9n\" (UniqueName: \"kubernetes.io/projected/ac42def7-0611-4dfc-9d27-f63a377b1901-kube-api-access-5xk9n\") pod \"telemeter-client-58d99675d9-h77tc\" (UID: \"ac42def7-0611-4dfc-9d27-f63a377b1901\") " pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" Apr 17 14:10:03.011484 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:03.011496 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ac42def7-0611-4dfc-9d27-f63a377b1901-secret-telemeter-client\") pod \"telemeter-client-58d99675d9-h77tc\" (UID: \"ac42def7-0611-4dfc-9d27-f63a377b1901\") " pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" Apr 17 14:10:03.011738 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:03.011599 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ac42def7-0611-4dfc-9d27-f63a377b1901-federate-client-tls\") pod \"telemeter-client-58d99675d9-h77tc\" (UID: \"ac42def7-0611-4dfc-9d27-f63a377b1901\") " pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" Apr 17 14:10:03.011738 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:03.011645 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac42def7-0611-4dfc-9d27-f63a377b1901-serving-certs-ca-bundle\") pod \"telemeter-client-58d99675d9-h77tc\" (UID: \"ac42def7-0611-4dfc-9d27-f63a377b1901\") " pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" Apr 17 14:10:03.011738 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:03.011731 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ac42def7-0611-4dfc-9d27-f63a377b1901-telemeter-client-tls\") pod 
\"telemeter-client-58d99675d9-h77tc\" (UID: \"ac42def7-0611-4dfc-9d27-f63a377b1901\") " pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" Apr 17 14:10:03.011874 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:03.011772 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac42def7-0611-4dfc-9d27-f63a377b1901-telemeter-trusted-ca-bundle\") pod \"telemeter-client-58d99675d9-h77tc\" (UID: \"ac42def7-0611-4dfc-9d27-f63a377b1901\") " pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" Apr 17 14:10:03.011874 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:03.011795 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ac42def7-0611-4dfc-9d27-f63a377b1901-metrics-client-ca\") pod \"telemeter-client-58d99675d9-h77tc\" (UID: \"ac42def7-0611-4dfc-9d27-f63a377b1901\") " pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" Apr 17 14:10:03.011874 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:03.011852 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ac42def7-0611-4dfc-9d27-f63a377b1901-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-58d99675d9-h77tc\" (UID: \"ac42def7-0611-4dfc-9d27-f63a377b1901\") " pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" Apr 17 14:10:03.112741 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:03.112710 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ac42def7-0611-4dfc-9d27-f63a377b1901-telemeter-client-tls\") pod \"telemeter-client-58d99675d9-h77tc\" (UID: \"ac42def7-0611-4dfc-9d27-f63a377b1901\") " 
pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" Apr 17 14:10:03.112921 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:03.112754 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac42def7-0611-4dfc-9d27-f63a377b1901-telemeter-trusted-ca-bundle\") pod \"telemeter-client-58d99675d9-h77tc\" (UID: \"ac42def7-0611-4dfc-9d27-f63a377b1901\") " pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" Apr 17 14:10:03.112921 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:03.112778 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ac42def7-0611-4dfc-9d27-f63a377b1901-metrics-client-ca\") pod \"telemeter-client-58d99675d9-h77tc\" (UID: \"ac42def7-0611-4dfc-9d27-f63a377b1901\") " pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" Apr 17 14:10:03.112921 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:03.112827 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ac42def7-0611-4dfc-9d27-f63a377b1901-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-58d99675d9-h77tc\" (UID: \"ac42def7-0611-4dfc-9d27-f63a377b1901\") " pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" Apr 17 14:10:03.112921 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:03.112862 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xk9n\" (UniqueName: \"kubernetes.io/projected/ac42def7-0611-4dfc-9d27-f63a377b1901-kube-api-access-5xk9n\") pod \"telemeter-client-58d99675d9-h77tc\" (UID: \"ac42def7-0611-4dfc-9d27-f63a377b1901\") " pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" Apr 17 14:10:03.112921 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:03.112896 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ac42def7-0611-4dfc-9d27-f63a377b1901-secret-telemeter-client\") pod \"telemeter-client-58d99675d9-h77tc\" (UID: \"ac42def7-0611-4dfc-9d27-f63a377b1901\") " pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" Apr 17 14:10:03.113189 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:03.112951 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ac42def7-0611-4dfc-9d27-f63a377b1901-federate-client-tls\") pod \"telemeter-client-58d99675d9-h77tc\" (UID: \"ac42def7-0611-4dfc-9d27-f63a377b1901\") " pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" Apr 17 14:10:03.113189 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:03.112988 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac42def7-0611-4dfc-9d27-f63a377b1901-serving-certs-ca-bundle\") pod \"telemeter-client-58d99675d9-h77tc\" (UID: \"ac42def7-0611-4dfc-9d27-f63a377b1901\") " pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" Apr 17 14:10:03.113716 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:03.113654 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ac42def7-0611-4dfc-9d27-f63a377b1901-metrics-client-ca\") pod \"telemeter-client-58d99675d9-h77tc\" (UID: \"ac42def7-0611-4dfc-9d27-f63a377b1901\") " pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" Apr 17 14:10:03.113716 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:03.113681 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac42def7-0611-4dfc-9d27-f63a377b1901-serving-certs-ca-bundle\") pod \"telemeter-client-58d99675d9-h77tc\" (UID: 
\"ac42def7-0611-4dfc-9d27-f63a377b1901\") " pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" Apr 17 14:10:03.113902 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:03.113760 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac42def7-0611-4dfc-9d27-f63a377b1901-telemeter-trusted-ca-bundle\") pod \"telemeter-client-58d99675d9-h77tc\" (UID: \"ac42def7-0611-4dfc-9d27-f63a377b1901\") " pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" Apr 17 14:10:03.115814 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:03.115771 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ac42def7-0611-4dfc-9d27-f63a377b1901-telemeter-client-tls\") pod \"telemeter-client-58d99675d9-h77tc\" (UID: \"ac42def7-0611-4dfc-9d27-f63a377b1901\") " pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" Apr 17 14:10:03.115939 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:03.115914 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ac42def7-0611-4dfc-9d27-f63a377b1901-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-58d99675d9-h77tc\" (UID: \"ac42def7-0611-4dfc-9d27-f63a377b1901\") " pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" Apr 17 14:10:03.116010 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:03.115949 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ac42def7-0611-4dfc-9d27-f63a377b1901-federate-client-tls\") pod \"telemeter-client-58d99675d9-h77tc\" (UID: \"ac42def7-0611-4dfc-9d27-f63a377b1901\") " pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" Apr 17 14:10:03.116156 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:03.116130 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ac42def7-0611-4dfc-9d27-f63a377b1901-secret-telemeter-client\") pod \"telemeter-client-58d99675d9-h77tc\" (UID: \"ac42def7-0611-4dfc-9d27-f63a377b1901\") " pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" Apr 17 14:10:03.120520 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:03.120481 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xk9n\" (UniqueName: \"kubernetes.io/projected/ac42def7-0611-4dfc-9d27-f63a377b1901-kube-api-access-5xk9n\") pod \"telemeter-client-58d99675d9-h77tc\" (UID: \"ac42def7-0611-4dfc-9d27-f63a377b1901\") " pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" Apr 17 14:10:03.287085 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:03.286998 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" Apr 17 14:10:04.123667 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:04.123632 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-55c8fddf87-ppph7"] Apr 17 14:10:04.128471 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:04.128450 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55c8fddf87-ppph7" Apr 17 14:10:04.135149 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:04.135123 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55c8fddf87-ppph7"] Apr 17 14:10:04.222879 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:04.222845 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d5e0603-e15e-4533-a20e-826510c68901-console-oauth-config\") pod \"console-55c8fddf87-ppph7\" (UID: \"9d5e0603-e15e-4533-a20e-826510c68901\") " pod="openshift-console/console-55c8fddf87-ppph7" Apr 17 14:10:04.223088 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:04.222905 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d5e0603-e15e-4533-a20e-826510c68901-trusted-ca-bundle\") pod \"console-55c8fddf87-ppph7\" (UID: \"9d5e0603-e15e-4533-a20e-826510c68901\") " pod="openshift-console/console-55c8fddf87-ppph7" Apr 17 14:10:04.223088 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:04.222941 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgnmw\" (UniqueName: \"kubernetes.io/projected/9d5e0603-e15e-4533-a20e-826510c68901-kube-api-access-rgnmw\") pod \"console-55c8fddf87-ppph7\" (UID: \"9d5e0603-e15e-4533-a20e-826510c68901\") " pod="openshift-console/console-55c8fddf87-ppph7" Apr 17 14:10:04.223088 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:04.222975 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9d5e0603-e15e-4533-a20e-826510c68901-oauth-serving-cert\") pod \"console-55c8fddf87-ppph7\" (UID: \"9d5e0603-e15e-4533-a20e-826510c68901\") " 
pod="openshift-console/console-55c8fddf87-ppph7" Apr 17 14:10:04.223088 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:04.222999 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d5e0603-e15e-4533-a20e-826510c68901-console-config\") pod \"console-55c8fddf87-ppph7\" (UID: \"9d5e0603-e15e-4533-a20e-826510c68901\") " pod="openshift-console/console-55c8fddf87-ppph7" Apr 17 14:10:04.223088 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:04.223081 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d5e0603-e15e-4533-a20e-826510c68901-console-serving-cert\") pod \"console-55c8fddf87-ppph7\" (UID: \"9d5e0603-e15e-4533-a20e-826510c68901\") " pod="openshift-console/console-55c8fddf87-ppph7" Apr 17 14:10:04.223344 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:04.223192 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d5e0603-e15e-4533-a20e-826510c68901-service-ca\") pod \"console-55c8fddf87-ppph7\" (UID: \"9d5e0603-e15e-4533-a20e-826510c68901\") " pod="openshift-console/console-55c8fddf87-ppph7" Apr 17 14:10:04.324480 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:04.324442 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d5e0603-e15e-4533-a20e-826510c68901-trusted-ca-bundle\") pod \"console-55c8fddf87-ppph7\" (UID: \"9d5e0603-e15e-4533-a20e-826510c68901\") " pod="openshift-console/console-55c8fddf87-ppph7" Apr 17 14:10:04.324480 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:04.324486 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rgnmw\" (UniqueName: 
\"kubernetes.io/projected/9d5e0603-e15e-4533-a20e-826510c68901-kube-api-access-rgnmw\") pod \"console-55c8fddf87-ppph7\" (UID: \"9d5e0603-e15e-4533-a20e-826510c68901\") " pod="openshift-console/console-55c8fddf87-ppph7" Apr 17 14:10:04.324752 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:04.324545 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9d5e0603-e15e-4533-a20e-826510c68901-oauth-serving-cert\") pod \"console-55c8fddf87-ppph7\" (UID: \"9d5e0603-e15e-4533-a20e-826510c68901\") " pod="openshift-console/console-55c8fddf87-ppph7" Apr 17 14:10:04.324752 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:04.324567 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d5e0603-e15e-4533-a20e-826510c68901-console-config\") pod \"console-55c8fddf87-ppph7\" (UID: \"9d5e0603-e15e-4533-a20e-826510c68901\") " pod="openshift-console/console-55c8fddf87-ppph7" Apr 17 14:10:04.324752 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:04.324605 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d5e0603-e15e-4533-a20e-826510c68901-console-serving-cert\") pod \"console-55c8fddf87-ppph7\" (UID: \"9d5e0603-e15e-4533-a20e-826510c68901\") " pod="openshift-console/console-55c8fddf87-ppph7" Apr 17 14:10:04.324752 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:04.324684 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d5e0603-e15e-4533-a20e-826510c68901-service-ca\") pod \"console-55c8fddf87-ppph7\" (UID: \"9d5e0603-e15e-4533-a20e-826510c68901\") " pod="openshift-console/console-55c8fddf87-ppph7" Apr 17 14:10:04.324752 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:04.324751 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d5e0603-e15e-4533-a20e-826510c68901-console-oauth-config\") pod \"console-55c8fddf87-ppph7\" (UID: \"9d5e0603-e15e-4533-a20e-826510c68901\") " pod="openshift-console/console-55c8fddf87-ppph7" Apr 17 14:10:04.325391 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:04.325336 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9d5e0603-e15e-4533-a20e-826510c68901-oauth-serving-cert\") pod \"console-55c8fddf87-ppph7\" (UID: \"9d5e0603-e15e-4533-a20e-826510c68901\") " pod="openshift-console/console-55c8fddf87-ppph7" Apr 17 14:10:04.325547 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:04.325432 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d5e0603-e15e-4533-a20e-826510c68901-trusted-ca-bundle\") pod \"console-55c8fddf87-ppph7\" (UID: \"9d5e0603-e15e-4533-a20e-826510c68901\") " pod="openshift-console/console-55c8fddf87-ppph7" Apr 17 14:10:04.325750 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:04.325728 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d5e0603-e15e-4533-a20e-826510c68901-console-config\") pod \"console-55c8fddf87-ppph7\" (UID: \"9d5e0603-e15e-4533-a20e-826510c68901\") " pod="openshift-console/console-55c8fddf87-ppph7" Apr 17 14:10:04.325837 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:04.325817 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d5e0603-e15e-4533-a20e-826510c68901-service-ca\") pod \"console-55c8fddf87-ppph7\" (UID: \"9d5e0603-e15e-4533-a20e-826510c68901\") " pod="openshift-console/console-55c8fddf87-ppph7" Apr 17 14:10:04.327395 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:04.327375 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d5e0603-e15e-4533-a20e-826510c68901-console-oauth-config\") pod \"console-55c8fddf87-ppph7\" (UID: \"9d5e0603-e15e-4533-a20e-826510c68901\") " pod="openshift-console/console-55c8fddf87-ppph7" Apr 17 14:10:04.327603 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:04.327582 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d5e0603-e15e-4533-a20e-826510c68901-console-serving-cert\") pod \"console-55c8fddf87-ppph7\" (UID: \"9d5e0603-e15e-4533-a20e-826510c68901\") " pod="openshift-console/console-55c8fddf87-ppph7" Apr 17 14:10:04.331734 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:04.331714 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgnmw\" (UniqueName: \"kubernetes.io/projected/9d5e0603-e15e-4533-a20e-826510c68901-kube-api-access-rgnmw\") pod \"console-55c8fddf87-ppph7\" (UID: \"9d5e0603-e15e-4533-a20e-826510c68901\") " pod="openshift-console/console-55c8fddf87-ppph7" Apr 17 14:10:04.440566 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:04.440495 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55c8fddf87-ppph7" Apr 17 14:10:06.924420 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:06.924375 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-58d99675d9-h77tc"] Apr 17 14:10:06.927084 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:10:06.927035 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac42def7_0611_4dfc_9d27_f63a377b1901.slice/crio-f331515723c0f99d7f508d35dd911cf1068a0153929bb7ca14b4f82f99f25d32 WatchSource:0}: Error finding container f331515723c0f99d7f508d35dd911cf1068a0153929bb7ca14b4f82f99f25d32: Status 404 returned error can't find the container with id f331515723c0f99d7f508d35dd911cf1068a0153929bb7ca14b4f82f99f25d32 Apr 17 14:10:06.934022 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:06.933910 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55c8fddf87-ppph7"] Apr 17 14:10:06.957755 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:06.957474 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-pfd9b"] Apr 17 14:10:06.959730 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:10:06.959700 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51b1465e_8a5a_410d_8ece_f7d239a13616.slice/crio-24f815bd12537e88e737d1c96b4146168f74c478f7c819735ea7ec4d0b3d8b64 WatchSource:0}: Error finding container 24f815bd12537e88e737d1c96b4146168f74c478f7c819735ea7ec4d0b3d8b64: Status 404 returned error can't find the container with id 24f815bd12537e88e737d1c96b4146168f74c478f7c819735ea7ec4d0b3d8b64 Apr 17 14:10:07.378822 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:07.378727 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55c8fddf87-ppph7" 
event={"ID":"9d5e0603-e15e-4533-a20e-826510c68901","Type":"ContainerStarted","Data":"eb396439109ecc6daa164ad04940eade7492dcb309999987a05d23094e4dbbdc"} Apr 17 14:10:07.378822 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:07.378781 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55c8fddf87-ppph7" event={"ID":"9d5e0603-e15e-4533-a20e-826510c68901","Type":"ContainerStarted","Data":"0f24a19def597c91ebf47414abcd3e5181430cd9625012c2f5959570d228497e"} Apr 17 14:10:07.380397 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:07.380369 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-jb426" event={"ID":"2303c9e7-fe82-4eab-9edd-fd7c86291690","Type":"ContainerStarted","Data":"7a020ca9c4b38b73991ee8eef6471306136bdb341f13d0b576b0ade38e9347d5"} Apr 17 14:10:07.380653 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:07.380628 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-jb426" Apr 17 14:10:07.382215 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:07.382187 2568 generic.go:358] "Generic (PLEG): container finished" podID="af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00" containerID="71cced492a46bf00157b7604b4df1fc4294d0b4e139fc5db697994c799f03c1f" exitCode=0 Apr 17 14:10:07.382494 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:07.382275 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hmsjj" event={"ID":"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00","Type":"ContainerDied","Data":"71cced492a46bf00157b7604b4df1fc4294d0b4e139fc5db697994c799f03c1f"} Apr 17 14:10:07.383785 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:07.383761 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" event={"ID":"ac42def7-0611-4dfc-9d27-f63a377b1901","Type":"ContainerStarted","Data":"f331515723c0f99d7f508d35dd911cf1068a0153929bb7ca14b4f82f99f25d32"} 
Apr 17 14:10:07.384888 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:07.384864 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-pfd9b" event={"ID":"51b1465e-8a5a-410d-8ece-f7d239a13616","Type":"ContainerStarted","Data":"24f815bd12537e88e737d1c96b4146168f74c478f7c819735ea7ec4d0b3d8b64"} Apr 17 14:10:07.386346 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:07.386319 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c67f685cd-pxwq5" event={"ID":"39bceda9-a654-4e42-97bd-0e49222f2307","Type":"ContainerStarted","Data":"7b88a472482852c1997d611f231fa8d5bf4315cfb7fb63d62e1c0add4fba9749"} Apr 17 14:10:07.388847 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:07.388830 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-jb426" Apr 17 14:10:07.396141 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:07.396096 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-55c8fddf87-ppph7" podStartSLOduration=3.396080371 podStartE2EDuration="3.396080371s" podCreationTimestamp="2026-04-17 14:10:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:10:07.394968309 +0000 UTC m=+151.031545735" watchObservedRunningTime="2026-04-17 14:10:07.396080371 +0000 UTC m=+151.032657762" Apr 17 14:10:07.409423 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:07.409375 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-jb426" podStartSLOduration=1.40224174 podStartE2EDuration="19.409361813s" podCreationTimestamp="2026-04-17 14:09:48 +0000 UTC" firstStartedPulling="2026-04-17 14:09:48.812312732 +0000 UTC m=+132.448890100" lastFinishedPulling="2026-04-17 14:10:06.819432801 +0000 UTC m=+150.456010173" 
observedRunningTime="2026-04-17 14:10:07.408730338 +0000 UTC m=+151.045307724" watchObservedRunningTime="2026-04-17 14:10:07.409361813 +0000 UTC m=+151.045939203"
Apr 17 14:10:07.425877 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:07.425823 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5c67f685cd-pxwq5" podStartSLOduration=1.394046026 podStartE2EDuration="11.425803048s" podCreationTimestamp="2026-04-17 14:09:56 +0000 UTC" firstStartedPulling="2026-04-17 14:09:56.729021642 +0000 UTC m=+140.365599013" lastFinishedPulling="2026-04-17 14:10:06.760778653 +0000 UTC m=+150.397356035" observedRunningTime="2026-04-17 14:10:07.423744055 +0000 UTC m=+151.060321480" watchObservedRunningTime="2026-04-17 14:10:07.425803048 +0000 UTC m=+151.062380438"
Apr 17 14:10:08.392996 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:08.392949 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hmsjj" event={"ID":"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00","Type":"ContainerStarted","Data":"3044c58c6d3f1f0870fae9cc71580c8068e60cf60cd114adef329b766eaeac33"}
Apr 17 14:10:08.393435 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:08.393008 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hmsjj" event={"ID":"af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00","Type":"ContainerStarted","Data":"18a646b626284a329d90c5158fec4546ad8c0f7a6a6244b201ac8b765f2c7761"}
Apr 17 14:10:08.411725 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:08.411665 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-hmsjj" podStartSLOduration=2.784269386 podStartE2EDuration="11.411646808s" podCreationTimestamp="2026-04-17 14:09:57 +0000 UTC" firstStartedPulling="2026-04-17 14:09:58.133242896 +0000 UTC m=+141.769820269" lastFinishedPulling="2026-04-17 14:10:06.760620323 +0000 UTC m=+150.397197691" observedRunningTime="2026-04-17 14:10:08.409686868 +0000 UTC m=+152.046264259" watchObservedRunningTime="2026-04-17 14:10:08.411646808 +0000 UTC m=+152.048224199"
Apr 17 14:10:08.781884 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:08.781845 2568 patch_prober.go:28] interesting pod/image-registry-76bbf49c4d-nw9lh container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 14:10:08.782059 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:08.781911 2568 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh" podUID="f7c2a2bb-53b7-4cf4-bec4-e515ccc57619" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 14:10:09.398086 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:09.398044 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-pfd9b" event={"ID":"51b1465e-8a5a-410d-8ece-f7d239a13616","Type":"ContainerStarted","Data":"ed47450ed55f28518bc69e88c3c0030b69cca33cb339234f94623ff95934d695"}
Apr 17 14:10:09.398086 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:09.398089 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-pfd9b" event={"ID":"51b1465e-8a5a-410d-8ece-f7d239a13616","Type":"ContainerStarted","Data":"7fc4a2d0d142f5b34e86131f82694dfccae766b64a6fc44b14fa06af1bde3d76"}
Apr 17 14:10:09.398641 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:09.398107 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-pfd9b" event={"ID":"51b1465e-8a5a-410d-8ece-f7d239a13616","Type":"ContainerStarted","Data":"e78347303ed231583bd18dbb75fa09220f0eb0cea92db4aca413d9161da1c007"}
Apr 17 14:10:09.414561 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:09.414484 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-pfd9b" podStartSLOduration=10.936896838 podStartE2EDuration="12.414467018s" podCreationTimestamp="2026-04-17 14:09:57 +0000 UTC" firstStartedPulling="2026-04-17 14:10:06.961996802 +0000 UTC m=+150.598574173" lastFinishedPulling="2026-04-17 14:10:08.439566986 +0000 UTC m=+152.076144353" observedRunningTime="2026-04-17 14:10:09.413736959 +0000 UTC m=+153.050314353" watchObservedRunningTime="2026-04-17 14:10:09.414467018 +0000 UTC m=+153.051044410"
Apr 17 14:10:10.329586 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:10.329543 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-76bbf49c4d-nw9lh"
Apr 17 14:10:10.404485 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:10.404438 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" event={"ID":"ac42def7-0611-4dfc-9d27-f63a377b1901","Type":"ContainerStarted","Data":"90f1493cd5238686c627a6da9a9ba2be59680d66ca17f2b52fbf438791520571"}
Apr 17 14:10:12.412485 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:12.412447 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" event={"ID":"ac42def7-0611-4dfc-9d27-f63a377b1901","Type":"ContainerStarted","Data":"609f5f21ae75b51af6ee38a69a693ab048f13ac10956f0c686a230ae87ccc449"}
Apr 17 14:10:12.412485 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:12.412491 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" event={"ID":"ac42def7-0611-4dfc-9d27-f63a377b1901","Type":"ContainerStarted","Data":"5b11e8dd76c22ca9694e90b96a2ec20d546b9050f996283811a6978d949473b1"}
Apr 17 14:10:12.433630 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:12.433554 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-58d99675d9-h77tc" podStartSLOduration=5.839635663 podStartE2EDuration="10.433532215s" podCreationTimestamp="2026-04-17 14:10:02 +0000 UTC" firstStartedPulling="2026-04-17 14:10:06.928766202 +0000 UTC m=+150.565343571" lastFinishedPulling="2026-04-17 14:10:11.522662741 +0000 UTC m=+155.159240123" observedRunningTime="2026-04-17 14:10:12.431944913 +0000 UTC m=+156.068522302" watchObservedRunningTime="2026-04-17 14:10:12.433532215 +0000 UTC m=+156.070109605"
Apr 17 14:10:12.805040 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:10:12.804938 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-jscrk" podUID="2742c3d7-a7f7-4525-b1be-30a78e5cec2f"
Apr 17 14:10:12.822229 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:10:12.822192 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-qf25s" podUID="b7c15494-fd82-415f-967e-b8bf2220ef8a"
Apr 17 14:10:13.206866 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:13.206830 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c67f685cd-pxwq5"]
Apr 17 14:10:13.239885 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:13.239851 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-56f6f67987-66d9d"]
Apr 17 14:10:13.261814 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:13.261785 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56f6f67987-66d9d"]
Apr 17 14:10:13.261991 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:13.261932 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56f6f67987-66d9d"
Apr 17 14:10:13.319101 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:13.319067 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00100943-311e-435f-ada2-e95dae3bf92f-trusted-ca-bundle\") pod \"console-56f6f67987-66d9d\" (UID: \"00100943-311e-435f-ada2-e95dae3bf92f\") " pod="openshift-console/console-56f6f67987-66d9d"
Apr 17 14:10:13.319101 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:13.319104 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/00100943-311e-435f-ada2-e95dae3bf92f-oauth-serving-cert\") pod \"console-56f6f67987-66d9d\" (UID: \"00100943-311e-435f-ada2-e95dae3bf92f\") " pod="openshift-console/console-56f6f67987-66d9d"
Apr 17 14:10:13.319388 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:13.319128 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/00100943-311e-435f-ada2-e95dae3bf92f-console-config\") pod \"console-56f6f67987-66d9d\" (UID: \"00100943-311e-435f-ada2-e95dae3bf92f\") " pod="openshift-console/console-56f6f67987-66d9d"
Apr 17 14:10:13.319388 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:13.319260 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72pwv\" (UniqueName: \"kubernetes.io/projected/00100943-311e-435f-ada2-e95dae3bf92f-kube-api-access-72pwv\") pod \"console-56f6f67987-66d9d\" (UID: \"00100943-311e-435f-ada2-e95dae3bf92f\") " pod="openshift-console/console-56f6f67987-66d9d"
Apr 17 14:10:13.319388 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:13.319312 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/00100943-311e-435f-ada2-e95dae3bf92f-console-serving-cert\") pod \"console-56f6f67987-66d9d\" (UID: \"00100943-311e-435f-ada2-e95dae3bf92f\") " pod="openshift-console/console-56f6f67987-66d9d"
Apr 17 14:10:13.319576 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:13.319398 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/00100943-311e-435f-ada2-e95dae3bf92f-console-oauth-config\") pod \"console-56f6f67987-66d9d\" (UID: \"00100943-311e-435f-ada2-e95dae3bf92f\") " pod="openshift-console/console-56f6f67987-66d9d"
Apr 17 14:10:13.319576 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:13.319423 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/00100943-311e-435f-ada2-e95dae3bf92f-service-ca\") pod \"console-56f6f67987-66d9d\" (UID: \"00100943-311e-435f-ada2-e95dae3bf92f\") " pod="openshift-console/console-56f6f67987-66d9d"
Apr 17 14:10:13.420553 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:13.420492 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72pwv\" (UniqueName: \"kubernetes.io/projected/00100943-311e-435f-ada2-e95dae3bf92f-kube-api-access-72pwv\") pod \"console-56f6f67987-66d9d\" (UID: \"00100943-311e-435f-ada2-e95dae3bf92f\") " pod="openshift-console/console-56f6f67987-66d9d"
Apr 17 14:10:13.421065 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:13.420572 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/00100943-311e-435f-ada2-e95dae3bf92f-console-serving-cert\") pod \"console-56f6f67987-66d9d\" (UID: \"00100943-311e-435f-ada2-e95dae3bf92f\") " pod="openshift-console/console-56f6f67987-66d9d"
Apr 17 14:10:13.421065 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:13.420623 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/00100943-311e-435f-ada2-e95dae3bf92f-console-oauth-config\") pod \"console-56f6f67987-66d9d\" (UID: \"00100943-311e-435f-ada2-e95dae3bf92f\") " pod="openshift-console/console-56f6f67987-66d9d"
Apr 17 14:10:13.421065 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:13.420646 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/00100943-311e-435f-ada2-e95dae3bf92f-service-ca\") pod \"console-56f6f67987-66d9d\" (UID: \"00100943-311e-435f-ada2-e95dae3bf92f\") " pod="openshift-console/console-56f6f67987-66d9d"
Apr 17 14:10:13.421065 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:13.420703 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jscrk"
Apr 17 14:10:13.421065 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:13.420825 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00100943-311e-435f-ada2-e95dae3bf92f-trusted-ca-bundle\") pod \"console-56f6f67987-66d9d\" (UID: \"00100943-311e-435f-ada2-e95dae3bf92f\") " pod="openshift-console/console-56f6f67987-66d9d"
Apr 17 14:10:13.421065 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:13.420870 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/00100943-311e-435f-ada2-e95dae3bf92f-oauth-serving-cert\") pod \"console-56f6f67987-66d9d\" (UID: \"00100943-311e-435f-ada2-e95dae3bf92f\") " pod="openshift-console/console-56f6f67987-66d9d"
Apr 17 14:10:13.421065 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:13.420908 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/00100943-311e-435f-ada2-e95dae3bf92f-console-config\") pod \"console-56f6f67987-66d9d\" (UID: \"00100943-311e-435f-ada2-e95dae3bf92f\") " pod="openshift-console/console-56f6f67987-66d9d"
Apr 17 14:10:13.421499 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:13.421449 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/00100943-311e-435f-ada2-e95dae3bf92f-service-ca\") pod \"console-56f6f67987-66d9d\" (UID: \"00100943-311e-435f-ada2-e95dae3bf92f\") " pod="openshift-console/console-56f6f67987-66d9d"
Apr 17 14:10:13.421664 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:13.421617 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/00100943-311e-435f-ada2-e95dae3bf92f-oauth-serving-cert\") pod \"console-56f6f67987-66d9d\" (UID: \"00100943-311e-435f-ada2-e95dae3bf92f\") " pod="openshift-console/console-56f6f67987-66d9d"
Apr 17 14:10:13.421741 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:13.421681 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/00100943-311e-435f-ada2-e95dae3bf92f-console-config\") pod \"console-56f6f67987-66d9d\" (UID: \"00100943-311e-435f-ada2-e95dae3bf92f\") " pod="openshift-console/console-56f6f67987-66d9d"
Apr 17 14:10:13.421795 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:13.421775 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00100943-311e-435f-ada2-e95dae3bf92f-trusted-ca-bundle\") pod \"console-56f6f67987-66d9d\" (UID: \"00100943-311e-435f-ada2-e95dae3bf92f\") " pod="openshift-console/console-56f6f67987-66d9d"
Apr 17 14:10:13.423564 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:13.423528 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/00100943-311e-435f-ada2-e95dae3bf92f-console-serving-cert\") pod \"console-56f6f67987-66d9d\" (UID: \"00100943-311e-435f-ada2-e95dae3bf92f\") " pod="openshift-console/console-56f6f67987-66d9d"
Apr 17 14:10:13.423733 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:13.423715 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/00100943-311e-435f-ada2-e95dae3bf92f-console-oauth-config\") pod \"console-56f6f67987-66d9d\" (UID: \"00100943-311e-435f-ada2-e95dae3bf92f\") " pod="openshift-console/console-56f6f67987-66d9d"
Apr 17 14:10:13.428836 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:13.428812 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-72pwv\" (UniqueName: \"kubernetes.io/projected/00100943-311e-435f-ada2-e95dae3bf92f-kube-api-access-72pwv\") pod \"console-56f6f67987-66d9d\" (UID: \"00100943-311e-435f-ada2-e95dae3bf92f\") " pod="openshift-console/console-56f6f67987-66d9d"
Apr 17 14:10:13.573793 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:13.573750 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56f6f67987-66d9d"
Apr 17 14:10:13.708828 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:13.708756 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56f6f67987-66d9d"]
Apr 17 14:10:13.712319 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:10:13.712283 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00100943_311e_435f_ada2_e95dae3bf92f.slice/crio-b4feae1d3943d7e570033aeeca0d9591a114f8aaaf3aaad7367806fea122284d WatchSource:0}: Error finding container b4feae1d3943d7e570033aeeca0d9591a114f8aaaf3aaad7367806fea122284d: Status 404 returned error can't find the container with id b4feae1d3943d7e570033aeeca0d9591a114f8aaaf3aaad7367806fea122284d
Apr 17 14:10:14.425978 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:14.425936 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56f6f67987-66d9d" event={"ID":"00100943-311e-435f-ada2-e95dae3bf92f","Type":"ContainerStarted","Data":"105f83d6dc22a4a46b00feb6bdbbed64a17dd65b8d8ac01fb22389c453145124"}
Apr 17 14:10:14.425978 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:14.425982 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56f6f67987-66d9d" event={"ID":"00100943-311e-435f-ada2-e95dae3bf92f","Type":"ContainerStarted","Data":"b4feae1d3943d7e570033aeeca0d9591a114f8aaaf3aaad7367806fea122284d"}
Apr 17 14:10:14.440765 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:14.440742 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-55c8fddf87-ppph7"
Apr 17 14:10:14.440990 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:14.440925 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-55c8fddf87-ppph7"
Apr 17 14:10:14.442454 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:14.442403 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-56f6f67987-66d9d" podStartSLOduration=1.442386371 podStartE2EDuration="1.442386371s" podCreationTimestamp="2026-04-17 14:10:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:10:14.440876095 +0000 UTC m=+158.077453508" watchObservedRunningTime="2026-04-17 14:10:14.442386371 +0000 UTC m=+158.078963762"
Apr 17 14:10:14.446482 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:14.446458 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-55c8fddf87-ppph7"
Apr 17 14:10:15.433764 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:15.433681 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-55c8fddf87-ppph7"
Apr 17 14:10:16.595736 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:16.595709 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5c67f685cd-pxwq5"
Apr 17 14:10:17.664215 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:17.664172 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-metrics-tls\") pod \"dns-default-jscrk\" (UID: \"2742c3d7-a7f7-4525-b1be-30a78e5cec2f\") " pod="openshift-dns/dns-default-jscrk"
Apr 17 14:10:17.664637 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:17.664229 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7c15494-fd82-415f-967e-b8bf2220ef8a-cert\") pod \"ingress-canary-qf25s\" (UID: \"b7c15494-fd82-415f-967e-b8bf2220ef8a\") " pod="openshift-ingress-canary/ingress-canary-qf25s"
Apr 17 14:10:17.666684 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:17.666648 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2742c3d7-a7f7-4525-b1be-30a78e5cec2f-metrics-tls\") pod \"dns-default-jscrk\" (UID: \"2742c3d7-a7f7-4525-b1be-30a78e5cec2f\") " pod="openshift-dns/dns-default-jscrk"
Apr 17 14:10:17.666795 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:17.666718 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7c15494-fd82-415f-967e-b8bf2220ef8a-cert\") pod \"ingress-canary-qf25s\" (UID: \"b7c15494-fd82-415f-967e-b8bf2220ef8a\") " pod="openshift-ingress-canary/ingress-canary-qf25s"
Apr 17 14:10:17.923748 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:17.923676 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fnmsv\""
Apr 17 14:10:17.931818 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:17.931766 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jscrk"
Apr 17 14:10:18.062050 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:18.062019 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jscrk"]
Apr 17 14:10:18.065333 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:10:18.065302 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2742c3d7_a7f7_4525_b1be_30a78e5cec2f.slice/crio-aaf5c3aefbf6396dcac590069e8c49654ca42ea3536cc187213bf439c98ad288 WatchSource:0}: Error finding container aaf5c3aefbf6396dcac590069e8c49654ca42ea3536cc187213bf439c98ad288: Status 404 returned error can't find the container with id aaf5c3aefbf6396dcac590069e8c49654ca42ea3536cc187213bf439c98ad288
Apr 17 14:10:18.438324 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:18.438285 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jscrk" event={"ID":"2742c3d7-a7f7-4525-b1be-30a78e5cec2f","Type":"ContainerStarted","Data":"aaf5c3aefbf6396dcac590069e8c49654ca42ea3536cc187213bf439c98ad288"}
Apr 17 14:10:20.448440 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:20.448404 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jscrk" event={"ID":"2742c3d7-a7f7-4525-b1be-30a78e5cec2f","Type":"ContainerStarted","Data":"d065bc5cb842458ee81faf14444b5d48ea75cfa22ce4c0143d1048694d109929"}
Apr 17 14:10:20.448878 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:20.448446 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jscrk" event={"ID":"2742c3d7-a7f7-4525-b1be-30a78e5cec2f","Type":"ContainerStarted","Data":"6c65ac3b9a9d2460c18efc583aefe96e141c20d7b24a21bb30320954ff43c617"}
Apr 17 14:10:20.448878 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:20.448478 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-jscrk"
Apr 17 14:10:20.465481 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:20.465437 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jscrk" podStartSLOduration=129.573304152 podStartE2EDuration="2m11.465422387s" podCreationTimestamp="2026-04-17 14:08:09 +0000 UTC" firstStartedPulling="2026-04-17 14:10:18.06718944 +0000 UTC m=+161.703766808" lastFinishedPulling="2026-04-17 14:10:19.95930767 +0000 UTC m=+163.595885043" observedRunningTime="2026-04-17 14:10:20.464181268 +0000 UTC m=+164.100758659" watchObservedRunningTime="2026-04-17 14:10:20.465422387 +0000 UTC m=+164.101999777"
Apr 17 14:10:23.574729 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:23.574691 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-56f6f67987-66d9d"
Apr 17 14:10:23.575190 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:23.574752 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-56f6f67987-66d9d"
Apr 17 14:10:23.579367 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:23.579346 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-56f6f67987-66d9d"
Apr 17 14:10:24.465434 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:24.465409 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-56f6f67987-66d9d"
Apr 17 14:10:24.513096 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:24.513063 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55c8fddf87-ppph7"]
Apr 17 14:10:24.945655 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:24.945628 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qf25s"
Apr 17 14:10:24.947859 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:24.947842 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-nsnr8\""
Apr 17 14:10:24.956539 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:24.956524 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qf25s"
Apr 17 14:10:25.086478 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:25.086451 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qf25s"]
Apr 17 14:10:25.089882 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:10:25.089857 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7c15494_fd82_415f_967e_b8bf2220ef8a.slice/crio-e774d4b334ce5ebba6acadf5f2a901355887289013540cc02ecaf816d6665d23 WatchSource:0}: Error finding container e774d4b334ce5ebba6acadf5f2a901355887289013540cc02ecaf816d6665d23: Status 404 returned error can't find the container with id e774d4b334ce5ebba6acadf5f2a901355887289013540cc02ecaf816d6665d23
Apr 17 14:10:25.465871 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:25.465840 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qf25s" event={"ID":"b7c15494-fd82-415f-967e-b8bf2220ef8a","Type":"ContainerStarted","Data":"e774d4b334ce5ebba6acadf5f2a901355887289013540cc02ecaf816d6665d23"}
Apr 17 14:10:28.475590 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:28.475552 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qf25s" event={"ID":"b7c15494-fd82-415f-967e-b8bf2220ef8a","Type":"ContainerStarted","Data":"0519fc3678a7293ab0d561dba9f664a9fef33c157d259eb09b3ad963f6b01b74"}
Apr 17 14:10:28.490428 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:28.490378 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qf25s" podStartSLOduration=137.054123647 podStartE2EDuration="2m19.490362818s" podCreationTimestamp="2026-04-17 14:08:09 +0000 UTC" firstStartedPulling="2026-04-17 14:10:25.091741794 +0000 UTC m=+168.728319164" lastFinishedPulling="2026-04-17 14:10:27.527980955 +0000 UTC m=+171.164558335" observedRunningTime="2026-04-17 14:10:28.488452363 +0000 UTC m=+172.125029753" watchObservedRunningTime="2026-04-17 14:10:28.490362818 +0000 UTC m=+172.126940208"
Apr 17 14:10:30.454648 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:30.454616 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jscrk"
Apr 17 14:10:38.228655 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.228596 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5c67f685cd-pxwq5" podUID="39bceda9-a654-4e42-97bd-0e49222f2307" containerName="console" containerID="cri-o://7b88a472482852c1997d611f231fa8d5bf4315cfb7fb63d62e1c0add4fba9749" gracePeriod=15
Apr 17 14:10:38.501887 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.501865 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c67f685cd-pxwq5_39bceda9-a654-4e42-97bd-0e49222f2307/console/0.log"
Apr 17 14:10:38.501993 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.501953 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c67f685cd-pxwq5"
Apr 17 14:10:38.503590 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.503570 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c67f685cd-pxwq5_39bceda9-a654-4e42-97bd-0e49222f2307/console/0.log"
Apr 17 14:10:38.503693 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.503607 2568 generic.go:358] "Generic (PLEG): container finished" podID="39bceda9-a654-4e42-97bd-0e49222f2307" containerID="7b88a472482852c1997d611f231fa8d5bf4315cfb7fb63d62e1c0add4fba9749" exitCode=2
Apr 17 14:10:38.503693 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.503666 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c67f685cd-pxwq5"
Apr 17 14:10:38.503761 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.503672 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c67f685cd-pxwq5" event={"ID":"39bceda9-a654-4e42-97bd-0e49222f2307","Type":"ContainerDied","Data":"7b88a472482852c1997d611f231fa8d5bf4315cfb7fb63d62e1c0add4fba9749"}
Apr 17 14:10:38.503794 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.503761 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c67f685cd-pxwq5" event={"ID":"39bceda9-a654-4e42-97bd-0e49222f2307","Type":"ContainerDied","Data":"483adec973ac8e4489ba0e02d8ec3523ce6d06cc5487c927381b3a6392ded084"}
Apr 17 14:10:38.503794 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.503775 2568 scope.go:117] "RemoveContainer" containerID="7b88a472482852c1997d611f231fa8d5bf4315cfb7fb63d62e1c0add4fba9749"
Apr 17 14:10:38.511211 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.511196 2568 scope.go:117] "RemoveContainer" containerID="7b88a472482852c1997d611f231fa8d5bf4315cfb7fb63d62e1c0add4fba9749"
Apr 17 14:10:38.511440 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:10:38.511421 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b88a472482852c1997d611f231fa8d5bf4315cfb7fb63d62e1c0add4fba9749\": container with ID starting with 7b88a472482852c1997d611f231fa8d5bf4315cfb7fb63d62e1c0add4fba9749 not found: ID does not exist" containerID="7b88a472482852c1997d611f231fa8d5bf4315cfb7fb63d62e1c0add4fba9749"
Apr 17 14:10:38.511527 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.511448 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b88a472482852c1997d611f231fa8d5bf4315cfb7fb63d62e1c0add4fba9749"} err="failed to get container status \"7b88a472482852c1997d611f231fa8d5bf4315cfb7fb63d62e1c0add4fba9749\": rpc error: code = NotFound desc = could not find container \"7b88a472482852c1997d611f231fa8d5bf4315cfb7fb63d62e1c0add4fba9749\": container with ID starting with 7b88a472482852c1997d611f231fa8d5bf4315cfb7fb63d62e1c0add4fba9749 not found: ID does not exist"
Apr 17 14:10:38.545610 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.545584 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39bceda9-a654-4e42-97bd-0e49222f2307-trusted-ca-bundle\") pod \"39bceda9-a654-4e42-97bd-0e49222f2307\" (UID: \"39bceda9-a654-4e42-97bd-0e49222f2307\") "
Apr 17 14:10:38.545765 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.545619 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/39bceda9-a654-4e42-97bd-0e49222f2307-console-serving-cert\") pod \"39bceda9-a654-4e42-97bd-0e49222f2307\" (UID: \"39bceda9-a654-4e42-97bd-0e49222f2307\") "
Apr 17 14:10:38.545765 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.545669 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kchsk\" (UniqueName: \"kubernetes.io/projected/39bceda9-a654-4e42-97bd-0e49222f2307-kube-api-access-kchsk\") pod \"39bceda9-a654-4e42-97bd-0e49222f2307\" (UID: \"39bceda9-a654-4e42-97bd-0e49222f2307\") "
Apr 17 14:10:38.545765 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.545698 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/39bceda9-a654-4e42-97bd-0e49222f2307-oauth-serving-cert\") pod \"39bceda9-a654-4e42-97bd-0e49222f2307\" (UID: \"39bceda9-a654-4e42-97bd-0e49222f2307\") "
Apr 17 14:10:38.545765 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.545734 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/39bceda9-a654-4e42-97bd-0e49222f2307-service-ca\") pod \"39bceda9-a654-4e42-97bd-0e49222f2307\" (UID: \"39bceda9-a654-4e42-97bd-0e49222f2307\") "
Apr 17 14:10:38.545959 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.545773 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/39bceda9-a654-4e42-97bd-0e49222f2307-console-oauth-config\") pod \"39bceda9-a654-4e42-97bd-0e49222f2307\" (UID: \"39bceda9-a654-4e42-97bd-0e49222f2307\") "
Apr 17 14:10:38.545959 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.545816 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/39bceda9-a654-4e42-97bd-0e49222f2307-console-config\") pod \"39bceda9-a654-4e42-97bd-0e49222f2307\" (UID: \"39bceda9-a654-4e42-97bd-0e49222f2307\") "
Apr 17 14:10:38.546118 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.546064 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39bceda9-a654-4e42-97bd-0e49222f2307-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "39bceda9-a654-4e42-97bd-0e49222f2307" (UID: "39bceda9-a654-4e42-97bd-0e49222f2307"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 14:10:38.546174 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.546144 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39bceda9-a654-4e42-97bd-0e49222f2307-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "39bceda9-a654-4e42-97bd-0e49222f2307" (UID: "39bceda9-a654-4e42-97bd-0e49222f2307"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 14:10:38.546290 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.546198 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39bceda9-a654-4e42-97bd-0e49222f2307-service-ca" (OuterVolumeSpecName: "service-ca") pod "39bceda9-a654-4e42-97bd-0e49222f2307" (UID: "39bceda9-a654-4e42-97bd-0e49222f2307"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 14:10:38.546391 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.546371 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39bceda9-a654-4e42-97bd-0e49222f2307-console-config" (OuterVolumeSpecName: "console-config") pod "39bceda9-a654-4e42-97bd-0e49222f2307" (UID: "39bceda9-a654-4e42-97bd-0e49222f2307"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 14:10:38.548105 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.548083 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39bceda9-a654-4e42-97bd-0e49222f2307-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "39bceda9-a654-4e42-97bd-0e49222f2307" (UID: "39bceda9-a654-4e42-97bd-0e49222f2307"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:10:38.548365 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.548347 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39bceda9-a654-4e42-97bd-0e49222f2307-kube-api-access-kchsk" (OuterVolumeSpecName: "kube-api-access-kchsk") pod "39bceda9-a654-4e42-97bd-0e49222f2307" (UID: "39bceda9-a654-4e42-97bd-0e49222f2307"). InnerVolumeSpecName "kube-api-access-kchsk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:10:38.548365 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.548352 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39bceda9-a654-4e42-97bd-0e49222f2307-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "39bceda9-a654-4e42-97bd-0e49222f2307" (UID: "39bceda9-a654-4e42-97bd-0e49222f2307"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:10:38.646527 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.646469 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39bceda9-a654-4e42-97bd-0e49222f2307-trusted-ca-bundle\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\""
Apr 17 14:10:38.646527 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.646524 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/39bceda9-a654-4e42-97bd-0e49222f2307-console-serving-cert\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\""
Apr 17 14:10:38.646736 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.646541 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kchsk\" (UniqueName: \"kubernetes.io/projected/39bceda9-a654-4e42-97bd-0e49222f2307-kube-api-access-kchsk\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\""
Apr 17 14:10:38.646736 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.646555 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/39bceda9-a654-4e42-97bd-0e49222f2307-oauth-serving-cert\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\""
Apr 17 14:10:38.646736 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.646568 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/39bceda9-a654-4e42-97bd-0e49222f2307-service-ca\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\""
Apr 17 14:10:38.646736 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.646579 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/39bceda9-a654-4e42-97bd-0e49222f2307-console-oauth-config\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\""
Apr 17 14:10:38.646736 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.646588 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/39bceda9-a654-4e42-97bd-0e49222f2307-console-config\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\""
Apr 17 14:10:38.829324 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.829285 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c67f685cd-pxwq5"]
Apr 17 14:10:38.833585 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.833552 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5c67f685cd-pxwq5"]
Apr 17 14:10:38.948844 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:38.948815 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39bceda9-a654-4e42-97bd-0e49222f2307" path="/var/lib/kubelet/pods/39bceda9-a654-4e42-97bd-0e49222f2307/volumes"
Apr 17 14:10:41.514143 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:41.514110 2568 generic.go:358] "Generic (PLEG): container finished" podID="61e203fc-fb18-4d79-ace2-a4218ed4e9d9" containerID="6bece5bfe96943af3e9047154ba7946736d5c52697a1d35b6c4bcb645b51b500" exitCode=0
Apr 17 14:10:41.514538 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:41.514163 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-rktb8"
event={"ID":"61e203fc-fb18-4d79-ace2-a4218ed4e9d9","Type":"ContainerDied","Data":"6bece5bfe96943af3e9047154ba7946736d5c52697a1d35b6c4bcb645b51b500"} Apr 17 14:10:41.514538 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:41.514476 2568 scope.go:117] "RemoveContainer" containerID="6bece5bfe96943af3e9047154ba7946736d5c52697a1d35b6c4bcb645b51b500" Apr 17 14:10:42.495893 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:42.495854 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-684d7b5578-8dvb5_99c53420-8717-4539-b474-bdf6d9f5615a/router/0.log" Apr 17 14:10:42.511673 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:42.511649 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-qf25s_b7c15494-fd82-415f-967e-b8bf2220ef8a/serve-healthcheck-canary/0.log" Apr 17 14:10:42.518497 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:42.518470 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-rktb8" event={"ID":"61e203fc-fb18-4d79-ace2-a4218ed4e9d9","Type":"ContainerStarted","Data":"5df437214d5a41268d04f167b6b63bf3a047bba4e14bc265c1d7dd165c49327b"} Apr 17 14:10:49.531718 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:49.531673 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-55c8fddf87-ppph7" podUID="9d5e0603-e15e-4533-a20e-826510c68901" containerName="console" containerID="cri-o://eb396439109ecc6daa164ad04940eade7492dcb309999987a05d23094e4dbbdc" gracePeriod=15 Apr 17 14:10:49.794555 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:49.794533 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55c8fddf87-ppph7_9d5e0603-e15e-4533-a20e-826510c68901/console/0.log" Apr 17 14:10:49.794661 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:49.794593 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55c8fddf87-ppph7" Apr 17 14:10:49.831924 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:49.831895 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d5e0603-e15e-4533-a20e-826510c68901-console-oauth-config\") pod \"9d5e0603-e15e-4533-a20e-826510c68901\" (UID: \"9d5e0603-e15e-4533-a20e-826510c68901\") " Apr 17 14:10:49.832068 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:49.831956 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgnmw\" (UniqueName: \"kubernetes.io/projected/9d5e0603-e15e-4533-a20e-826510c68901-kube-api-access-rgnmw\") pod \"9d5e0603-e15e-4533-a20e-826510c68901\" (UID: \"9d5e0603-e15e-4533-a20e-826510c68901\") " Apr 17 14:10:49.832068 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:49.831989 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d5e0603-e15e-4533-a20e-826510c68901-trusted-ca-bundle\") pod \"9d5e0603-e15e-4533-a20e-826510c68901\" (UID: \"9d5e0603-e15e-4533-a20e-826510c68901\") " Apr 17 14:10:49.832068 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:49.832015 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d5e0603-e15e-4533-a20e-826510c68901-service-ca\") pod \"9d5e0603-e15e-4533-a20e-826510c68901\" (UID: \"9d5e0603-e15e-4533-a20e-826510c68901\") " Apr 17 14:10:49.832068 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:49.832051 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9d5e0603-e15e-4533-a20e-826510c68901-oauth-serving-cert\") pod \"9d5e0603-e15e-4533-a20e-826510c68901\" (UID: \"9d5e0603-e15e-4533-a20e-826510c68901\") " Apr 17 14:10:49.832277 
ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:49.832119 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d5e0603-e15e-4533-a20e-826510c68901-console-config\") pod \"9d5e0603-e15e-4533-a20e-826510c68901\" (UID: \"9d5e0603-e15e-4533-a20e-826510c68901\") " Apr 17 14:10:49.832277 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:49.832149 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d5e0603-e15e-4533-a20e-826510c68901-console-serving-cert\") pod \"9d5e0603-e15e-4533-a20e-826510c68901\" (UID: \"9d5e0603-e15e-4533-a20e-826510c68901\") " Apr 17 14:10:49.832614 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:49.832578 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d5e0603-e15e-4533-a20e-826510c68901-console-config" (OuterVolumeSpecName: "console-config") pod "9d5e0603-e15e-4533-a20e-826510c68901" (UID: "9d5e0603-e15e-4533-a20e-826510c68901"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:10:49.832614 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:49.832601 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d5e0603-e15e-4533-a20e-826510c68901-service-ca" (OuterVolumeSpecName: "service-ca") pod "9d5e0603-e15e-4533-a20e-826510c68901" (UID: "9d5e0603-e15e-4533-a20e-826510c68901"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:10:49.832778 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:49.832546 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d5e0603-e15e-4533-a20e-826510c68901-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9d5e0603-e15e-4533-a20e-826510c68901" (UID: "9d5e0603-e15e-4533-a20e-826510c68901"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:10:49.832927 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:49.832902 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d5e0603-e15e-4533-a20e-826510c68901-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9d5e0603-e15e-4533-a20e-826510c68901" (UID: "9d5e0603-e15e-4533-a20e-826510c68901"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:10:49.834026 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:49.834001 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d5e0603-e15e-4533-a20e-826510c68901-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9d5e0603-e15e-4533-a20e-826510c68901" (UID: "9d5e0603-e15e-4533-a20e-826510c68901"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:10:49.834285 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:49.834262 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d5e0603-e15e-4533-a20e-826510c68901-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9d5e0603-e15e-4533-a20e-826510c68901" (UID: "9d5e0603-e15e-4533-a20e-826510c68901"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:10:49.834670 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:49.834650 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d5e0603-e15e-4533-a20e-826510c68901-kube-api-access-rgnmw" (OuterVolumeSpecName: "kube-api-access-rgnmw") pod "9d5e0603-e15e-4533-a20e-826510c68901" (UID: "9d5e0603-e15e-4533-a20e-826510c68901"). InnerVolumeSpecName "kube-api-access-rgnmw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:10:49.933237 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:49.933201 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d5e0603-e15e-4533-a20e-826510c68901-console-oauth-config\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\"" Apr 17 14:10:49.933237 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:49.933230 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rgnmw\" (UniqueName: \"kubernetes.io/projected/9d5e0603-e15e-4533-a20e-826510c68901-kube-api-access-rgnmw\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\"" Apr 17 14:10:49.933237 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:49.933241 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d5e0603-e15e-4533-a20e-826510c68901-trusted-ca-bundle\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\"" Apr 17 14:10:49.933469 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:49.933251 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d5e0603-e15e-4533-a20e-826510c68901-service-ca\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\"" Apr 17 14:10:49.933469 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:49.933260 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/9d5e0603-e15e-4533-a20e-826510c68901-oauth-serving-cert\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\"" Apr 17 14:10:49.933469 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:49.933269 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d5e0603-e15e-4533-a20e-826510c68901-console-config\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\"" Apr 17 14:10:49.933469 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:49.933278 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d5e0603-e15e-4533-a20e-826510c68901-console-serving-cert\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\"" Apr 17 14:10:50.542315 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:50.542289 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55c8fddf87-ppph7_9d5e0603-e15e-4533-a20e-826510c68901/console/0.log" Apr 17 14:10:50.542804 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:50.542327 2568 generic.go:358] "Generic (PLEG): container finished" podID="9d5e0603-e15e-4533-a20e-826510c68901" containerID="eb396439109ecc6daa164ad04940eade7492dcb309999987a05d23094e4dbbdc" exitCode=2 Apr 17 14:10:50.542804 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:50.542359 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55c8fddf87-ppph7" event={"ID":"9d5e0603-e15e-4533-a20e-826510c68901","Type":"ContainerDied","Data":"eb396439109ecc6daa164ad04940eade7492dcb309999987a05d23094e4dbbdc"} Apr 17 14:10:50.542804 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:50.542390 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55c8fddf87-ppph7" Apr 17 14:10:50.542804 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:50.542405 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55c8fddf87-ppph7" event={"ID":"9d5e0603-e15e-4533-a20e-826510c68901","Type":"ContainerDied","Data":"0f24a19def597c91ebf47414abcd3e5181430cd9625012c2f5959570d228497e"} Apr 17 14:10:50.542804 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:50.542428 2568 scope.go:117] "RemoveContainer" containerID="eb396439109ecc6daa164ad04940eade7492dcb309999987a05d23094e4dbbdc" Apr 17 14:10:50.551010 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:50.550990 2568 scope.go:117] "RemoveContainer" containerID="eb396439109ecc6daa164ad04940eade7492dcb309999987a05d23094e4dbbdc" Apr 17 14:10:50.551302 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:10:50.551280 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb396439109ecc6daa164ad04940eade7492dcb309999987a05d23094e4dbbdc\": container with ID starting with eb396439109ecc6daa164ad04940eade7492dcb309999987a05d23094e4dbbdc not found: ID does not exist" containerID="eb396439109ecc6daa164ad04940eade7492dcb309999987a05d23094e4dbbdc" Apr 17 14:10:50.551356 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:50.551314 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb396439109ecc6daa164ad04940eade7492dcb309999987a05d23094e4dbbdc"} err="failed to get container status \"eb396439109ecc6daa164ad04940eade7492dcb309999987a05d23094e4dbbdc\": rpc error: code = NotFound desc = could not find container \"eb396439109ecc6daa164ad04940eade7492dcb309999987a05d23094e4dbbdc\": container with ID starting with eb396439109ecc6daa164ad04940eade7492dcb309999987a05d23094e4dbbdc not found: ID does not exist" Apr 17 14:10:50.562636 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:50.562609 2568 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55c8fddf87-ppph7"] Apr 17 14:10:50.564608 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:50.564584 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-55c8fddf87-ppph7"] Apr 17 14:10:50.949349 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:50.949316 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d5e0603-e15e-4533-a20e-826510c68901" path="/var/lib/kubelet/pods/9d5e0603-e15e-4533-a20e-826510c68901/volumes" Apr 17 14:10:54.562033 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:54.561689 2568 generic.go:358] "Generic (PLEG): container finished" podID="07693eac-becc-4d74-9e6b-24a018ef1f41" containerID="a7975e60b1e2c3f02a8e2b2d28e6726cec71df94f194c8ddb786f87d05d66291" exitCode=0 Apr 17 14:10:54.562033 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:54.561780 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67kpd" event={"ID":"07693eac-becc-4d74-9e6b-24a018ef1f41","Type":"ContainerDied","Data":"a7975e60b1e2c3f02a8e2b2d28e6726cec71df94f194c8ddb786f87d05d66291"} Apr 17 14:10:54.562462 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:54.562198 2568 scope.go:117] "RemoveContainer" containerID="a7975e60b1e2c3f02a8e2b2d28e6726cec71df94f194c8ddb786f87d05d66291" Apr 17 14:10:55.566254 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:10:55.566224 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-67kpd" event={"ID":"07693eac-becc-4d74-9e6b-24a018ef1f41","Type":"ContainerStarted","Data":"def436f3fa1c0379e7b6fb3d732a19820334515ef312c085699f38271e77d93d"} Apr 17 14:11:20.091838 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.091806 2568 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 14:11:20.092265 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.092078 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d5e0603-e15e-4533-a20e-826510c68901" containerName="console" Apr 17 14:11:20.092265 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.092088 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d5e0603-e15e-4533-a20e-826510c68901" containerName="console" Apr 17 14:11:20.092265 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.092098 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39bceda9-a654-4e42-97bd-0e49222f2307" containerName="console" Apr 17 14:11:20.092265 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.092104 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="39bceda9-a654-4e42-97bd-0e49222f2307" containerName="console" Apr 17 14:11:20.092265 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.092156 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="39bceda9-a654-4e42-97bd-0e49222f2307" containerName="console" Apr 17 14:11:20.092265 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.092165 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d5e0603-e15e-4533-a20e-826510c68901" containerName="console" Apr 17 14:11:20.095576 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.095560 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.097831 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.097809 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-gxxxk\"" Apr 17 14:11:20.098076 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.098048 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 14:11:20.098184 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.098049 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 14:11:20.098184 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.098048 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 14:11:20.098184 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.098116 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 14:11:20.098184 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.098088 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 14:11:20.098379 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.098365 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 14:11:20.098574 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.098558 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 14:11:20.098647 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.098611 2568 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 14:11:20.103031 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.103006 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 14:11:20.104882 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.104860 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 14:11:20.178752 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.178711 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4c8k\" (UniqueName: \"kubernetes.io/projected/1a08c066-6613-4a23-8e09-6aea0328fb2b-kube-api-access-z4c8k\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.178942 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.178762 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1a08c066-6613-4a23-8e09-6aea0328fb2b-config-volume\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.178942 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.178785 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a08c066-6613-4a23-8e09-6aea0328fb2b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.178942 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.178826 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/1a08c066-6613-4a23-8e09-6aea0328fb2b-config-out\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.178942 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.178855 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1a08c066-6613-4a23-8e09-6aea0328fb2b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.178942 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.178901 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1a08c066-6613-4a23-8e09-6aea0328fb2b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.178942 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.178937 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1a08c066-6613-4a23-8e09-6aea0328fb2b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.179189 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.178969 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a08c066-6613-4a23-8e09-6aea0328fb2b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.179189 ip-10-0-138-158 
kubenswrapper[2568]: I0417 14:11:20.179030 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1a08c066-6613-4a23-8e09-6aea0328fb2b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.179189 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.179064 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a08c066-6613-4a23-8e09-6aea0328fb2b-web-config\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.179189 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.179087 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1a08c066-6613-4a23-8e09-6aea0328fb2b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.179189 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.179112 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1a08c066-6613-4a23-8e09-6aea0328fb2b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.179189 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.179176 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/1a08c066-6613-4a23-8e09-6aea0328fb2b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.279818 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.279787 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1a08c066-6613-4a23-8e09-6aea0328fb2b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.279966 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.279848 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a08c066-6613-4a23-8e09-6aea0328fb2b-web-config\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.279966 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.279878 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1a08c066-6613-4a23-8e09-6aea0328fb2b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.279966 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.279911 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1a08c066-6613-4a23-8e09-6aea0328fb2b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.279966 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.279947 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a08c066-6613-4a23-8e09-6aea0328fb2b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.280223 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.279976 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4c8k\" (UniqueName: \"kubernetes.io/projected/1a08c066-6613-4a23-8e09-6aea0328fb2b-kube-api-access-z4c8k\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.280223 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.280016 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1a08c066-6613-4a23-8e09-6aea0328fb2b-config-volume\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.280223 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.280040 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a08c066-6613-4a23-8e09-6aea0328fb2b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.280223 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.280065 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a08c066-6613-4a23-8e09-6aea0328fb2b-config-out\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.280223 ip-10-0-138-158 
kubenswrapper[2568]: I0417 14:11:20.280091 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1a08c066-6613-4a23-8e09-6aea0328fb2b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.280223 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.280119 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1a08c066-6613-4a23-8e09-6aea0328fb2b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.280223 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.280152 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1a08c066-6613-4a23-8e09-6aea0328fb2b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.280223 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.280189 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a08c066-6613-4a23-8e09-6aea0328fb2b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.280631 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.280228 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1a08c066-6613-4a23-8e09-6aea0328fb2b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.280715 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.280690 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a08c066-6613-4a23-8e09-6aea0328fb2b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.283110 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.283076 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1a08c066-6613-4a23-8e09-6aea0328fb2b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.283234 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.283120 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a08c066-6613-4a23-8e09-6aea0328fb2b-web-config\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.283234 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.283201 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1a08c066-6613-4a23-8e09-6aea0328fb2b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.283864 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.283836 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/1a08c066-6613-4a23-8e09-6aea0328fb2b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.283977 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.283911 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1a08c066-6613-4a23-8e09-6aea0328fb2b-config-volume\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.283977 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.283934 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a08c066-6613-4a23-8e09-6aea0328fb2b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.284060 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.284004 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1a08c066-6613-4a23-8e09-6aea0328fb2b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.284250 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.284228 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1a08c066-6613-4a23-8e09-6aea0328fb2b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.284371 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.284357 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a08c066-6613-4a23-8e09-6aea0328fb2b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.285613 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.285596 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a08c066-6613-4a23-8e09-6aea0328fb2b-config-out\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.287342 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.287318 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4c8k\" (UniqueName: \"kubernetes.io/projected/1a08c066-6613-4a23-8e09-6aea0328fb2b-kube-api-access-z4c8k\") pod \"alertmanager-main-0\" (UID: \"1a08c066-6613-4a23-8e09-6aea0328fb2b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.405955 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.405869 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 14:11:20.545688 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.545660 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 14:11:20.547850 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:11:20.547816 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a08c066_6613_4a23_8e09_6aea0328fb2b.slice/crio-fdc97d1c2fd5c3bb861f787de51e9358d187a2a24f390f4fffe84d3c469dce3a WatchSource:0}: Error finding container fdc97d1c2fd5c3bb861f787de51e9358d187a2a24f390f4fffe84d3c469dce3a: Status 404 returned error can't find the container with id fdc97d1c2fd5c3bb861f787de51e9358d187a2a24f390f4fffe84d3c469dce3a Apr 17 14:11:20.639582 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.639537 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a08c066-6613-4a23-8e09-6aea0328fb2b","Type":"ContainerDied","Data":"32252cd38496a37debb7b53dd351e41673bbf95d7f431d1b7017d9789b253888"} Apr 17 14:11:20.639699 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.639492 2568 generic.go:358] "Generic (PLEG): container finished" podID="1a08c066-6613-4a23-8e09-6aea0328fb2b" containerID="32252cd38496a37debb7b53dd351e41673bbf95d7f431d1b7017d9789b253888" exitCode=0 Apr 17 14:11:20.639699 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:20.639623 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a08c066-6613-4a23-8e09-6aea0328fb2b","Type":"ContainerStarted","Data":"fdc97d1c2fd5c3bb861f787de51e9358d187a2a24f390f4fffe84d3c469dce3a"} Apr 17 14:11:21.989126 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:21.989085 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-868dd77dc-jq584"] Apr 17 14:11:21.992995 ip-10-0-138-158 
kubenswrapper[2568]: I0417 14:11:21.992974 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-868dd77dc-jq584" Apr 17 14:11:22.003998 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.003971 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-868dd77dc-jq584"] Apr 17 14:11:22.097713 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.097683 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b5da1956-b0d8-4bdf-a5a5-abb04571309c-console-oauth-config\") pod \"console-868dd77dc-jq584\" (UID: \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\") " pod="openshift-console/console-868dd77dc-jq584" Apr 17 14:11:22.097814 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.097730 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjwz9\" (UniqueName: \"kubernetes.io/projected/b5da1956-b0d8-4bdf-a5a5-abb04571309c-kube-api-access-cjwz9\") pod \"console-868dd77dc-jq584\" (UID: \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\") " pod="openshift-console/console-868dd77dc-jq584" Apr 17 14:11:22.097814 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.097803 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5da1956-b0d8-4bdf-a5a5-abb04571309c-console-serving-cert\") pod \"console-868dd77dc-jq584\" (UID: \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\") " pod="openshift-console/console-868dd77dc-jq584" Apr 17 14:11:22.097900 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.097833 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5da1956-b0d8-4bdf-a5a5-abb04571309c-service-ca\") pod \"console-868dd77dc-jq584\" (UID: 
\"b5da1956-b0d8-4bdf-a5a5-abb04571309c\") " pod="openshift-console/console-868dd77dc-jq584" Apr 17 14:11:22.097900 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.097868 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b5da1956-b0d8-4bdf-a5a5-abb04571309c-oauth-serving-cert\") pod \"console-868dd77dc-jq584\" (UID: \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\") " pod="openshift-console/console-868dd77dc-jq584" Apr 17 14:11:22.097982 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.097964 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b5da1956-b0d8-4bdf-a5a5-abb04571309c-console-config\") pod \"console-868dd77dc-jq584\" (UID: \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\") " pod="openshift-console/console-868dd77dc-jq584" Apr 17 14:11:22.098021 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.097992 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5da1956-b0d8-4bdf-a5a5-abb04571309c-trusted-ca-bundle\") pod \"console-868dd77dc-jq584\" (UID: \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\") " pod="openshift-console/console-868dd77dc-jq584" Apr 17 14:11:22.198757 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.198732 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5da1956-b0d8-4bdf-a5a5-abb04571309c-console-serving-cert\") pod \"console-868dd77dc-jq584\" (UID: \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\") " pod="openshift-console/console-868dd77dc-jq584" Apr 17 14:11:22.198896 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.198767 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/b5da1956-b0d8-4bdf-a5a5-abb04571309c-service-ca\") pod \"console-868dd77dc-jq584\" (UID: \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\") " pod="openshift-console/console-868dd77dc-jq584" Apr 17 14:11:22.198949 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.198891 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b5da1956-b0d8-4bdf-a5a5-abb04571309c-oauth-serving-cert\") pod \"console-868dd77dc-jq584\" (UID: \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\") " pod="openshift-console/console-868dd77dc-jq584" Apr 17 14:11:22.199001 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.198963 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b5da1956-b0d8-4bdf-a5a5-abb04571309c-console-config\") pod \"console-868dd77dc-jq584\" (UID: \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\") " pod="openshift-console/console-868dd77dc-jq584" Apr 17 14:11:22.199001 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.198981 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5da1956-b0d8-4bdf-a5a5-abb04571309c-trusted-ca-bundle\") pod \"console-868dd77dc-jq584\" (UID: \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\") " pod="openshift-console/console-868dd77dc-jq584" Apr 17 14:11:22.199091 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.199040 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b5da1956-b0d8-4bdf-a5a5-abb04571309c-console-oauth-config\") pod \"console-868dd77dc-jq584\" (UID: \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\") " pod="openshift-console/console-868dd77dc-jq584" Apr 17 14:11:22.199091 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.199065 2568 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-cjwz9\" (UniqueName: \"kubernetes.io/projected/b5da1956-b0d8-4bdf-a5a5-abb04571309c-kube-api-access-cjwz9\") pod \"console-868dd77dc-jq584\" (UID: \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\") " pod="openshift-console/console-868dd77dc-jq584" Apr 17 14:11:22.199612 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.199585 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5da1956-b0d8-4bdf-a5a5-abb04571309c-service-ca\") pod \"console-868dd77dc-jq584\" (UID: \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\") " pod="openshift-console/console-868dd77dc-jq584" Apr 17 14:11:22.199612 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.199601 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b5da1956-b0d8-4bdf-a5a5-abb04571309c-oauth-serving-cert\") pod \"console-868dd77dc-jq584\" (UID: \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\") " pod="openshift-console/console-868dd77dc-jq584" Apr 17 14:11:22.199797 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.199608 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b5da1956-b0d8-4bdf-a5a5-abb04571309c-console-config\") pod \"console-868dd77dc-jq584\" (UID: \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\") " pod="openshift-console/console-868dd77dc-jq584" Apr 17 14:11:22.199797 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.199775 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5da1956-b0d8-4bdf-a5a5-abb04571309c-trusted-ca-bundle\") pod \"console-868dd77dc-jq584\" (UID: \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\") " pod="openshift-console/console-868dd77dc-jq584" Apr 17 14:11:22.201293 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.201267 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5da1956-b0d8-4bdf-a5a5-abb04571309c-console-serving-cert\") pod \"console-868dd77dc-jq584\" (UID: \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\") " pod="openshift-console/console-868dd77dc-jq584" Apr 17 14:11:22.201467 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.201448 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b5da1956-b0d8-4bdf-a5a5-abb04571309c-console-oauth-config\") pod \"console-868dd77dc-jq584\" (UID: \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\") " pod="openshift-console/console-868dd77dc-jq584" Apr 17 14:11:22.212602 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.212584 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjwz9\" (UniqueName: \"kubernetes.io/projected/b5da1956-b0d8-4bdf-a5a5-abb04571309c-kube-api-access-cjwz9\") pod \"console-868dd77dc-jq584\" (UID: \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\") " pod="openshift-console/console-868dd77dc-jq584" Apr 17 14:11:22.315034 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.314998 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-868dd77dc-jq584" Apr 17 14:11:22.433359 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.433330 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-868dd77dc-jq584"] Apr 17 14:11:22.435854 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:11:22.435826 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5da1956_b0d8_4bdf_a5a5_abb04571309c.slice/crio-ae9fce0b8315f0c3847c18a7970b6dba9703455692e4b30a05fc80e8d258dc0c WatchSource:0}: Error finding container ae9fce0b8315f0c3847c18a7970b6dba9703455692e4b30a05fc80e8d258dc0c: Status 404 returned error can't find the container with id ae9fce0b8315f0c3847c18a7970b6dba9703455692e4b30a05fc80e8d258dc0c Apr 17 14:11:22.647843 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.647804 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-868dd77dc-jq584" event={"ID":"b5da1956-b0d8-4bdf-a5a5-abb04571309c","Type":"ContainerStarted","Data":"7b2f7de0ebab6232ebbd2a2d3492ee58ac8a135c03cebf5bde492fc55d190ca8"} Apr 17 14:11:22.647843 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.647847 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-868dd77dc-jq584" event={"ID":"b5da1956-b0d8-4bdf-a5a5-abb04571309c","Type":"ContainerStarted","Data":"ae9fce0b8315f0c3847c18a7970b6dba9703455692e4b30a05fc80e8d258dc0c"} Apr 17 14:11:22.650371 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.650346 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a08c066-6613-4a23-8e09-6aea0328fb2b","Type":"ContainerStarted","Data":"85c97c534ad5fe5d3c1a9acf044e8b673f7e9b99c225dc2d424ea75f84bbcf01"} Apr 17 14:11:22.650371 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.650376 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a08c066-6613-4a23-8e09-6aea0328fb2b","Type":"ContainerStarted","Data":"f28ad33d7b19886ab06a0c4c662a5b7de813d20ee0872e09f193051777812036"} Apr 17 14:11:22.650560 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.650385 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a08c066-6613-4a23-8e09-6aea0328fb2b","Type":"ContainerStarted","Data":"ff2b9be4d8e01af3b7210c9a2254009658be5bc58f9bdbe5c7583d0605259063"} Apr 17 14:11:22.650560 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.650395 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a08c066-6613-4a23-8e09-6aea0328fb2b","Type":"ContainerStarted","Data":"c86de1bb39be294de7bc82afceedd58d60bdb7f69f291c5dadb8fbf579cc111f"} Apr 17 14:11:22.650560 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.650406 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a08c066-6613-4a23-8e09-6aea0328fb2b","Type":"ContainerStarted","Data":"bca39d97b9c8a44dfce9dddf7d6239f096ebf9b9ffa2473ce3509e7ce40d29a2"} Apr 17 14:11:22.664160 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:22.664120 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-868dd77dc-jq584" podStartSLOduration=1.6641053179999998 podStartE2EDuration="1.664105318s" podCreationTimestamp="2026-04-17 14:11:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:11:22.662667347 +0000 UTC m=+226.299244739" watchObservedRunningTime="2026-04-17 14:11:22.664105318 +0000 UTC m=+226.300682707" Apr 17 14:11:23.656800 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:23.656769 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"1a08c066-6613-4a23-8e09-6aea0328fb2b","Type":"ContainerStarted","Data":"2b4c88e2ed9397c5cf76e38ff64921701398b01152e22b652cd62c729ef4c2b6"} Apr 17 14:11:23.684141 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:23.684082 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.202217593 podStartE2EDuration="3.684043873s" podCreationTimestamp="2026-04-17 14:11:20 +0000 UTC" firstStartedPulling="2026-04-17 14:11:20.640750634 +0000 UTC m=+224.277328002" lastFinishedPulling="2026-04-17 14:11:23.122576914 +0000 UTC m=+226.759154282" observedRunningTime="2026-04-17 14:11:23.681633022 +0000 UTC m=+227.318210414" watchObservedRunningTime="2026-04-17 14:11:23.684043873 +0000 UTC m=+227.320621264" Apr 17 14:11:32.315751 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:32.315663 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-868dd77dc-jq584" Apr 17 14:11:32.316280 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:32.315767 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-868dd77dc-jq584" Apr 17 14:11:32.320473 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:32.320443 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-868dd77dc-jq584" Apr 17 14:11:32.689570 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:32.689546 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-868dd77dc-jq584" Apr 17 14:11:32.735239 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:32.735212 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56f6f67987-66d9d"] Apr 17 14:11:57.754287 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:57.754201 2568 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-console/console-56f6f67987-66d9d" podUID="00100943-311e-435f-ada2-e95dae3bf92f" containerName="console" containerID="cri-o://105f83d6dc22a4a46b00feb6bdbbed64a17dd65b8d8ac01fb22389c453145124" gracePeriod=15 Apr 17 14:11:57.984099 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:57.984080 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56f6f67987-66d9d_00100943-311e-435f-ada2-e95dae3bf92f/console/0.log" Apr 17 14:11:57.984198 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:57.984138 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56f6f67987-66d9d" Apr 17 14:11:58.090705 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.090634 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/00100943-311e-435f-ada2-e95dae3bf92f-service-ca\") pod \"00100943-311e-435f-ada2-e95dae3bf92f\" (UID: \"00100943-311e-435f-ada2-e95dae3bf92f\") " Apr 17 14:11:58.090705 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.090673 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/00100943-311e-435f-ada2-e95dae3bf92f-console-serving-cert\") pod \"00100943-311e-435f-ada2-e95dae3bf92f\" (UID: \"00100943-311e-435f-ada2-e95dae3bf92f\") " Apr 17 14:11:58.090914 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.090733 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/00100943-311e-435f-ada2-e95dae3bf92f-oauth-serving-cert\") pod \"00100943-311e-435f-ada2-e95dae3bf92f\" (UID: \"00100943-311e-435f-ada2-e95dae3bf92f\") " Apr 17 14:11:58.090914 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.090749 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/00100943-311e-435f-ada2-e95dae3bf92f-trusted-ca-bundle\") pod \"00100943-311e-435f-ada2-e95dae3bf92f\" (UID: \"00100943-311e-435f-ada2-e95dae3bf92f\") " Apr 17 14:11:58.090914 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.090770 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72pwv\" (UniqueName: \"kubernetes.io/projected/00100943-311e-435f-ada2-e95dae3bf92f-kube-api-access-72pwv\") pod \"00100943-311e-435f-ada2-e95dae3bf92f\" (UID: \"00100943-311e-435f-ada2-e95dae3bf92f\") " Apr 17 14:11:58.090914 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.090807 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/00100943-311e-435f-ada2-e95dae3bf92f-console-oauth-config\") pod \"00100943-311e-435f-ada2-e95dae3bf92f\" (UID: \"00100943-311e-435f-ada2-e95dae3bf92f\") " Apr 17 14:11:58.090914 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.090840 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/00100943-311e-435f-ada2-e95dae3bf92f-console-config\") pod \"00100943-311e-435f-ada2-e95dae3bf92f\" (UID: \"00100943-311e-435f-ada2-e95dae3bf92f\") " Apr 17 14:11:58.091153 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.091057 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00100943-311e-435f-ada2-e95dae3bf92f-service-ca" (OuterVolumeSpecName: "service-ca") pod "00100943-311e-435f-ada2-e95dae3bf92f" (UID: "00100943-311e-435f-ada2-e95dae3bf92f"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:11:58.091153 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.091074 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00100943-311e-435f-ada2-e95dae3bf92f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "00100943-311e-435f-ada2-e95dae3bf92f" (UID: "00100943-311e-435f-ada2-e95dae3bf92f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:11:58.091439 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.091405 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00100943-311e-435f-ada2-e95dae3bf92f-console-config" (OuterVolumeSpecName: "console-config") pod "00100943-311e-435f-ada2-e95dae3bf92f" (UID: "00100943-311e-435f-ada2-e95dae3bf92f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:11:58.091439 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.091424 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00100943-311e-435f-ada2-e95dae3bf92f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "00100943-311e-435f-ada2-e95dae3bf92f" (UID: "00100943-311e-435f-ada2-e95dae3bf92f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:11:58.092996 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.092972 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00100943-311e-435f-ada2-e95dae3bf92f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "00100943-311e-435f-ada2-e95dae3bf92f" (UID: "00100943-311e-435f-ada2-e95dae3bf92f"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:11:58.093082 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.092992 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00100943-311e-435f-ada2-e95dae3bf92f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "00100943-311e-435f-ada2-e95dae3bf92f" (UID: "00100943-311e-435f-ada2-e95dae3bf92f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:11:58.093082 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.093039 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00100943-311e-435f-ada2-e95dae3bf92f-kube-api-access-72pwv" (OuterVolumeSpecName: "kube-api-access-72pwv") pod "00100943-311e-435f-ada2-e95dae3bf92f" (UID: "00100943-311e-435f-ada2-e95dae3bf92f"). InnerVolumeSpecName "kube-api-access-72pwv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:11:58.191772 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.191740 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/00100943-311e-435f-ada2-e95dae3bf92f-console-oauth-config\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\"" Apr 17 14:11:58.191772 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.191768 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/00100943-311e-435f-ada2-e95dae3bf92f-console-config\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\"" Apr 17 14:11:58.191772 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.191778 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/00100943-311e-435f-ada2-e95dae3bf92f-service-ca\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\"" Apr 17 14:11:58.191992 ip-10-0-138-158 
kubenswrapper[2568]: I0417 14:11:58.191787 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/00100943-311e-435f-ada2-e95dae3bf92f-console-serving-cert\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\"" Apr 17 14:11:58.191992 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.191796 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/00100943-311e-435f-ada2-e95dae3bf92f-oauth-serving-cert\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\"" Apr 17 14:11:58.191992 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.191804 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00100943-311e-435f-ada2-e95dae3bf92f-trusted-ca-bundle\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\"" Apr 17 14:11:58.191992 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.191812 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-72pwv\" (UniqueName: \"kubernetes.io/projected/00100943-311e-435f-ada2-e95dae3bf92f-kube-api-access-72pwv\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\"" Apr 17 14:11:58.672709 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.672679 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-c2rcv"] Apr 17 14:11:58.672976 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.672964 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="00100943-311e-435f-ada2-e95dae3bf92f" containerName="console" Apr 17 14:11:58.673027 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.672978 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="00100943-311e-435f-ada2-e95dae3bf92f" containerName="console" Apr 17 14:11:58.673067 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.673032 2568 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="00100943-311e-435f-ada2-e95dae3bf92f" containerName="console" Apr 17 14:11:58.677651 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.677632 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c2rcv" Apr 17 14:11:58.679925 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.679895 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 14:11:58.681713 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.681692 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-c2rcv"] Apr 17 14:11:58.757749 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.757729 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56f6f67987-66d9d_00100943-311e-435f-ada2-e95dae3bf92f/console/0.log" Apr 17 14:11:58.758116 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.757762 2568 generic.go:358] "Generic (PLEG): container finished" podID="00100943-311e-435f-ada2-e95dae3bf92f" containerID="105f83d6dc22a4a46b00feb6bdbbed64a17dd65b8d8ac01fb22389c453145124" exitCode=2 Apr 17 14:11:58.758116 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.757829 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56f6f67987-66d9d" event={"ID":"00100943-311e-435f-ada2-e95dae3bf92f","Type":"ContainerDied","Data":"105f83d6dc22a4a46b00feb6bdbbed64a17dd65b8d8ac01fb22389c453145124"} Apr 17 14:11:58.758116 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.757835 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56f6f67987-66d9d" Apr 17 14:11:58.758116 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.757851 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56f6f67987-66d9d" event={"ID":"00100943-311e-435f-ada2-e95dae3bf92f","Type":"ContainerDied","Data":"b4feae1d3943d7e570033aeeca0d9591a114f8aaaf3aaad7367806fea122284d"} Apr 17 14:11:58.758116 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.757865 2568 scope.go:117] "RemoveContainer" containerID="105f83d6dc22a4a46b00feb6bdbbed64a17dd65b8d8ac01fb22389c453145124" Apr 17 14:11:58.765478 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.765454 2568 scope.go:117] "RemoveContainer" containerID="105f83d6dc22a4a46b00feb6bdbbed64a17dd65b8d8ac01fb22389c453145124" Apr 17 14:11:58.765774 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:11:58.765756 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"105f83d6dc22a4a46b00feb6bdbbed64a17dd65b8d8ac01fb22389c453145124\": container with ID starting with 105f83d6dc22a4a46b00feb6bdbbed64a17dd65b8d8ac01fb22389c453145124 not found: ID does not exist" containerID="105f83d6dc22a4a46b00feb6bdbbed64a17dd65b8d8ac01fb22389c453145124" Apr 17 14:11:58.765838 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.765782 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"105f83d6dc22a4a46b00feb6bdbbed64a17dd65b8d8ac01fb22389c453145124"} err="failed to get container status \"105f83d6dc22a4a46b00feb6bdbbed64a17dd65b8d8ac01fb22389c453145124\": rpc error: code = NotFound desc = could not find container \"105f83d6dc22a4a46b00feb6bdbbed64a17dd65b8d8ac01fb22389c453145124\": container with ID starting with 105f83d6dc22a4a46b00feb6bdbbed64a17dd65b8d8ac01fb22389c453145124 not found: ID does not exist" Apr 17 14:11:58.795542 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.795289 2568 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56f6f67987-66d9d"] Apr 17 14:11:58.798429 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.798397 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-56f6f67987-66d9d"] Apr 17 14:11:58.799186 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.799160 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ff15f2af-05a6-4805-9ff7-0b8f823a3c0a-kubelet-config\") pod \"global-pull-secret-syncer-c2rcv\" (UID: \"ff15f2af-05a6-4805-9ff7-0b8f823a3c0a\") " pod="kube-system/global-pull-secret-syncer-c2rcv" Apr 17 14:11:58.799421 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.799402 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ff15f2af-05a6-4805-9ff7-0b8f823a3c0a-dbus\") pod \"global-pull-secret-syncer-c2rcv\" (UID: \"ff15f2af-05a6-4805-9ff7-0b8f823a3c0a\") " pod="kube-system/global-pull-secret-syncer-c2rcv" Apr 17 14:11:58.799546 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.799531 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ff15f2af-05a6-4805-9ff7-0b8f823a3c0a-original-pull-secret\") pod \"global-pull-secret-syncer-c2rcv\" (UID: \"ff15f2af-05a6-4805-9ff7-0b8f823a3c0a\") " pod="kube-system/global-pull-secret-syncer-c2rcv" Apr 17 14:11:58.900123 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.900087 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ff15f2af-05a6-4805-9ff7-0b8f823a3c0a-kubelet-config\") pod \"global-pull-secret-syncer-c2rcv\" (UID: \"ff15f2af-05a6-4805-9ff7-0b8f823a3c0a\") " pod="kube-system/global-pull-secret-syncer-c2rcv" 
Apr 17 14:11:58.900283 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.900163 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ff15f2af-05a6-4805-9ff7-0b8f823a3c0a-dbus\") pod \"global-pull-secret-syncer-c2rcv\" (UID: \"ff15f2af-05a6-4805-9ff7-0b8f823a3c0a\") " pod="kube-system/global-pull-secret-syncer-c2rcv" Apr 17 14:11:58.900283 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.900185 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ff15f2af-05a6-4805-9ff7-0b8f823a3c0a-original-pull-secret\") pod \"global-pull-secret-syncer-c2rcv\" (UID: \"ff15f2af-05a6-4805-9ff7-0b8f823a3c0a\") " pod="kube-system/global-pull-secret-syncer-c2rcv" Apr 17 14:11:58.900283 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.900218 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ff15f2af-05a6-4805-9ff7-0b8f823a3c0a-kubelet-config\") pod \"global-pull-secret-syncer-c2rcv\" (UID: \"ff15f2af-05a6-4805-9ff7-0b8f823a3c0a\") " pod="kube-system/global-pull-secret-syncer-c2rcv" Apr 17 14:11:58.900392 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.900331 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ff15f2af-05a6-4805-9ff7-0b8f823a3c0a-dbus\") pod \"global-pull-secret-syncer-c2rcv\" (UID: \"ff15f2af-05a6-4805-9ff7-0b8f823a3c0a\") " pod="kube-system/global-pull-secret-syncer-c2rcv" Apr 17 14:11:58.902338 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.902318 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ff15f2af-05a6-4805-9ff7-0b8f823a3c0a-original-pull-secret\") pod \"global-pull-secret-syncer-c2rcv\" (UID: \"ff15f2af-05a6-4805-9ff7-0b8f823a3c0a\") " 
pod="kube-system/global-pull-secret-syncer-c2rcv" Apr 17 14:11:58.949371 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.949303 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00100943-311e-435f-ada2-e95dae3bf92f" path="/var/lib/kubelet/pods/00100943-311e-435f-ada2-e95dae3bf92f/volumes" Apr 17 14:11:58.987412 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:58.987388 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-c2rcv" Apr 17 14:11:59.101783 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:59.101757 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-c2rcv"] Apr 17 14:11:59.104295 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:11:59.104268 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff15f2af_05a6_4805_9ff7_0b8f823a3c0a.slice/crio-2d75c162b3d38d5915372b6dace3d55aab82ffb125b289069aff9f4940d285ad WatchSource:0}: Error finding container 2d75c162b3d38d5915372b6dace3d55aab82ffb125b289069aff9f4940d285ad: Status 404 returned error can't find the container with id 2d75c162b3d38d5915372b6dace3d55aab82ffb125b289069aff9f4940d285ad Apr 17 14:11:59.762912 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:11:59.762876 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-c2rcv" event={"ID":"ff15f2af-05a6-4805-9ff7-0b8f823a3c0a","Type":"ContainerStarted","Data":"2d75c162b3d38d5915372b6dace3d55aab82ffb125b289069aff9f4940d285ad"} Apr 17 14:12:03.775723 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:03.775684 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-c2rcv" event={"ID":"ff15f2af-05a6-4805-9ff7-0b8f823a3c0a","Type":"ContainerStarted","Data":"f308a354c841ffb1f79f6945d6bae3ed7ff4efb91d2b2226567eac56a161580a"} Apr 17 14:12:03.790057 ip-10-0-138-158 
kubenswrapper[2568]: I0417 14:12:03.790011 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-c2rcv" podStartSLOduration=1.5929664730000002 podStartE2EDuration="5.789997267s" podCreationTimestamp="2026-04-17 14:11:58 +0000 UTC" firstStartedPulling="2026-04-17 14:11:59.105836883 +0000 UTC m=+262.742414251" lastFinishedPulling="2026-04-17 14:12:03.302867669 +0000 UTC m=+266.939445045" observedRunningTime="2026-04-17 14:12:03.788643144 +0000 UTC m=+267.425220533" watchObservedRunningTime="2026-04-17 14:12:03.789997267 +0000 UTC m=+267.426574659" Apr 17 14:12:29.865055 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:29.865020 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74"] Apr 17 14:12:29.870850 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:29.870830 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74" Apr 17 14:12:29.873807 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:29.873783 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-hq8kl\"" Apr 17 14:12:29.873920 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:29.873789 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 14:12:29.873920 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:29.873825 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 14:12:29.874539 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:29.874518 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74"] Apr 17 14:12:29.959565 
ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:29.959530 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/177a1231-799a-4238-9531-e6de3762d007-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74\" (UID: \"177a1231-799a-4238-9531-e6de3762d007\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74" Apr 17 14:12:29.959723 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:29.959592 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/177a1231-799a-4238-9531-e6de3762d007-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74\" (UID: \"177a1231-799a-4238-9531-e6de3762d007\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74" Apr 17 14:12:29.959723 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:29.959636 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mzmn\" (UniqueName: \"kubernetes.io/projected/177a1231-799a-4238-9531-e6de3762d007-kube-api-access-2mzmn\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74\" (UID: \"177a1231-799a-4238-9531-e6de3762d007\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74" Apr 17 14:12:30.060328 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:30.060295 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/177a1231-799a-4238-9531-e6de3762d007-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74\" (UID: \"177a1231-799a-4238-9531-e6de3762d007\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74" Apr 17 14:12:30.060438 ip-10-0-138-158 
kubenswrapper[2568]: I0417 14:12:30.060341 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/177a1231-799a-4238-9531-e6de3762d007-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74\" (UID: \"177a1231-799a-4238-9531-e6de3762d007\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74" Apr 17 14:12:30.060438 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:30.060374 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mzmn\" (UniqueName: \"kubernetes.io/projected/177a1231-799a-4238-9531-e6de3762d007-kube-api-access-2mzmn\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74\" (UID: \"177a1231-799a-4238-9531-e6de3762d007\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74" Apr 17 14:12:30.060732 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:30.060713 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/177a1231-799a-4238-9531-e6de3762d007-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74\" (UID: \"177a1231-799a-4238-9531-e6de3762d007\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74" Apr 17 14:12:30.060779 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:30.060726 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/177a1231-799a-4238-9531-e6de3762d007-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74\" (UID: \"177a1231-799a-4238-9531-e6de3762d007\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74" Apr 17 14:12:30.068294 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:30.068266 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2mzmn\" (UniqueName: \"kubernetes.io/projected/177a1231-799a-4238-9531-e6de3762d007-kube-api-access-2mzmn\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74\" (UID: \"177a1231-799a-4238-9531-e6de3762d007\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74" Apr 17 14:12:30.181431 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:30.181399 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74" Apr 17 14:12:30.300395 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:30.300367 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74"] Apr 17 14:12:30.304409 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:12:30.304377 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod177a1231_799a_4238_9531_e6de3762d007.slice/crio-53e05f2af1257ebb171e2d7eb61913d45a41f30b9580bcb58acb5fe566d6d3fa WatchSource:0}: Error finding container 53e05f2af1257ebb171e2d7eb61913d45a41f30b9580bcb58acb5fe566d6d3fa: Status 404 returned error can't find the container with id 53e05f2af1257ebb171e2d7eb61913d45a41f30b9580bcb58acb5fe566d6d3fa Apr 17 14:12:30.849309 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:30.849273 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74" event={"ID":"177a1231-799a-4238-9531-e6de3762d007","Type":"ContainerStarted","Data":"53e05f2af1257ebb171e2d7eb61913d45a41f30b9580bcb58acb5fe566d6d3fa"} Apr 17 14:12:37.113389 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:37.113333 2568 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 14:12:37.872621 ip-10-0-138-158 
kubenswrapper[2568]: I0417 14:12:37.872586 2568 generic.go:358] "Generic (PLEG): container finished" podID="177a1231-799a-4238-9531-e6de3762d007" containerID="41270b86e57a0b48f2a8a9cec44751f09e5755120ae422b21580f37ae7a0dc72" exitCode=0 Apr 17 14:12:37.872787 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:37.872674 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74" event={"ID":"177a1231-799a-4238-9531-e6de3762d007","Type":"ContainerDied","Data":"41270b86e57a0b48f2a8a9cec44751f09e5755120ae422b21580f37ae7a0dc72"} Apr 17 14:12:40.884004 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:40.883968 2568 generic.go:358] "Generic (PLEG): container finished" podID="177a1231-799a-4238-9531-e6de3762d007" containerID="dc9696874d6cf22dbee6be9ab321e953c36d9f4a7d4d107253bffe4c2c8e3295" exitCode=0 Apr 17 14:12:40.884392 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:40.884053 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74" event={"ID":"177a1231-799a-4238-9531-e6de3762d007","Type":"ContainerDied","Data":"dc9696874d6cf22dbee6be9ab321e953c36d9f4a7d4d107253bffe4c2c8e3295"} Apr 17 14:12:40.885004 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:40.884987 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 14:12:50.914350 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:50.914317 2568 generic.go:358] "Generic (PLEG): container finished" podID="177a1231-799a-4238-9531-e6de3762d007" containerID="7b64ace77ac4a3a608c81efd31908c472f7912d73ee032842429c61cd694db6a" exitCode=0 Apr 17 14:12:50.914717 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:50.914358 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74" 
event={"ID":"177a1231-799a-4238-9531-e6de3762d007","Type":"ContainerDied","Data":"7b64ace77ac4a3a608c81efd31908c472f7912d73ee032842429c61cd694db6a"} Apr 17 14:12:52.048280 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:52.048256 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74" Apr 17 14:12:52.156570 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:52.156501 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/177a1231-799a-4238-9531-e6de3762d007-bundle\") pod \"177a1231-799a-4238-9531-e6de3762d007\" (UID: \"177a1231-799a-4238-9531-e6de3762d007\") " Apr 17 14:12:52.156745 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:52.156632 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/177a1231-799a-4238-9531-e6de3762d007-util\") pod \"177a1231-799a-4238-9531-e6de3762d007\" (UID: \"177a1231-799a-4238-9531-e6de3762d007\") " Apr 17 14:12:52.156745 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:52.156688 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mzmn\" (UniqueName: \"kubernetes.io/projected/177a1231-799a-4238-9531-e6de3762d007-kube-api-access-2mzmn\") pod \"177a1231-799a-4238-9531-e6de3762d007\" (UID: \"177a1231-799a-4238-9531-e6de3762d007\") " Apr 17 14:12:52.157161 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:52.157135 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/177a1231-799a-4238-9531-e6de3762d007-bundle" (OuterVolumeSpecName: "bundle") pod "177a1231-799a-4238-9531-e6de3762d007" (UID: "177a1231-799a-4238-9531-e6de3762d007"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:12:52.158914 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:52.158884 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/177a1231-799a-4238-9531-e6de3762d007-kube-api-access-2mzmn" (OuterVolumeSpecName: "kube-api-access-2mzmn") pod "177a1231-799a-4238-9531-e6de3762d007" (UID: "177a1231-799a-4238-9531-e6de3762d007"). InnerVolumeSpecName "kube-api-access-2mzmn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:12:52.161461 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:52.161437 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/177a1231-799a-4238-9531-e6de3762d007-util" (OuterVolumeSpecName: "util") pod "177a1231-799a-4238-9531-e6de3762d007" (UID: "177a1231-799a-4238-9531-e6de3762d007"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:12:52.257669 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:52.257581 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/177a1231-799a-4238-9531-e6de3762d007-util\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\"" Apr 17 14:12:52.257669 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:52.257625 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2mzmn\" (UniqueName: \"kubernetes.io/projected/177a1231-799a-4238-9531-e6de3762d007-kube-api-access-2mzmn\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\"" Apr 17 14:12:52.257669 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:52.257638 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/177a1231-799a-4238-9531-e6de3762d007-bundle\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\"" Apr 17 14:12:52.920830 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:52.920790 2568 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74" event={"ID":"177a1231-799a-4238-9531-e6de3762d007","Type":"ContainerDied","Data":"53e05f2af1257ebb171e2d7eb61913d45a41f30b9580bcb58acb5fe566d6d3fa"} Apr 17 14:12:52.920830 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:52.920828 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53e05f2af1257ebb171e2d7eb61913d45a41f30b9580bcb58acb5fe566d6d3fa" Apr 17 14:12:52.920830 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:52.920807 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59gz74" Apr 17 14:12:57.010244 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:57.010207 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-mp6h9"] Apr 17 14:12:57.010622 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:57.010543 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="177a1231-799a-4238-9531-e6de3762d007" containerName="pull" Apr 17 14:12:57.010622 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:57.010556 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="177a1231-799a-4238-9531-e6de3762d007" containerName="pull" Apr 17 14:12:57.010622 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:57.010572 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="177a1231-799a-4238-9531-e6de3762d007" containerName="util" Apr 17 14:12:57.010622 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:57.010577 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="177a1231-799a-4238-9531-e6de3762d007" containerName="util" Apr 17 14:12:57.010622 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:57.010582 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="177a1231-799a-4238-9531-e6de3762d007" containerName="extract" Apr 17 14:12:57.010622 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:57.010587 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="177a1231-799a-4238-9531-e6de3762d007" containerName="extract" Apr 17 14:12:57.010797 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:57.010638 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="177a1231-799a-4238-9531-e6de3762d007" containerName="extract" Apr 17 14:12:57.014583 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:57.014567 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-mp6h9" Apr 17 14:12:57.016587 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:57.016563 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 17 14:12:57.016700 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:57.016616 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-cf529\"" Apr 17 14:12:57.016700 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:57.016674 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:12:57.021770 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:57.021449 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-mp6h9"] Apr 17 14:12:57.098336 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:57.098303 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cn5w\" (UniqueName: \"kubernetes.io/projected/01541732-1fa9-4dd6-9912-d98b5ecdbd66-kube-api-access-2cn5w\") pod 
\"cert-manager-operator-controller-manager-7c5b8bd68-mp6h9\" (UID: \"01541732-1fa9-4dd6-9912-d98b5ecdbd66\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-mp6h9" Apr 17 14:12:57.098497 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:57.098374 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/01541732-1fa9-4dd6-9912-d98b5ecdbd66-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-mp6h9\" (UID: \"01541732-1fa9-4dd6-9912-d98b5ecdbd66\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-mp6h9" Apr 17 14:12:57.198947 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:57.198910 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2cn5w\" (UniqueName: \"kubernetes.io/projected/01541732-1fa9-4dd6-9912-d98b5ecdbd66-kube-api-access-2cn5w\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-mp6h9\" (UID: \"01541732-1fa9-4dd6-9912-d98b5ecdbd66\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-mp6h9" Apr 17 14:12:57.199104 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:57.198989 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/01541732-1fa9-4dd6-9912-d98b5ecdbd66-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-mp6h9\" (UID: \"01541732-1fa9-4dd6-9912-d98b5ecdbd66\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-mp6h9" Apr 17 14:12:57.199326 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:57.199310 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/01541732-1fa9-4dd6-9912-d98b5ecdbd66-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-mp6h9\" (UID: \"01541732-1fa9-4dd6-9912-d98b5ecdbd66\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-mp6h9" Apr 17 14:12:57.206227 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:57.206200 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cn5w\" (UniqueName: \"kubernetes.io/projected/01541732-1fa9-4dd6-9912-d98b5ecdbd66-kube-api-access-2cn5w\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-mp6h9\" (UID: \"01541732-1fa9-4dd6-9912-d98b5ecdbd66\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-mp6h9" Apr 17 14:12:57.325924 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:57.325833 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-mp6h9" Apr 17 14:12:57.445756 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:57.445733 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-mp6h9"] Apr 17 14:12:57.448335 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:12:57.448309 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01541732_1fa9_4dd6_9912_d98b5ecdbd66.slice/crio-1c76bee28ed325858ccac8f0e75249605e9b5103f2964ba44d3be064b23397eb WatchSource:0}: Error finding container 1c76bee28ed325858ccac8f0e75249605e9b5103f2964ba44d3be064b23397eb: Status 404 returned error can't find the container with id 1c76bee28ed325858ccac8f0e75249605e9b5103f2964ba44d3be064b23397eb Apr 17 14:12:57.937244 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:57.937207 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-mp6h9" event={"ID":"01541732-1fa9-4dd6-9912-d98b5ecdbd66","Type":"ContainerStarted","Data":"1c76bee28ed325858ccac8f0e75249605e9b5103f2964ba44d3be064b23397eb"} Apr 17 14:12:59.945972 
ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:59.945934 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-mp6h9" event={"ID":"01541732-1fa9-4dd6-9912-d98b5ecdbd66","Type":"ContainerStarted","Data":"a391e5c4cd15b1ebc38d97c0dd0d34bef0734bce4807c065e9fe1b7a96bc08d0"} Apr 17 14:12:59.965240 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:12:59.965192 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-mp6h9" podStartSLOduration=1.990260602 podStartE2EDuration="3.965179565s" podCreationTimestamp="2026-04-17 14:12:56 +0000 UTC" firstStartedPulling="2026-04-17 14:12:57.451216504 +0000 UTC m=+321.087793873" lastFinishedPulling="2026-04-17 14:12:59.426135467 +0000 UTC m=+323.062712836" observedRunningTime="2026-04-17 14:12:59.964769608 +0000 UTC m=+323.601346999" watchObservedRunningTime="2026-04-17 14:12:59.965179565 +0000 UTC m=+323.601756965" Apr 17 14:13:03.240600 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:03.240561 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-n8jc2"] Apr 17 14:13:03.243786 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:03.243765 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-n8jc2" Apr 17 14:13:03.247265 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:03.247244 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 17 14:13:03.248049 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:03.248025 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-p6t9f\"" Apr 17 14:13:03.248148 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:03.248076 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 17 14:13:03.259336 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:03.259305 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-n8jc2"] Apr 17 14:13:03.348863 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:03.348832 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cgw4\" (UniqueName: \"kubernetes.io/projected/8f437367-668d-4a5d-8af2-2f4029f87b29-kube-api-access-6cgw4\") pod \"cert-manager-webhook-597b96b99b-n8jc2\" (UID: \"8f437367-668d-4a5d-8af2-2f4029f87b29\") " pod="cert-manager/cert-manager-webhook-597b96b99b-n8jc2" Apr 17 14:13:03.349053 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:03.348869 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f437367-668d-4a5d-8af2-2f4029f87b29-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-n8jc2\" (UID: \"8f437367-668d-4a5d-8af2-2f4029f87b29\") " pod="cert-manager/cert-manager-webhook-597b96b99b-n8jc2" Apr 17 14:13:03.449409 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:03.449375 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6cgw4\" (UniqueName: \"kubernetes.io/projected/8f437367-668d-4a5d-8af2-2f4029f87b29-kube-api-access-6cgw4\") pod \"cert-manager-webhook-597b96b99b-n8jc2\" (UID: \"8f437367-668d-4a5d-8af2-2f4029f87b29\") " pod="cert-manager/cert-manager-webhook-597b96b99b-n8jc2" Apr 17 14:13:03.449624 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:03.449421 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f437367-668d-4a5d-8af2-2f4029f87b29-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-n8jc2\" (UID: \"8f437367-668d-4a5d-8af2-2f4029f87b29\") " pod="cert-manager/cert-manager-webhook-597b96b99b-n8jc2" Apr 17 14:13:03.456766 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:03.456733 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f437367-668d-4a5d-8af2-2f4029f87b29-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-n8jc2\" (UID: \"8f437367-668d-4a5d-8af2-2f4029f87b29\") " pod="cert-manager/cert-manager-webhook-597b96b99b-n8jc2" Apr 17 14:13:03.456892 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:03.456863 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cgw4\" (UniqueName: \"kubernetes.io/projected/8f437367-668d-4a5d-8af2-2f4029f87b29-kube-api-access-6cgw4\") pod \"cert-manager-webhook-597b96b99b-n8jc2\" (UID: \"8f437367-668d-4a5d-8af2-2f4029f87b29\") " pod="cert-manager/cert-manager-webhook-597b96b99b-n8jc2" Apr 17 14:13:03.563707 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:03.563618 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-n8jc2" Apr 17 14:13:03.682346 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:03.682322 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-n8jc2"] Apr 17 14:13:03.684693 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:13:03.684666 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f437367_668d_4a5d_8af2_2f4029f87b29.slice/crio-d0be52dd86f104708248bbe19322c1e1177cf3ff8401cdb1a8cf44b0168ac1a5 WatchSource:0}: Error finding container d0be52dd86f104708248bbe19322c1e1177cf3ff8401cdb1a8cf44b0168ac1a5: Status 404 returned error can't find the container with id d0be52dd86f104708248bbe19322c1e1177cf3ff8401cdb1a8cf44b0168ac1a5 Apr 17 14:13:03.959576 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:03.959537 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-n8jc2" event={"ID":"8f437367-668d-4a5d-8af2-2f4029f87b29","Type":"ContainerStarted","Data":"d0be52dd86f104708248bbe19322c1e1177cf3ff8401cdb1a8cf44b0168ac1a5"} Apr 17 14:13:06.973249 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:06.973217 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-n8jc2" event={"ID":"8f437367-668d-4a5d-8af2-2f4029f87b29","Type":"ContainerStarted","Data":"6cc96e768c60d5d61941fbd88f131fd7004d679b67aac20a4fad0aa730efb326"} Apr 17 14:13:06.973736 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:06.973270 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-n8jc2" Apr 17 14:13:06.987743 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:06.987697 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-n8jc2" podStartSLOduration=1.253585154 
podStartE2EDuration="3.987683117s" podCreationTimestamp="2026-04-17 14:13:03 +0000 UTC" firstStartedPulling="2026-04-17 14:13:03.686395956 +0000 UTC m=+327.322973324" lastFinishedPulling="2026-04-17 14:13:06.420493915 +0000 UTC m=+330.057071287" observedRunningTime="2026-04-17 14:13:06.986591647 +0000 UTC m=+330.623169037" watchObservedRunningTime="2026-04-17 14:13:06.987683117 +0000 UTC m=+330.624260507" Apr 17 14:13:12.979423 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:12.979394 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-n8jc2" Apr 17 14:13:18.940693 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:18.940612 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56"] Apr 17 14:13:18.944095 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:18.944077 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56" Apr 17 14:13:18.946222 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:18.946198 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 14:13:18.946343 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:18.946303 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 14:13:18.947019 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:18.946999 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-hq8kl\"" Apr 17 14:13:18.950202 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:18.950177 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56"] Apr 17 14:13:18.979624 
ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:18.979587 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1300c4ee-36bc-41f8-af61-945391f28929-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56\" (UID: \"1300c4ee-36bc-41f8-af61-945391f28929\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56" Apr 17 14:13:18.979769 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:18.979649 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1300c4ee-36bc-41f8-af61-945391f28929-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56\" (UID: \"1300c4ee-36bc-41f8-af61-945391f28929\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56" Apr 17 14:13:18.979769 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:18.979684 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-982sv\" (UniqueName: \"kubernetes.io/projected/1300c4ee-36bc-41f8-af61-945391f28929-kube-api-access-982sv\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56\" (UID: \"1300c4ee-36bc-41f8-af61-945391f28929\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56" Apr 17 14:13:19.080657 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:19.080619 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1300c4ee-36bc-41f8-af61-945391f28929-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56\" (UID: \"1300c4ee-36bc-41f8-af61-945391f28929\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56" Apr 17 14:13:19.080844 ip-10-0-138-158 
kubenswrapper[2568]: I0417 14:13:19.080699 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1300c4ee-36bc-41f8-af61-945391f28929-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56\" (UID: \"1300c4ee-36bc-41f8-af61-945391f28929\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56" Apr 17 14:13:19.080844 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:19.080753 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-982sv\" (UniqueName: \"kubernetes.io/projected/1300c4ee-36bc-41f8-af61-945391f28929-kube-api-access-982sv\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56\" (UID: \"1300c4ee-36bc-41f8-af61-945391f28929\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56" Apr 17 14:13:19.081038 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:19.081019 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1300c4ee-36bc-41f8-af61-945391f28929-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56\" (UID: \"1300c4ee-36bc-41f8-af61-945391f28929\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56" Apr 17 14:13:19.081105 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:19.081070 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1300c4ee-36bc-41f8-af61-945391f28929-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56\" (UID: \"1300c4ee-36bc-41f8-af61-945391f28929\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56" Apr 17 14:13:19.088655 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:19.088623 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-982sv\" (UniqueName: \"kubernetes.io/projected/1300c4ee-36bc-41f8-af61-945391f28929-kube-api-access-982sv\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56\" (UID: \"1300c4ee-36bc-41f8-af61-945391f28929\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56" Apr 17 14:13:19.254906 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:19.254814 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56" Apr 17 14:13:19.370470 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:19.370368 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56"] Apr 17 14:13:19.373756 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:13:19.373712 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1300c4ee_36bc_41f8_af61_945391f28929.slice/crio-87f58f6c31d102748e605fb77abbebcb4d39b7366bc6676f9a73c3fa9de951df WatchSource:0}: Error finding container 87f58f6c31d102748e605fb77abbebcb4d39b7366bc6676f9a73c3fa9de951df: Status 404 returned error can't find the container with id 87f58f6c31d102748e605fb77abbebcb4d39b7366bc6676f9a73c3fa9de951df Apr 17 14:13:20.014457 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:20.014423 2568 generic.go:358] "Generic (PLEG): container finished" podID="1300c4ee-36bc-41f8-af61-945391f28929" containerID="10c183ef1c4ed90dd6c707aff6721d5f7bd0dbf6119d22d4b54ade3b5e696cdc" exitCode=0 Apr 17 14:13:20.014873 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:20.014475 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56" 
event={"ID":"1300c4ee-36bc-41f8-af61-945391f28929","Type":"ContainerDied","Data":"10c183ef1c4ed90dd6c707aff6721d5f7bd0dbf6119d22d4b54ade3b5e696cdc"} Apr 17 14:13:20.014873 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:20.014496 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56" event={"ID":"1300c4ee-36bc-41f8-af61-945391f28929","Type":"ContainerStarted","Data":"87f58f6c31d102748e605fb77abbebcb4d39b7366bc6676f9a73c3fa9de951df"} Apr 17 14:13:23.025614 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:23.025579 2568 generic.go:358] "Generic (PLEG): container finished" podID="1300c4ee-36bc-41f8-af61-945391f28929" containerID="1a64075154d942f071055f4eca265b99ab7c07cc48774fd58d8b8d38dc738f96" exitCode=0 Apr 17 14:13:23.025996 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:23.025664 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56" event={"ID":"1300c4ee-36bc-41f8-af61-945391f28929","Type":"ContainerDied","Data":"1a64075154d942f071055f4eca265b99ab7c07cc48774fd58d8b8d38dc738f96"} Apr 17 14:13:24.031156 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:24.031123 2568 generic.go:358] "Generic (PLEG): container finished" podID="1300c4ee-36bc-41f8-af61-945391f28929" containerID="1ef5816509f01befbae10896667909611a4aa239a93066cb5761f0f6c18b07cd" exitCode=0 Apr 17 14:13:24.031551 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:24.031214 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56" event={"ID":"1300c4ee-36bc-41f8-af61-945391f28929","Type":"ContainerDied","Data":"1ef5816509f01befbae10896667909611a4aa239a93066cb5761f0f6c18b07cd"} Apr 17 14:13:25.154317 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:25.154293 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56" Apr 17 14:13:25.231145 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:25.231111 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1300c4ee-36bc-41f8-af61-945391f28929-bundle\") pod \"1300c4ee-36bc-41f8-af61-945391f28929\" (UID: \"1300c4ee-36bc-41f8-af61-945391f28929\") " Apr 17 14:13:25.231145 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:25.231147 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1300c4ee-36bc-41f8-af61-945391f28929-util\") pod \"1300c4ee-36bc-41f8-af61-945391f28929\" (UID: \"1300c4ee-36bc-41f8-af61-945391f28929\") " Apr 17 14:13:25.231363 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:25.231165 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-982sv\" (UniqueName: \"kubernetes.io/projected/1300c4ee-36bc-41f8-af61-945391f28929-kube-api-access-982sv\") pod \"1300c4ee-36bc-41f8-af61-945391f28929\" (UID: \"1300c4ee-36bc-41f8-af61-945391f28929\") " Apr 17 14:13:25.231500 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:25.231477 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1300c4ee-36bc-41f8-af61-945391f28929-bundle" (OuterVolumeSpecName: "bundle") pod "1300c4ee-36bc-41f8-af61-945391f28929" (UID: "1300c4ee-36bc-41f8-af61-945391f28929"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:13:25.233242 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:25.233222 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1300c4ee-36bc-41f8-af61-945391f28929-kube-api-access-982sv" (OuterVolumeSpecName: "kube-api-access-982sv") pod "1300c4ee-36bc-41f8-af61-945391f28929" (UID: "1300c4ee-36bc-41f8-af61-945391f28929"). InnerVolumeSpecName "kube-api-access-982sv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:13:25.332635 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:25.332561 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1300c4ee-36bc-41f8-af61-945391f28929-bundle\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\"" Apr 17 14:13:25.332635 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:25.332589 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-982sv\" (UniqueName: \"kubernetes.io/projected/1300c4ee-36bc-41f8-af61-945391f28929-kube-api-access-982sv\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\"" Apr 17 14:13:25.700323 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:25.700272 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1300c4ee-36bc-41f8-af61-945391f28929-util" (OuterVolumeSpecName: "util") pod "1300c4ee-36bc-41f8-af61-945391f28929" (UID: "1300c4ee-36bc-41f8-af61-945391f28929"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:13:25.734776 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:25.734743 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1300c4ee-36bc-41f8-af61-945391f28929-util\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\"" Apr 17 14:13:26.038613 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:26.038530 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56" event={"ID":"1300c4ee-36bc-41f8-af61-945391f28929","Type":"ContainerDied","Data":"87f58f6c31d102748e605fb77abbebcb4d39b7366bc6676f9a73c3fa9de951df"} Apr 17 14:13:26.038613 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:26.038569 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87f58f6c31d102748e605fb77abbebcb4d39b7366bc6676f9a73c3fa9de951df" Apr 17 14:13:26.038613 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:26.038590 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ekxd56" Apr 17 14:13:31.678574 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:31.678536 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-jj2xp"] Apr 17 14:13:31.678964 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:31.678848 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1300c4ee-36bc-41f8-af61-945391f28929" containerName="extract" Apr 17 14:13:31.678964 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:31.678859 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="1300c4ee-36bc-41f8-af61-945391f28929" containerName="extract" Apr 17 14:13:31.678964 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:31.678870 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1300c4ee-36bc-41f8-af61-945391f28929" containerName="util" Apr 17 14:13:31.678964 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:31.678875 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="1300c4ee-36bc-41f8-af61-945391f28929" containerName="util" Apr 17 14:13:31.678964 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:31.678883 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1300c4ee-36bc-41f8-af61-945391f28929" containerName="pull" Apr 17 14:13:31.678964 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:31.678889 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="1300c4ee-36bc-41f8-af61-945391f28929" containerName="pull" Apr 17 14:13:31.678964 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:31.678933 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="1300c4ee-36bc-41f8-af61-945391f28929" containerName="extract" Apr 17 14:13:31.685609 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:31.685563 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-jj2xp" Apr 17 14:13:31.687763 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:31.687741 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:13:31.687903 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:31.687815 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\"" Apr 17 14:13:31.688291 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:31.688272 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-jj2xp"] Apr 17 14:13:31.688543 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:31.688490 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-operator-dockercfg-x7mr4\"" Apr 17 14:13:31.786790 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:31.786754 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6vzx\" (UniqueName: \"kubernetes.io/projected/4cd9e279-9b6d-42f8-b0f6-7d9b814049d0-kube-api-access-h6vzx\") pod \"jobset-operator-747c5859c7-jj2xp\" (UID: \"4cd9e279-9b6d-42f8-b0f6-7d9b814049d0\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-jj2xp" Apr 17 14:13:31.786973 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:31.786799 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4cd9e279-9b6d-42f8-b0f6-7d9b814049d0-tmp\") pod \"jobset-operator-747c5859c7-jj2xp\" (UID: \"4cd9e279-9b6d-42f8-b0f6-7d9b814049d0\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-jj2xp" Apr 17 14:13:31.887929 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:31.887894 2568 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-h6vzx\" (UniqueName: \"kubernetes.io/projected/4cd9e279-9b6d-42f8-b0f6-7d9b814049d0-kube-api-access-h6vzx\") pod \"jobset-operator-747c5859c7-jj2xp\" (UID: \"4cd9e279-9b6d-42f8-b0f6-7d9b814049d0\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-jj2xp" Apr 17 14:13:31.888114 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:31.887939 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4cd9e279-9b6d-42f8-b0f6-7d9b814049d0-tmp\") pod \"jobset-operator-747c5859c7-jj2xp\" (UID: \"4cd9e279-9b6d-42f8-b0f6-7d9b814049d0\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-jj2xp" Apr 17 14:13:31.888288 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:31.888271 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4cd9e279-9b6d-42f8-b0f6-7d9b814049d0-tmp\") pod \"jobset-operator-747c5859c7-jj2xp\" (UID: \"4cd9e279-9b6d-42f8-b0f6-7d9b814049d0\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-jj2xp" Apr 17 14:13:31.895595 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:31.895568 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6vzx\" (UniqueName: \"kubernetes.io/projected/4cd9e279-9b6d-42f8-b0f6-7d9b814049d0-kube-api-access-h6vzx\") pod \"jobset-operator-747c5859c7-jj2xp\" (UID: \"4cd9e279-9b6d-42f8-b0f6-7d9b814049d0\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-jj2xp" Apr 17 14:13:31.995313 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:31.995227 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-jj2xp" Apr 17 14:13:32.114411 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:32.114388 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-jj2xp"] Apr 17 14:13:32.116943 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:13:32.116918 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cd9e279_9b6d_42f8_b0f6_7d9b814049d0.slice/crio-7bcb94f3dc89d035f7d464deb030f45f5ea75aca5d8baf1f1b805c64c6acf928 WatchSource:0}: Error finding container 7bcb94f3dc89d035f7d464deb030f45f5ea75aca5d8baf1f1b805c64c6acf928: Status 404 returned error can't find the container with id 7bcb94f3dc89d035f7d464deb030f45f5ea75aca5d8baf1f1b805c64c6acf928 Apr 17 14:13:33.063580 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:33.063536 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-jj2xp" event={"ID":"4cd9e279-9b6d-42f8-b0f6-7d9b814049d0","Type":"ContainerStarted","Data":"7bcb94f3dc89d035f7d464deb030f45f5ea75aca5d8baf1f1b805c64c6acf928"} Apr 17 14:13:34.071590 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:34.071497 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-jj2xp" event={"ID":"4cd9e279-9b6d-42f8-b0f6-7d9b814049d0","Type":"ContainerStarted","Data":"b66b48850c5748f739bfa5cc61024dce33597586c52dab0fc383de74cf328334"} Apr 17 14:13:34.086960 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:34.086916 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-jobset-operator/jobset-operator-747c5859c7-jj2xp" podStartSLOduration=1.211881908 podStartE2EDuration="3.086902008s" podCreationTimestamp="2026-04-17 14:13:31 +0000 UTC" firstStartedPulling="2026-04-17 14:13:32.118241177 +0000 UTC m=+355.754818544" 
lastFinishedPulling="2026-04-17 14:13:33.993261276 +0000 UTC m=+357.629838644" observedRunningTime="2026-04-17 14:13:34.08510203 +0000 UTC m=+357.721679423" watchObservedRunningTime="2026-04-17 14:13:34.086902008 +0000 UTC m=+357.723479397" Apr 17 14:13:52.527437 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:52.527398 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-656b757fbd-c69b9"] Apr 17 14:13:52.529672 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:52.529653 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-656b757fbd-c69b9" Apr 17 14:13:52.536844 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:52.533098 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"metrics-server-cert\"" Apr 17 14:13:52.536844 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:52.533376 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"jobset-manager-config\"" Apr 17 14:13:52.536844 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:52.533621 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-controller-manager-dockercfg-sx2vm\"" Apr 17 14:13:52.536844 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:52.533628 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"webhook-server-cert\"" Apr 17 14:13:52.538796 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:52.538770 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-656b757fbd-c69b9"] Apr 17 14:13:52.663543 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:52.663475 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4-metrics-certs\") pod \"jobset-controller-manager-656b757fbd-c69b9\" (UID: \"fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4\") " pod="openshift-jobset-operator/jobset-controller-manager-656b757fbd-c69b9" Apr 17 14:13:52.663714 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:52.663570 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwfn7\" (UniqueName: \"kubernetes.io/projected/fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4-kube-api-access-bwfn7\") pod \"jobset-controller-manager-656b757fbd-c69b9\" (UID: \"fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4\") " pod="openshift-jobset-operator/jobset-controller-manager-656b757fbd-c69b9" Apr 17 14:13:52.663714 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:52.663670 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4-cert\") pod \"jobset-controller-manager-656b757fbd-c69b9\" (UID: \"fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4\") " pod="openshift-jobset-operator/jobset-controller-manager-656b757fbd-c69b9" Apr 17 14:13:52.663714 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:52.663703 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4-manager-config\") pod \"jobset-controller-manager-656b757fbd-c69b9\" (UID: \"fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4\") " pod="openshift-jobset-operator/jobset-controller-manager-656b757fbd-c69b9" Apr 17 14:13:52.764754 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:52.764717 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4-cert\") pod \"jobset-controller-manager-656b757fbd-c69b9\" (UID: 
\"fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4\") " pod="openshift-jobset-operator/jobset-controller-manager-656b757fbd-c69b9" Apr 17 14:13:52.764754 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:52.764761 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4-manager-config\") pod \"jobset-controller-manager-656b757fbd-c69b9\" (UID: \"fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4\") " pod="openshift-jobset-operator/jobset-controller-manager-656b757fbd-c69b9" Apr 17 14:13:52.764973 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:52.764814 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4-metrics-certs\") pod \"jobset-controller-manager-656b757fbd-c69b9\" (UID: \"fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4\") " pod="openshift-jobset-operator/jobset-controller-manager-656b757fbd-c69b9" Apr 17 14:13:52.764973 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:52.764835 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bwfn7\" (UniqueName: \"kubernetes.io/projected/fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4-kube-api-access-bwfn7\") pod \"jobset-controller-manager-656b757fbd-c69b9\" (UID: \"fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4\") " pod="openshift-jobset-operator/jobset-controller-manager-656b757fbd-c69b9" Apr 17 14:13:52.765531 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:52.765485 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4-manager-config\") pod \"jobset-controller-manager-656b757fbd-c69b9\" (UID: \"fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4\") " pod="openshift-jobset-operator/jobset-controller-manager-656b757fbd-c69b9" Apr 17 14:13:52.767409 ip-10-0-138-158 kubenswrapper[2568]: I0417 
14:13:52.767383 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4-cert\") pod \"jobset-controller-manager-656b757fbd-c69b9\" (UID: \"fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4\") " pod="openshift-jobset-operator/jobset-controller-manager-656b757fbd-c69b9" Apr 17 14:13:52.767547 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:52.767437 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4-metrics-certs\") pod \"jobset-controller-manager-656b757fbd-c69b9\" (UID: \"fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4\") " pod="openshift-jobset-operator/jobset-controller-manager-656b757fbd-c69b9" Apr 17 14:13:52.772575 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:52.772553 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwfn7\" (UniqueName: \"kubernetes.io/projected/fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4-kube-api-access-bwfn7\") pod \"jobset-controller-manager-656b757fbd-c69b9\" (UID: \"fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4\") " pod="openshift-jobset-operator/jobset-controller-manager-656b757fbd-c69b9" Apr 17 14:13:52.842111 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:52.842018 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-656b757fbd-c69b9" Apr 17 14:13:52.983103 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:52.983079 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-656b757fbd-c69b9"] Apr 17 14:13:52.985454 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:13:52.985421 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe4d002e_fc5c_4e1e_aca8_f9a225dec8d4.slice/crio-f2f88a84ec530e3777d3130bfcc62232c66ed3b65ae3c724c7750558e9a4c22b WatchSource:0}: Error finding container f2f88a84ec530e3777d3130bfcc62232c66ed3b65ae3c724c7750558e9a4c22b: Status 404 returned error can't find the container with id f2f88a84ec530e3777d3130bfcc62232c66ed3b65ae3c724c7750558e9a4c22b Apr 17 14:13:53.134519 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:13:53.134422 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-656b757fbd-c69b9" event={"ID":"fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4","Type":"ContainerStarted","Data":"f2f88a84ec530e3777d3130bfcc62232c66ed3b65ae3c724c7750558e9a4c22b"} Apr 17 14:14:03.190379 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:14:03.190336 2568 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://registry.redhat.io/job-set/jobset-rhel9@sha256:8a0ce916ed17d4244f97ee967d341532365cbab4b4287639509dee914f50c8a1: reading manifest sha256:8a0ce916ed17d4244f97ee967d341532365cbab4b4287639509dee914f50c8a1 in registry.redhat.io/job-set/jobset-rhel9: received unexpected HTTP status: 504 Gateway Timeout; artifact err: provided artifact is a container image" image="registry.redhat.io/job-set/jobset-rhel9@sha256:8a0ce916ed17d4244f97ee967d341532365cbab4b4287639509dee914f50c8a1" Apr 17 14:14:03.190828 
ip-10-0-138-158 kubenswrapper[2568]: E0417 14:14:03.190615 2568 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:manager,Image:registry.redhat.io/job-set/jobset-rhel9@sha256:8a0ce916ed17d4244f97ee967d341532365cbab4b4287639509dee914f50c8a1,Command:[/manager],Args:[--config=/controller_manager_config.yaml --zap-log-level=info],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:webhook-server,HostPort:0,ContainerPort:9443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{memory: {{4294967296 0} {} 4Gi BinarySI},},Requests:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:metrics-certs,ReadOnly:true,MountPath:/tmp/k8s-metrics-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:manager-config,ReadOnly:false,MountPath:/controller_manager_config.yaml,SubPath:controller_manager_config.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bwfn7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod jobset-controller-manager-656b757fbd-c69b9_openshift-jobset-operator(fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://registry.redhat.io/job-set/jobset-rhel9@sha256:8a0ce916ed17d4244f97ee967d341532365cbab4b4287639509dee914f50c8a1: reading manifest sha256:8a0ce916ed17d4244f97ee967d341532365cbab4b4287639509dee914f50c8a1 in registry.redhat.io/job-set/jobset-rhel9: received unexpected HTTP status: 504 Gateway Timeout; artifact err: provided artifact is a container image" logger="UnhandledError" Apr 17 14:14:03.191827 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:14:03.191798 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://registry.redhat.io/job-set/jobset-rhel9@sha256:8a0ce916ed17d4244f97ee967d341532365cbab4b4287639509dee914f50c8a1: reading manifest sha256:8a0ce916ed17d4244f97ee967d341532365cbab4b4287639509dee914f50c8a1 in registry.redhat.io/job-set/jobset-rhel9: received unexpected HTTP status: 504 Gateway Timeout; artifact 
err: provided artifact is a container image\"" pod="openshift-jobset-operator/jobset-controller-manager-656b757fbd-c69b9" podUID="fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4" Apr 17 14:14:04.176673 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:14:04.176642 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/job-set/jobset-rhel9@sha256:8a0ce916ed17d4244f97ee967d341532365cbab4b4287639509dee914f50c8a1\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://registry.redhat.io/job-set/jobset-rhel9@sha256:8a0ce916ed17d4244f97ee967d341532365cbab4b4287639509dee914f50c8a1: reading manifest sha256:8a0ce916ed17d4244f97ee967d341532365cbab4b4287639509dee914f50c8a1 in registry.redhat.io/job-set/jobset-rhel9: received unexpected HTTP status: 504 Gateway Timeout; artifact err: provided artifact is a container image\"" pod="openshift-jobset-operator/jobset-controller-manager-656b757fbd-c69b9" podUID="fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4" Apr 17 14:14:22.242462 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:14:22.242428 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-656b757fbd-c69b9" event={"ID":"fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4","Type":"ContainerStarted","Data":"fff9dbc2acc3685505b0e179942831a5ac2a3bb7494fb5efbe1dd8a9aebd7347"} Apr 17 14:14:22.242894 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:14:22.242647 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-jobset-operator/jobset-controller-manager-656b757fbd-c69b9" Apr 17 14:14:22.258160 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:14:22.258115 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-jobset-operator/jobset-controller-manager-656b757fbd-c69b9" podStartSLOduration=1.968438603 podStartE2EDuration="30.258103063s" 
podCreationTimestamp="2026-04-17 14:13:52 +0000 UTC" firstStartedPulling="2026-04-17 14:13:52.987302906 +0000 UTC m=+376.623880274" lastFinishedPulling="2026-04-17 14:14:21.276967363 +0000 UTC m=+404.913544734" observedRunningTime="2026-04-17 14:14:22.256681768 +0000 UTC m=+405.893259157" watchObservedRunningTime="2026-04-17 14:14:22.258103063 +0000 UTC m=+405.894680454" Apr 17 14:14:33.251261 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:14:33.251224 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-jobset-operator/jobset-controller-manager-656b757fbd-c69b9" Apr 17 14:15:27.269258 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:27.269220 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-78c89df799-ff8q5"] Apr 17 14:15:27.272628 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:27.272607 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78c89df799-ff8q5" Apr 17 14:15:27.284306 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:27.284282 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78c89df799-ff8q5"] Apr 17 14:15:27.365235 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:27.365202 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec2e6bd0-96e5-40ea-8be3-123499945f1c-console-serving-cert\") pod \"console-78c89df799-ff8q5\" (UID: \"ec2e6bd0-96e5-40ea-8be3-123499945f1c\") " pod="openshift-console/console-78c89df799-ff8q5" Apr 17 14:15:27.365393 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:27.365242 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec2e6bd0-96e5-40ea-8be3-123499945f1c-console-oauth-config\") pod \"console-78c89df799-ff8q5\" (UID: 
\"ec2e6bd0-96e5-40ea-8be3-123499945f1c\") " pod="openshift-console/console-78c89df799-ff8q5" Apr 17 14:15:27.365393 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:27.365269 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec2e6bd0-96e5-40ea-8be3-123499945f1c-console-config\") pod \"console-78c89df799-ff8q5\" (UID: \"ec2e6bd0-96e5-40ea-8be3-123499945f1c\") " pod="openshift-console/console-78c89df799-ff8q5" Apr 17 14:15:27.365393 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:27.365337 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec2e6bd0-96e5-40ea-8be3-123499945f1c-service-ca\") pod \"console-78c89df799-ff8q5\" (UID: \"ec2e6bd0-96e5-40ea-8be3-123499945f1c\") " pod="openshift-console/console-78c89df799-ff8q5" Apr 17 14:15:27.365393 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:27.365362 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec2e6bd0-96e5-40ea-8be3-123499945f1c-oauth-serving-cert\") pod \"console-78c89df799-ff8q5\" (UID: \"ec2e6bd0-96e5-40ea-8be3-123499945f1c\") " pod="openshift-console/console-78c89df799-ff8q5" Apr 17 14:15:27.365558 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:27.365403 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzcnp\" (UniqueName: \"kubernetes.io/projected/ec2e6bd0-96e5-40ea-8be3-123499945f1c-kube-api-access-pzcnp\") pod \"console-78c89df799-ff8q5\" (UID: \"ec2e6bd0-96e5-40ea-8be3-123499945f1c\") " pod="openshift-console/console-78c89df799-ff8q5" Apr 17 14:15:27.365558 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:27.365433 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec2e6bd0-96e5-40ea-8be3-123499945f1c-trusted-ca-bundle\") pod \"console-78c89df799-ff8q5\" (UID: \"ec2e6bd0-96e5-40ea-8be3-123499945f1c\") " pod="openshift-console/console-78c89df799-ff8q5" Apr 17 14:15:27.466788 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:27.466760 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec2e6bd0-96e5-40ea-8be3-123499945f1c-console-serving-cert\") pod \"console-78c89df799-ff8q5\" (UID: \"ec2e6bd0-96e5-40ea-8be3-123499945f1c\") " pod="openshift-console/console-78c89df799-ff8q5" Apr 17 14:15:27.466943 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:27.466801 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec2e6bd0-96e5-40ea-8be3-123499945f1c-console-oauth-config\") pod \"console-78c89df799-ff8q5\" (UID: \"ec2e6bd0-96e5-40ea-8be3-123499945f1c\") " pod="openshift-console/console-78c89df799-ff8q5" Apr 17 14:15:27.466943 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:27.466925 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec2e6bd0-96e5-40ea-8be3-123499945f1c-console-config\") pod \"console-78c89df799-ff8q5\" (UID: \"ec2e6bd0-96e5-40ea-8be3-123499945f1c\") " pod="openshift-console/console-78c89df799-ff8q5" Apr 17 14:15:27.467037 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:27.466970 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec2e6bd0-96e5-40ea-8be3-123499945f1c-service-ca\") pod \"console-78c89df799-ff8q5\" (UID: \"ec2e6bd0-96e5-40ea-8be3-123499945f1c\") " pod="openshift-console/console-78c89df799-ff8q5" Apr 17 14:15:27.467037 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:27.466985 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec2e6bd0-96e5-40ea-8be3-123499945f1c-oauth-serving-cert\") pod \"console-78c89df799-ff8q5\" (UID: \"ec2e6bd0-96e5-40ea-8be3-123499945f1c\") " pod="openshift-console/console-78c89df799-ff8q5" Apr 17 14:15:27.467037 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:27.467023 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pzcnp\" (UniqueName: \"kubernetes.io/projected/ec2e6bd0-96e5-40ea-8be3-123499945f1c-kube-api-access-pzcnp\") pod \"console-78c89df799-ff8q5\" (UID: \"ec2e6bd0-96e5-40ea-8be3-123499945f1c\") " pod="openshift-console/console-78c89df799-ff8q5" Apr 17 14:15:27.467176 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:27.467048 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec2e6bd0-96e5-40ea-8be3-123499945f1c-trusted-ca-bundle\") pod \"console-78c89df799-ff8q5\" (UID: \"ec2e6bd0-96e5-40ea-8be3-123499945f1c\") " pod="openshift-console/console-78c89df799-ff8q5" Apr 17 14:15:27.467645 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:27.467621 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec2e6bd0-96e5-40ea-8be3-123499945f1c-console-config\") pod \"console-78c89df799-ff8q5\" (UID: \"ec2e6bd0-96e5-40ea-8be3-123499945f1c\") " pod="openshift-console/console-78c89df799-ff8q5" Apr 17 14:15:27.467763 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:27.467664 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec2e6bd0-96e5-40ea-8be3-123499945f1c-service-ca\") pod \"console-78c89df799-ff8q5\" (UID: \"ec2e6bd0-96e5-40ea-8be3-123499945f1c\") " pod="openshift-console/console-78c89df799-ff8q5" Apr 17 14:15:27.467827 ip-10-0-138-158 
kubenswrapper[2568]: I0417 14:15:27.467805 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec2e6bd0-96e5-40ea-8be3-123499945f1c-oauth-serving-cert\") pod \"console-78c89df799-ff8q5\" (UID: \"ec2e6bd0-96e5-40ea-8be3-123499945f1c\") " pod="openshift-console/console-78c89df799-ff8q5" Apr 17 14:15:27.467891 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:27.467876 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec2e6bd0-96e5-40ea-8be3-123499945f1c-trusted-ca-bundle\") pod \"console-78c89df799-ff8q5\" (UID: \"ec2e6bd0-96e5-40ea-8be3-123499945f1c\") " pod="openshift-console/console-78c89df799-ff8q5" Apr 17 14:15:27.469840 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:27.469813 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec2e6bd0-96e5-40ea-8be3-123499945f1c-console-oauth-config\") pod \"console-78c89df799-ff8q5\" (UID: \"ec2e6bd0-96e5-40ea-8be3-123499945f1c\") " pod="openshift-console/console-78c89df799-ff8q5" Apr 17 14:15:27.469957 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:27.469937 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec2e6bd0-96e5-40ea-8be3-123499945f1c-console-serving-cert\") pod \"console-78c89df799-ff8q5\" (UID: \"ec2e6bd0-96e5-40ea-8be3-123499945f1c\") " pod="openshift-console/console-78c89df799-ff8q5" Apr 17 14:15:27.473980 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:27.473956 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzcnp\" (UniqueName: \"kubernetes.io/projected/ec2e6bd0-96e5-40ea-8be3-123499945f1c-kube-api-access-pzcnp\") pod \"console-78c89df799-ff8q5\" (UID: \"ec2e6bd0-96e5-40ea-8be3-123499945f1c\") " 
pod="openshift-console/console-78c89df799-ff8q5" Apr 17 14:15:27.581708 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:27.581633 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78c89df799-ff8q5" Apr 17 14:15:27.700049 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:27.700026 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78c89df799-ff8q5"] Apr 17 14:15:27.702539 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:15:27.702497 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec2e6bd0_96e5_40ea_8be3_123499945f1c.slice/crio-1452ffb2927004642359d566ef5066742e54fedcefcb3346a55ae58b41988180 WatchSource:0}: Error finding container 1452ffb2927004642359d566ef5066742e54fedcefcb3346a55ae58b41988180: Status 404 returned error can't find the container with id 1452ffb2927004642359d566ef5066742e54fedcefcb3346a55ae58b41988180 Apr 17 14:15:28.468328 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:28.468294 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78c89df799-ff8q5" event={"ID":"ec2e6bd0-96e5-40ea-8be3-123499945f1c","Type":"ContainerStarted","Data":"f44a42130b62f04fb86e39c83f0e8f33c82f1ede0bcc84cfb8e961b69cd15ecd"} Apr 17 14:15:28.468328 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:28.468329 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78c89df799-ff8q5" event={"ID":"ec2e6bd0-96e5-40ea-8be3-123499945f1c","Type":"ContainerStarted","Data":"1452ffb2927004642359d566ef5066742e54fedcefcb3346a55ae58b41988180"} Apr 17 14:15:28.485977 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:28.485928 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-78c89df799-ff8q5" podStartSLOduration=1.485912936 podStartE2EDuration="1.485912936s" podCreationTimestamp="2026-04-17 14:15:27 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:15:28.483346878 +0000 UTC m=+472.119924269" watchObservedRunningTime="2026-04-17 14:15:28.485912936 +0000 UTC m=+472.122490325" Apr 17 14:15:37.582472 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:37.582414 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-78c89df799-ff8q5" Apr 17 14:15:37.582472 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:37.582476 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-78c89df799-ff8q5" Apr 17 14:15:37.587362 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:37.587333 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-78c89df799-ff8q5" Apr 17 14:15:38.505774 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:38.505745 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-78c89df799-ff8q5" Apr 17 14:15:38.547359 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:15:38.547326 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-868dd77dc-jq584"] Apr 17 14:16:03.568535 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:03.568464 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-868dd77dc-jq584" podUID="b5da1956-b0d8-4bdf-a5a5-abb04571309c" containerName="console" containerID="cri-o://7b2f7de0ebab6232ebbd2a2d3492ee58ac8a135c03cebf5bde492fc55d190ca8" gracePeriod=15 Apr 17 14:16:03.814058 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:03.814028 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-868dd77dc-jq584_b5da1956-b0d8-4bdf-a5a5-abb04571309c/console/0.log" Apr 17 14:16:03.814190 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:03.814099 2568 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-868dd77dc-jq584" Apr 17 14:16:03.880827 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:03.880731 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjwz9\" (UniqueName: \"kubernetes.io/projected/b5da1956-b0d8-4bdf-a5a5-abb04571309c-kube-api-access-cjwz9\") pod \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\" (UID: \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\") " Apr 17 14:16:03.880827 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:03.880778 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5da1956-b0d8-4bdf-a5a5-abb04571309c-trusted-ca-bundle\") pod \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\" (UID: \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\") " Apr 17 14:16:03.880827 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:03.880831 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b5da1956-b0d8-4bdf-a5a5-abb04571309c-oauth-serving-cert\") pod \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\" (UID: \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\") " Apr 17 14:16:03.881091 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:03.880857 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5da1956-b0d8-4bdf-a5a5-abb04571309c-console-serving-cert\") pod \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\" (UID: \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\") " Apr 17 14:16:03.881091 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:03.880880 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5da1956-b0d8-4bdf-a5a5-abb04571309c-service-ca\") pod \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\" (UID: 
\"b5da1956-b0d8-4bdf-a5a5-abb04571309c\") " Apr 17 14:16:03.881091 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:03.881035 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b5da1956-b0d8-4bdf-a5a5-abb04571309c-console-config\") pod \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\" (UID: \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\") " Apr 17 14:16:03.881232 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:03.881090 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b5da1956-b0d8-4bdf-a5a5-abb04571309c-console-oauth-config\") pod \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\" (UID: \"b5da1956-b0d8-4bdf-a5a5-abb04571309c\") " Apr 17 14:16:03.881282 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:03.881251 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5da1956-b0d8-4bdf-a5a5-abb04571309c-service-ca" (OuterVolumeSpecName: "service-ca") pod "b5da1956-b0d8-4bdf-a5a5-abb04571309c" (UID: "b5da1956-b0d8-4bdf-a5a5-abb04571309c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:16:03.881331 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:03.881294 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5da1956-b0d8-4bdf-a5a5-abb04571309c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b5da1956-b0d8-4bdf-a5a5-abb04571309c" (UID: "b5da1956-b0d8-4bdf-a5a5-abb04571309c"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:16:03.881381 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:03.881321 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5da1956-b0d8-4bdf-a5a5-abb04571309c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b5da1956-b0d8-4bdf-a5a5-abb04571309c" (UID: "b5da1956-b0d8-4bdf-a5a5-abb04571309c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:16:03.881430 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:03.881375 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5da1956-b0d8-4bdf-a5a5-abb04571309c-console-config" (OuterVolumeSpecName: "console-config") pod "b5da1956-b0d8-4bdf-a5a5-abb04571309c" (UID: "b5da1956-b0d8-4bdf-a5a5-abb04571309c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:16:03.881473 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:03.881455 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b5da1956-b0d8-4bdf-a5a5-abb04571309c-oauth-serving-cert\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\"" Apr 17 14:16:03.881527 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:03.881475 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5da1956-b0d8-4bdf-a5a5-abb04571309c-service-ca\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\"" Apr 17 14:16:03.881527 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:03.881491 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b5da1956-b0d8-4bdf-a5a5-abb04571309c-console-config\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\"" Apr 17 14:16:03.881612 ip-10-0-138-158 kubenswrapper[2568]: I0417 
14:16:03.881532 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5da1956-b0d8-4bdf-a5a5-abb04571309c-trusted-ca-bundle\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\"" Apr 17 14:16:03.883153 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:03.883129 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5da1956-b0d8-4bdf-a5a5-abb04571309c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b5da1956-b0d8-4bdf-a5a5-abb04571309c" (UID: "b5da1956-b0d8-4bdf-a5a5-abb04571309c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:16:03.883278 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:03.883165 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5da1956-b0d8-4bdf-a5a5-abb04571309c-kube-api-access-cjwz9" (OuterVolumeSpecName: "kube-api-access-cjwz9") pod "b5da1956-b0d8-4bdf-a5a5-abb04571309c" (UID: "b5da1956-b0d8-4bdf-a5a5-abb04571309c"). InnerVolumeSpecName "kube-api-access-cjwz9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:16:03.883318 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:03.883273 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5da1956-b0d8-4bdf-a5a5-abb04571309c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b5da1956-b0d8-4bdf-a5a5-abb04571309c" (UID: "b5da1956-b0d8-4bdf-a5a5-abb04571309c"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:16:03.982736 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:03.982689 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cjwz9\" (UniqueName: \"kubernetes.io/projected/b5da1956-b0d8-4bdf-a5a5-abb04571309c-kube-api-access-cjwz9\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\"" Apr 17 14:16:03.982736 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:03.982732 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5da1956-b0d8-4bdf-a5a5-abb04571309c-console-serving-cert\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\"" Apr 17 14:16:03.982736 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:03.982743 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b5da1956-b0d8-4bdf-a5a5-abb04571309c-console-oauth-config\") on node \"ip-10-0-138-158.ec2.internal\" DevicePath \"\"" Apr 17 14:16:04.592756 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:04.592728 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-868dd77dc-jq584_b5da1956-b0d8-4bdf-a5a5-abb04571309c/console/0.log" Apr 17 14:16:04.593148 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:04.592767 2568 generic.go:358] "Generic (PLEG): container finished" podID="b5da1956-b0d8-4bdf-a5a5-abb04571309c" containerID="7b2f7de0ebab6232ebbd2a2d3492ee58ac8a135c03cebf5bde492fc55d190ca8" exitCode=2 Apr 17 14:16:04.593148 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:04.592801 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-868dd77dc-jq584" event={"ID":"b5da1956-b0d8-4bdf-a5a5-abb04571309c","Type":"ContainerDied","Data":"7b2f7de0ebab6232ebbd2a2d3492ee58ac8a135c03cebf5bde492fc55d190ca8"} Apr 17 14:16:04.593148 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:04.592834 2568 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-console/console-868dd77dc-jq584" Apr 17 14:16:04.593148 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:04.592846 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-868dd77dc-jq584" event={"ID":"b5da1956-b0d8-4bdf-a5a5-abb04571309c","Type":"ContainerDied","Data":"ae9fce0b8315f0c3847c18a7970b6dba9703455692e4b30a05fc80e8d258dc0c"} Apr 17 14:16:04.593148 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:04.592863 2568 scope.go:117] "RemoveContainer" containerID="7b2f7de0ebab6232ebbd2a2d3492ee58ac8a135c03cebf5bde492fc55d190ca8" Apr 17 14:16:04.601758 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:04.601742 2568 scope.go:117] "RemoveContainer" containerID="7b2f7de0ebab6232ebbd2a2d3492ee58ac8a135c03cebf5bde492fc55d190ca8" Apr 17 14:16:04.602018 ip-10-0-138-158 kubenswrapper[2568]: E0417 14:16:04.602001 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b2f7de0ebab6232ebbd2a2d3492ee58ac8a135c03cebf5bde492fc55d190ca8\": container with ID starting with 7b2f7de0ebab6232ebbd2a2d3492ee58ac8a135c03cebf5bde492fc55d190ca8 not found: ID does not exist" containerID="7b2f7de0ebab6232ebbd2a2d3492ee58ac8a135c03cebf5bde492fc55d190ca8" Apr 17 14:16:04.602096 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:04.602028 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b2f7de0ebab6232ebbd2a2d3492ee58ac8a135c03cebf5bde492fc55d190ca8"} err="failed to get container status \"7b2f7de0ebab6232ebbd2a2d3492ee58ac8a135c03cebf5bde492fc55d190ca8\": rpc error: code = NotFound desc = could not find container \"7b2f7de0ebab6232ebbd2a2d3492ee58ac8a135c03cebf5bde492fc55d190ca8\": container with ID starting with 7b2f7de0ebab6232ebbd2a2d3492ee58ac8a135c03cebf5bde492fc55d190ca8 not found: ID does not exist" Apr 17 14:16:04.613033 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:04.613010 2568 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-868dd77dc-jq584"] Apr 17 14:16:04.616351 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:04.616329 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-868dd77dc-jq584"] Apr 17 14:16:04.950039 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:16:04.950006 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5da1956-b0d8-4bdf-a5a5-abb04571309c" path="/var/lib/kubelet/pods/b5da1956-b0d8-4bdf-a5a5-abb04571309c/volumes" Apr 17 14:56:59.177997 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:56:59.177924 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-c2rcv_ff15f2af-05a6-4805-9ff7-0b8f823a3c0a/global-pull-secret-syncer/0.log" Apr 17 14:56:59.345672 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:56:59.345647 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-5mtb9_cb936c0a-ae1f-4ae8-825e-afab50630fa3/konnectivity-agent/0.log" Apr 17 14:56:59.365864 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:56:59.365830 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-158.ec2.internal_68c692e25ca4376cc2fb31a74dcd7849/haproxy/0.log" Apr 17 14:57:02.546654 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:02.546621 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1a08c066-6613-4a23-8e09-6aea0328fb2b/alertmanager/0.log" Apr 17 14:57:02.570003 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:02.569978 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1a08c066-6613-4a23-8e09-6aea0328fb2b/config-reloader/0.log" Apr 17 14:57:02.588634 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:02.588610 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1a08c066-6613-4a23-8e09-6aea0328fb2b/kube-rbac-proxy-web/0.log" Apr 17 14:57:02.609458 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:02.609444 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1a08c066-6613-4a23-8e09-6aea0328fb2b/kube-rbac-proxy/0.log" Apr 17 14:57:02.632908 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:02.632880 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1a08c066-6613-4a23-8e09-6aea0328fb2b/kube-rbac-proxy-metric/0.log" Apr 17 14:57:02.651812 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:02.651790 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1a08c066-6613-4a23-8e09-6aea0328fb2b/prom-label-proxy/0.log" Apr 17 14:57:02.670567 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:02.670546 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1a08c066-6613-4a23-8e09-6aea0328fb2b/init-config-reloader/0.log" Apr 17 14:57:02.739120 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:02.739095 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-pfd9b_51b1465e-8a5a-410d-8ece-f7d239a13616/kube-state-metrics/0.log" Apr 17 14:57:02.756963 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:02.756942 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-pfd9b_51b1465e-8a5a-410d-8ece-f7d239a13616/kube-rbac-proxy-main/0.log" Apr 17 14:57:02.775451 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:02.775430 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-pfd9b_51b1465e-8a5a-410d-8ece-f7d239a13616/kube-rbac-proxy-self/0.log" Apr 17 14:57:02.852055 ip-10-0-138-158 kubenswrapper[2568]: I0417 
14:57:02.852000 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hmsjj_af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00/node-exporter/0.log" Apr 17 14:57:02.869524 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:02.869487 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hmsjj_af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00/kube-rbac-proxy/0.log" Apr 17 14:57:02.889325 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:02.889307 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hmsjj_af0f5b71-a76d-42ef-8cbe-2e3f8a5aed00/init-textfile/0.log" Apr 17 14:57:03.350679 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:03.350652 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-nh4td_e7f4ed3e-b7f5-441d-8c8a-22a8c99ecedf/prometheus-operator-admission-webhook/0.log" Apr 17 14:57:03.377794 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:03.377766 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-58d99675d9-h77tc_ac42def7-0611-4dfc-9d27-f63a377b1901/telemeter-client/0.log" Apr 17 14:57:03.396003 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:03.395976 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-58d99675d9-h77tc_ac42def7-0611-4dfc-9d27-f63a377b1901/reload/0.log" Apr 17 14:57:03.414677 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:03.414648 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-58d99675d9-h77tc_ac42def7-0611-4dfc-9d27-f63a377b1901/kube-rbac-proxy/0.log" Apr 17 14:57:05.480334 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:05.480308 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78c89df799-ff8q5_ec2e6bd0-96e5-40ea-8be3-123499945f1c/console/0.log" Apr 17 
14:57:05.514855 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:05.514826 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-jb426_2303c9e7-fe82-4eab-9edd-fd7c86291690/download-server/0.log" Apr 17 14:57:06.619798 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:06.619766 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jscrk_2742c3d7-a7f7-4525-b1be-30a78e5cec2f/dns/0.log" Apr 17 14:57:06.637436 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:06.637412 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jscrk_2742c3d7-a7f7-4525-b1be-30a78e5cec2f/kube-rbac-proxy/0.log" Apr 17 14:57:06.737378 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:06.737344 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ztcfx_44d13d23-0ead-4ceb-b841-467a36463db2/dns-node-resolver/0.log" Apr 17 14:57:07.123042 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.123005 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-76bbf49c4d-nw9lh_f7c2a2bb-53b7-4cf4-bec4-e515ccc57619/registry/0.log" Apr 17 14:57:07.184151 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.184111 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mq44v_65df7e4a-6219-433f-b614-258be054188a/node-ca/0.log" Apr 17 14:57:07.244280 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.244253 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wh6bh/perf-node-gather-daemonset-rkv4q"] Apr 17 14:57:07.244654 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.244630 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5da1956-b0d8-4bdf-a5a5-abb04571309c" containerName="console" Apr 17 14:57:07.244792 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.244650 2568 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="b5da1956-b0d8-4bdf-a5a5-abb04571309c" containerName="console" Apr 17 14:57:07.244792 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.244774 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5da1956-b0d8-4bdf-a5a5-abb04571309c" containerName="console" Apr 17 14:57:07.247715 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.247699 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-rkv4q" Apr 17 14:57:07.249821 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.249806 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wh6bh\"/\"default-dockercfg-g56sz\"" Apr 17 14:57:07.249903 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.249838 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wh6bh\"/\"kube-root-ca.crt\"" Apr 17 14:57:07.250670 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.250655 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wh6bh\"/\"openshift-service-ca.crt\"" Apr 17 14:57:07.256074 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.256054 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wh6bh/perf-node-gather-daemonset-rkv4q"] Apr 17 14:57:07.280442 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.280419 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/eb3b58f4-68d7-4a29-9485-6bca62d707e4-proc\") pod \"perf-node-gather-daemonset-rkv4q\" (UID: \"eb3b58f4-68d7-4a29-9485-6bca62d707e4\") " pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-rkv4q" Apr 17 14:57:07.280553 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.280447 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eb3b58f4-68d7-4a29-9485-6bca62d707e4-lib-modules\") pod \"perf-node-gather-daemonset-rkv4q\" (UID: \"eb3b58f4-68d7-4a29-9485-6bca62d707e4\") " pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-rkv4q" Apr 17 14:57:07.280603 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.280550 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eb3b58f4-68d7-4a29-9485-6bca62d707e4-sys\") pod \"perf-node-gather-daemonset-rkv4q\" (UID: \"eb3b58f4-68d7-4a29-9485-6bca62d707e4\") " pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-rkv4q" Apr 17 14:57:07.280603 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.280580 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/eb3b58f4-68d7-4a29-9485-6bca62d707e4-podres\") pod \"perf-node-gather-daemonset-rkv4q\" (UID: \"eb3b58f4-68d7-4a29-9485-6bca62d707e4\") " pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-rkv4q" Apr 17 14:57:07.280669 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.280610 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx6jb\" (UniqueName: \"kubernetes.io/projected/eb3b58f4-68d7-4a29-9485-6bca62d707e4-kube-api-access-bx6jb\") pod \"perf-node-gather-daemonset-rkv4q\" (UID: \"eb3b58f4-68d7-4a29-9485-6bca62d707e4\") " pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-rkv4q" Apr 17 14:57:07.381727 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.381648 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/eb3b58f4-68d7-4a29-9485-6bca62d707e4-proc\") pod \"perf-node-gather-daemonset-rkv4q\" (UID: \"eb3b58f4-68d7-4a29-9485-6bca62d707e4\") " 
pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-rkv4q" Apr 17 14:57:07.381727 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.381690 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eb3b58f4-68d7-4a29-9485-6bca62d707e4-lib-modules\") pod \"perf-node-gather-daemonset-rkv4q\" (UID: \"eb3b58f4-68d7-4a29-9485-6bca62d707e4\") " pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-rkv4q" Apr 17 14:57:07.381891 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.381752 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/eb3b58f4-68d7-4a29-9485-6bca62d707e4-proc\") pod \"perf-node-gather-daemonset-rkv4q\" (UID: \"eb3b58f4-68d7-4a29-9485-6bca62d707e4\") " pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-rkv4q" Apr 17 14:57:07.381891 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.381754 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eb3b58f4-68d7-4a29-9485-6bca62d707e4-sys\") pod \"perf-node-gather-daemonset-rkv4q\" (UID: \"eb3b58f4-68d7-4a29-9485-6bca62d707e4\") " pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-rkv4q" Apr 17 14:57:07.381891 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.381806 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eb3b58f4-68d7-4a29-9485-6bca62d707e4-sys\") pod \"perf-node-gather-daemonset-rkv4q\" (UID: \"eb3b58f4-68d7-4a29-9485-6bca62d707e4\") " pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-rkv4q" Apr 17 14:57:07.381891 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.381841 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/eb3b58f4-68d7-4a29-9485-6bca62d707e4-podres\") pod 
\"perf-node-gather-daemonset-rkv4q\" (UID: \"eb3b58f4-68d7-4a29-9485-6bca62d707e4\") " pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-rkv4q" Apr 17 14:57:07.381891 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.381858 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eb3b58f4-68d7-4a29-9485-6bca62d707e4-lib-modules\") pod \"perf-node-gather-daemonset-rkv4q\" (UID: \"eb3b58f4-68d7-4a29-9485-6bca62d707e4\") " pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-rkv4q" Apr 17 14:57:07.382047 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.381901 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bx6jb\" (UniqueName: \"kubernetes.io/projected/eb3b58f4-68d7-4a29-9485-6bca62d707e4-kube-api-access-bx6jb\") pod \"perf-node-gather-daemonset-rkv4q\" (UID: \"eb3b58f4-68d7-4a29-9485-6bca62d707e4\") " pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-rkv4q" Apr 17 14:57:07.382047 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.381958 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/eb3b58f4-68d7-4a29-9485-6bca62d707e4-podres\") pod \"perf-node-gather-daemonset-rkv4q\" (UID: \"eb3b58f4-68d7-4a29-9485-6bca62d707e4\") " pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-rkv4q" Apr 17 14:57:07.389591 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.389560 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx6jb\" (UniqueName: \"kubernetes.io/projected/eb3b58f4-68d7-4a29-9485-6bca62d707e4-kube-api-access-bx6jb\") pod \"perf-node-gather-daemonset-rkv4q\" (UID: \"eb3b58f4-68d7-4a29-9485-6bca62d707e4\") " pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-rkv4q" Apr 17 14:57:07.558781 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.558748 2568 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-rkv4q" Apr 17 14:57:07.675743 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.675704 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wh6bh/perf-node-gather-daemonset-rkv4q"] Apr 17 14:57:07.678931 ip-10-0-138-158 kubenswrapper[2568]: W0417 14:57:07.678905 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podeb3b58f4_68d7_4a29_9485_6bca62d707e4.slice/crio-c50c5cbe92138e27f488db7aabdc2474870570c9eb5d056b329caec98a0c09e4 WatchSource:0}: Error finding container c50c5cbe92138e27f488db7aabdc2474870570c9eb5d056b329caec98a0c09e4: Status 404 returned error can't find the container with id c50c5cbe92138e27f488db7aabdc2474870570c9eb5d056b329caec98a0c09e4 Apr 17 14:57:07.680458 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.680441 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 14:57:07.717803 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.717777 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-rkv4q" event={"ID":"eb3b58f4-68d7-4a29-9485-6bca62d707e4","Type":"ContainerStarted","Data":"c50c5cbe92138e27f488db7aabdc2474870570c9eb5d056b329caec98a0c09e4"} Apr 17 14:57:07.876284 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:07.876258 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-684d7b5578-8dvb5_99c53420-8717-4539-b474-bdf6d9f5615a/router/0.log" Apr 17 14:57:08.225387 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:08.225358 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-qf25s_b7c15494-fd82-415f-967e-b8bf2220ef8a/serve-healthcheck-canary/0.log" Apr 17 14:57:08.559738 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:08.559654 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-rktb8_61e203fc-fb18-4d79-ace2-a4218ed4e9d9/insights-operator/0.log" Apr 17 14:57:08.560600 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:08.560577 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-rktb8_61e203fc-fb18-4d79-ace2-a4218ed4e9d9/insights-operator/1.log" Apr 17 14:57:08.635295 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:08.635268 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gtrf5_2e8ddfff-998b-421c-9ea9-12e88bb8506b/kube-rbac-proxy/0.log" Apr 17 14:57:08.651632 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:08.651611 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gtrf5_2e8ddfff-998b-421c-9ea9-12e88bb8506b/exporter/0.log" Apr 17 14:57:08.669476 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:08.669459 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gtrf5_2e8ddfff-998b-421c-9ea9-12e88bb8506b/extractor/0.log" Apr 17 14:57:08.722080 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:08.722059 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-rkv4q" event={"ID":"eb3b58f4-68d7-4a29-9485-6bca62d707e4","Type":"ContainerStarted","Data":"943829cfee8a4b47388251caaf74e824d9c62af1eab4cde5fdb379c24a901e19"} Apr 17 14:57:08.722392 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:08.722099 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-rkv4q" Apr 17 14:57:08.735553 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:08.735485 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-rkv4q" podStartSLOduration=1.735471282 
podStartE2EDuration="1.735471282s" podCreationTimestamp="2026-04-17 14:57:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:57:08.735294872 +0000 UTC m=+2972.371872299" watchObservedRunningTime="2026-04-17 14:57:08.735471282 +0000 UTC m=+2972.372048677" Apr 17 14:57:10.269084 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:10.269058 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-controller-manager-656b757fbd-c69b9_fe4d002e-fc5c-4e1e-aca8-f9a225dec8d4/manager/0.log" Apr 17 14:57:10.291314 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:10.291293 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-operator-747c5859c7-jj2xp_4cd9e279-9b6d-42f8-b0f6-7d9b814049d0/jobset-operator/0.log" Apr 17 14:57:13.412716 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:13.412679 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-67kpd_07693eac-becc-4d74-9e6b-24a018ef1f41/kube-storage-version-migrator-operator/1.log" Apr 17 14:57:13.414289 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:13.414265 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-67kpd_07693eac-becc-4d74-9e6b-24a018ef1f41/kube-storage-version-migrator-operator/0.log" Apr 17 14:57:14.359945 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:14.359914 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-778wr_79d360cf-60bc-4bbe-ab0a-2832dd974cde/kube-multus-additional-cni-plugins/0.log" Apr 17 14:57:14.377828 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:14.377801 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-778wr_79d360cf-60bc-4bbe-ab0a-2832dd974cde/egress-router-binary-copy/0.log" Apr 17 14:57:14.397612 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:14.397586 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-778wr_79d360cf-60bc-4bbe-ab0a-2832dd974cde/cni-plugins/0.log" Apr 17 14:57:14.414779 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:14.414755 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-778wr_79d360cf-60bc-4bbe-ab0a-2832dd974cde/bond-cni-plugin/0.log" Apr 17 14:57:14.434147 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:14.434124 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-778wr_79d360cf-60bc-4bbe-ab0a-2832dd974cde/routeoverride-cni/0.log" Apr 17 14:57:14.451171 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:14.451154 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-778wr_79d360cf-60bc-4bbe-ab0a-2832dd974cde/whereabouts-cni-bincopy/0.log" Apr 17 14:57:14.468550 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:14.468523 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-778wr_79d360cf-60bc-4bbe-ab0a-2832dd974cde/whereabouts-cni/0.log" Apr 17 14:57:14.648491 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:14.648414 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmjrq_786788f0-7365-4b9c-9628-78838c53bc50/kube-multus/0.log" Apr 17 14:57:14.735124 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:14.735097 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-rkv4q" Apr 17 14:57:14.758721 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:14.758695 2568 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-f6d89_497bbc82-edab-4d97-bcc8-7d428e62da1e/network-metrics-daemon/0.log" Apr 17 14:57:14.775608 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:14.775589 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-f6d89_497bbc82-edab-4d97-bcc8-7d428e62da1e/kube-rbac-proxy/0.log" Apr 17 14:57:15.830858 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:15.830829 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brxr6_dccd1bed-f8d5-4c16-968b-e828fa6150a1/ovn-controller/0.log" Apr 17 14:57:15.870923 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:15.870892 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brxr6_dccd1bed-f8d5-4c16-968b-e828fa6150a1/ovn-acl-logging/0.log" Apr 17 14:57:15.890365 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:15.890337 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brxr6_dccd1bed-f8d5-4c16-968b-e828fa6150a1/kube-rbac-proxy-node/0.log" Apr 17 14:57:15.909317 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:15.909294 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brxr6_dccd1bed-f8d5-4c16-968b-e828fa6150a1/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 14:57:15.926519 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:15.926489 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brxr6_dccd1bed-f8d5-4c16-968b-e828fa6150a1/northd/0.log" Apr 17 14:57:15.945766 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:15.945747 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brxr6_dccd1bed-f8d5-4c16-968b-e828fa6150a1/nbdb/0.log" Apr 17 14:57:15.966596 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:15.966533 2568 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brxr6_dccd1bed-f8d5-4c16-968b-e828fa6150a1/sbdb/0.log" Apr 17 14:57:16.121687 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:16.121657 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-brxr6_dccd1bed-f8d5-4c16-968b-e828fa6150a1/ovnkube-controller/0.log" Apr 17 14:57:17.597775 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:17.597666 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-rcf78_f907888f-168a-435b-9326-465634e93710/check-endpoints/0.log" Apr 17 14:57:17.639640 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:17.639608 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-mcl6c_080d6200-63b7-4e65-8d68-ea319212caed/network-check-target-container/0.log" Apr 17 14:57:18.588158 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:18.588134 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-nczp7_a2839c1e-60df-4132-aed5-549b23baa1fb/iptables-alerter/0.log" Apr 17 14:57:19.153319 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:19.153257 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-fmns5_54a01f80-7898-48d4-9c07-39869c129452/tuned/0.log" Apr 17 14:57:22.127162 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:22.127121 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-865cb79987-2zqlc_40d81942-12f4-4f8a-8842-de8e2a878a0a/service-ca-controller/0.log" Apr 17 14:57:22.475394 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:22.475366 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-gjdx8_fa5741d3-e0ce-42fe-9791-6fce2bd6da17/csi-driver/0.log" Apr 17 
14:57:22.493872 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:22.493844 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-gjdx8_fa5741d3-e0ce-42fe-9791-6fce2bd6da17/csi-node-driver-registrar/0.log" Apr 17 14:57:22.510229 ip-10-0-138-158 kubenswrapper[2568]: I0417 14:57:22.510205 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-gjdx8_fa5741d3-e0ce-42fe-9791-6fce2bd6da17/csi-liveness-probe/0.log"