Apr 16 13:54:32.650319 ip-10-0-131-61 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 13:54:32.650333 ip-10-0-131-61 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 13:54:32.650342 ip-10-0-131-61 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 13:54:32.650806 ip-10-0-131-61 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 13:54:42.783529 ip-10-0-131-61 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 13:54:42.783550 ip-10-0-131-61 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot b5c41f751de64dbc95db5ba1b3c81d3f --
Apr 16 13:57:16.154250 ip-10-0-131-61 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 13:57:16.600866 ip-10-0-131-61 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:57:16.600866 ip-10-0-131-61 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 13:57:16.600866 ip-10-0-131-61 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:57:16.600866 ip-10-0-131-61 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 13:57:16.600866 ip-10-0-131-61 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:57:16.604093 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.603992    2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 13:57:16.607921 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.607903    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:57:16.607921 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.607921    2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:57:16.607987 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.607925    2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:57:16.607987 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.607929    2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:57:16.607987 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.607934    2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:57:16.607987 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.607937    2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:57:16.607987 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.607940    2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:57:16.607987 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.607943    2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:57:16.607987 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.607946    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:57:16.607987 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.607949    2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:57:16.607987 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.607952    2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:57:16.607987 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.607955    2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:57:16.607987 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.607958    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:57:16.607987 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.607960    2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:57:16.607987 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.607964    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:57:16.607987 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.607967    2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:57:16.607987 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.607970    2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:57:16.607987 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.607973    2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:57:16.607987 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.607975    2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:57:16.607987 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.607978    2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:57:16.607987 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.607981    2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:57:16.607987 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.607983    2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:57:16.608452 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.607986    2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:57:16.608452 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.607989    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:57:16.608452 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.607992    2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:57:16.608452 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.607995    2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:57:16.608452 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.607998    2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:57:16.608452 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608001    2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:57:16.608452 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608004    2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:57:16.608452 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608007    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:57:16.608452 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608009    2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:57:16.608452 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608018    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:57:16.608452 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608021    2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:57:16.608452 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608023    2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:57:16.608452 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608026    2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:57:16.608452 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608029    2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:57:16.608452 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608033    2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:57:16.608452 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608037    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:57:16.608452 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608040    2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:57:16.608452 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608042    2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:57:16.608452 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608045    2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:57:16.609022 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608049    2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:57:16.609022 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608053    2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:57:16.609022 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608056    2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:57:16.609022 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608059    2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:57:16.609022 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608062    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:57:16.609022 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608065    2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:57:16.609022 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608068    2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:57:16.609022 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608071    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:57:16.609022 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608073    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:57:16.609022 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608075    2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:57:16.609022 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608078    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:57:16.609022 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608081    2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:57:16.609022 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608083    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:57:16.609022 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608086    2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:57:16.609022 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608090    2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:57:16.609022 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608093    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:57:16.609022 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608096    2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:57:16.609022 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608099    2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:57:16.609022 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608102    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:57:16.609480 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608105    2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:57:16.609480 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608107    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:57:16.609480 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608110    2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:57:16.609480 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608112    2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:57:16.609480 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608115    2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:57:16.609480 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608118    2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:57:16.609480 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608120    2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:57:16.609480 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608123    2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:57:16.609480 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608125    2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:57:16.609480 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608129    2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:57:16.609480 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608132    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:57:16.609480 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608135    2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:57:16.609480 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608137    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:57:16.609480 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608140    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:57:16.609480 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608142    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:57:16.609480 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608145    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:57:16.609480 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608147    2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:57:16.609480 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608150    2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:57:16.609480 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608153    2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:57:16.609480 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608155    2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:57:16.609966 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608160    2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:57:16.609966 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608165    2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:57:16.609966 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608167    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:57:16.609966 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608170    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:57:16.609966 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608173    2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:57:16.609966 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608175    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:57:16.609966 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608591    2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:57:16.609966 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608597    2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:57:16.609966 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608599    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:57:16.609966 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608602    2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:57:16.609966 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608605    2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:57:16.609966 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608608    2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:57:16.609966 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608611    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:57:16.609966 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608614    2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:57:16.609966 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608616    2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:57:16.609966 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608619    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:57:16.609966 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608621    2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:57:16.609966 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608624    2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:57:16.609966 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608627    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:57:16.609966 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608629    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:57:16.610446 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608659    2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:57:16.610446 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608663    2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:57:16.610446 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608667    2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:57:16.610446 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608670    2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:57:16.610446 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608673    2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:57:16.610446 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608676    2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:57:16.610446 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608679    2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:57:16.610446 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608682    2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:57:16.610446 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608686    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:57:16.610446 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608688    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:57:16.610446 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608691    2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:57:16.610446 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608694    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:57:16.610446 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608697    2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:57:16.610446 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608699    2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:57:16.610446 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608702    2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:57:16.610446 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608705    2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:57:16.610446 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608707    2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:57:16.610446 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608710    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:57:16.610446 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608712    2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:57:16.610446 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608716    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:57:16.610988 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608719    2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:57:16.610988 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608722    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:57:16.610988 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608724    2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:57:16.610988 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608727    2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:57:16.610988 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608730    2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:57:16.610988 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608732    2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:57:16.610988 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608735    2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:57:16.610988 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608737    2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:57:16.610988 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608740    2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:57:16.610988 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608743    2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:57:16.610988 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608746    2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:57:16.610988 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608748    2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:57:16.610988 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608751    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:57:16.610988 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608753    2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:57:16.610988 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608756    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:57:16.610988 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608759    2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:57:16.610988 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608762    2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:57:16.610988 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608764    2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:57:16.610988 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608767    2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:57:16.611448 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608769    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:57:16.611448 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608772    2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:57:16.611448 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608776    2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:57:16.611448 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608779    2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:57:16.611448 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608782    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:57:16.611448 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608785    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:57:16.611448 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608788    2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:57:16.611448 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608790    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:57:16.611448 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608793    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:57:16.611448 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608796    2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:57:16.611448 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608798    2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:57:16.611448 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608801    2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:57:16.611448 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608804    2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:57:16.611448 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608810    2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:57:16.611448 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608813    2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:57:16.611448 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608816    2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:57:16.611448 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608819    2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:57:16.611448 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608822    2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:57:16.611448 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608825    2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:57:16.611916 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608828    2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:57:16.611916 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608831    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:57:16.611916 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608833    2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:57:16.611916 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608836    2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:57:16.611916 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608838    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:57:16.611916 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608841    2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:57:16.611916 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608843    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:57:16.611916 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608846    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:57:16.611916 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608848    2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:57:16.611916 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608851    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:57:16.611916 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608854    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:57:16.611916 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608857    2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:57:16.611916 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608859    2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:57:16.611916 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.608862    2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:57:16.611916 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610062    2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 13:57:16.611916 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610074    2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 13:57:16.611916 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610081    2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 13:57:16.611916 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610086    2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 13:57:16.611916 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610091    2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 13:57:16.611916 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610094    2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 13:57:16.611916 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610099    2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 13:57:16.612432 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610104    2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 13:57:16.612432 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610107    2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 13:57:16.612432 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610110    2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 13:57:16.612432 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610114    2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 13:57:16.612432 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610119    2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 13:57:16.612432 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610122    2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 13:57:16.612432 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610125    2575 flags.go:64] FLAG: --cgroup-root=""
Apr 16 13:57:16.612432 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610128    2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 13:57:16.612432 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610132    2575 flags.go:64] FLAG: --client-ca-file=""
Apr 16 13:57:16.612432 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610135    2575 flags.go:64] FLAG: --cloud-config=""
Apr 16 13:57:16.612432 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610138    2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 13:57:16.612432 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610141    2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 13:57:16.612432 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610146    2575 flags.go:64] FLAG: --cluster-domain=""
Apr 16 13:57:16.612432 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610149    2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 13:57:16.612432 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610152    2575 flags.go:64] FLAG: --config-dir=""
Apr 16 13:57:16.612432 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610155    2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 13:57:16.612432 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610159    2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 13:57:16.612432 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610163    2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 13:57:16.612432 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610166    2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 13:57:16.612432 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610169    2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 13:57:16.612432 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610173    2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 13:57:16.612432 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610176    2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 13:57:16.612432 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610180    2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 13:57:16.612432 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610183    2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 13:57:16.613034 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610186    2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 13:57:16.613034 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610190    2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 13:57:16.613034 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610195    2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 13:57:16.613034 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610198    2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 13:57:16.613034 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610201    2575 flags.go:64] FLAG:
--enable-debugging-handlers="true" Apr 16 13:57:16.613034 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610204 2575 flags.go:64] FLAG: --enable-load-reader="false" Apr 16 13:57:16.613034 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610207 2575 flags.go:64] FLAG: --enable-server="true" Apr 16 13:57:16.613034 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610211 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 16 13:57:16.613034 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610215 2575 flags.go:64] FLAG: --event-burst="100" Apr 16 13:57:16.613034 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610218 2575 flags.go:64] FLAG: --event-qps="50" Apr 16 13:57:16.613034 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610221 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 16 13:57:16.613034 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610225 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 16 13:57:16.613034 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610228 2575 flags.go:64] FLAG: --eviction-hard="" Apr 16 13:57:16.613034 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610232 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 16 13:57:16.613034 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610236 2575 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 16 13:57:16.613034 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610239 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 16 13:57:16.613034 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610243 2575 flags.go:64] FLAG: --eviction-soft="" Apr 16 13:57:16.613034 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610246 2575 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 13:57:16.613034 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610249 2575 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 13:57:16.613034 ip-10-0-131-61 kubenswrapper[2575]: I0416 
13:57:16.610252 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 13:57:16.613034 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610255 2575 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 13:57:16.613034 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610258 2575 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 13:57:16.613034 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610261 2575 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 13:57:16.613034 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610264 2575 flags.go:64] FLAG: --feature-gates="" Apr 16 13:57:16.613034 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610268 2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 13:57:16.613630 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610271 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 13:57:16.613630 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610274 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 13:57:16.613630 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610278 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 13:57:16.613630 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610281 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 16 13:57:16.613630 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610284 2575 flags.go:64] FLAG: --help="false" Apr 16 13:57:16.613630 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610287 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-131-61.ec2.internal" Apr 16 13:57:16.613630 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610291 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 13:57:16.613630 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610294 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 13:57:16.613630 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610297 2575 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 13:57:16.613630 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610300 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 13:57:16.613630 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610303 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 13:57:16.613630 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610307 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 13:57:16.613630 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610310 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 13:57:16.613630 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610313 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 13:57:16.613630 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610316 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 13:57:16.613630 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610319 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 13:57:16.613630 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610322 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 13:57:16.613630 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610325 2575 flags.go:64] FLAG: --kube-reserved="" Apr 16 13:57:16.613630 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610328 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 13:57:16.613630 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610331 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 13:57:16.613630 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610335 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 13:57:16.613630 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610338 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 13:57:16.613630 ip-10-0-131-61 
kubenswrapper[2575]: I0416 13:57:16.610341 2575 flags.go:64] FLAG: --lock-file="" Apr 16 13:57:16.613630 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610344 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 13:57:16.614217 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610347 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 13:57:16.614217 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610350 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 13:57:16.614217 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610356 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 13:57:16.614217 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610359 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 13:57:16.614217 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610362 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 13:57:16.614217 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610364 2575 flags.go:64] FLAG: --logging-format="text" Apr 16 13:57:16.614217 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610367 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 13:57:16.614217 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610371 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 13:57:16.614217 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610374 2575 flags.go:64] FLAG: --manifest-url="" Apr 16 13:57:16.614217 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610377 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 16 13:57:16.614217 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610381 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 13:57:16.614217 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610384 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 13:57:16.614217 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610389 2575 flags.go:64] FLAG: --max-pods="110" Apr 16 13:57:16.614217 
ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610392 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 13:57:16.614217 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610395 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 13:57:16.614217 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610399 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 13:57:16.614217 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610402 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 13:57:16.614217 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610405 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 13:57:16.614217 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610408 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 13:57:16.614217 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610414 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 13:57:16.614217 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610423 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 13:57:16.614217 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610426 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 13:57:16.614217 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610429 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 13:57:16.614217 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610432 2575 flags.go:64] FLAG: --pod-cidr="" Apr 16 13:57:16.614791 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610436 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 13:57:16.614791 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610442 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 13:57:16.614791 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610445 2575 flags.go:64] 
FLAG: --pod-max-pids="-1" Apr 16 13:57:16.614791 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610448 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 16 13:57:16.614791 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610452 2575 flags.go:64] FLAG: --port="10250" Apr 16 13:57:16.614791 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610455 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 13:57:16.614791 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610458 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-03d3a500efed9d33a" Apr 16 13:57:16.614791 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610462 2575 flags.go:64] FLAG: --qos-reserved="" Apr 16 13:57:16.614791 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610465 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 16 13:57:16.614791 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610468 2575 flags.go:64] FLAG: --register-node="true" Apr 16 13:57:16.614791 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610471 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 16 13:57:16.614791 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610473 2575 flags.go:64] FLAG: --register-with-taints="" Apr 16 13:57:16.614791 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610477 2575 flags.go:64] FLAG: --registry-burst="10" Apr 16 13:57:16.614791 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610480 2575 flags.go:64] FLAG: --registry-qps="5" Apr 16 13:57:16.614791 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610483 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 16 13:57:16.614791 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610486 2575 flags.go:64] FLAG: --reserved-memory="" Apr 16 13:57:16.614791 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610490 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 13:57:16.614791 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610493 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 
13:57:16.614791 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610496 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 13:57:16.614791 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610499 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 13:57:16.614791 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610502 2575 flags.go:64] FLAG: --runonce="false" Apr 16 13:57:16.614791 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610505 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 13:57:16.614791 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610508 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 13:57:16.614791 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610518 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 16 13:57:16.614791 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610521 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 13:57:16.615398 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610524 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 13:57:16.615398 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610527 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 13:57:16.615398 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610531 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 13:57:16.615398 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610534 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 13:57:16.615398 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610537 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 13:57:16.615398 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610540 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 13:57:16.615398 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610543 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 13:57:16.615398 ip-10-0-131-61 kubenswrapper[2575]: I0416 
13:57:16.610545 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 13:57:16.615398 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610548 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 13:57:16.615398 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610551 2575 flags.go:64] FLAG: --system-cgroups="" Apr 16 13:57:16.615398 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610555 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 13:57:16.615398 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610563 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 13:57:16.615398 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610566 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 16 13:57:16.615398 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610568 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 13:57:16.615398 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610575 2575 flags.go:64] FLAG: --tls-min-version="" Apr 16 13:57:16.615398 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610578 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 13:57:16.615398 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610581 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 13:57:16.615398 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610585 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 13:57:16.615398 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610587 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 13:57:16.615398 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610591 2575 flags.go:64] FLAG: --v="2" Apr 16 13:57:16.615398 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610595 2575 flags.go:64] FLAG: --version="false" Apr 16 13:57:16.615398 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610599 2575 flags.go:64] FLAG: --vmodule="" Apr 16 13:57:16.615398 ip-10-0-131-61 kubenswrapper[2575]: I0416 
13:57:16.610604 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 13:57:16.615398 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.610607 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 13:57:16.615398 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610721 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:57:16.616057 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610725 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:57:16.616057 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610730 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 13:57:16.616057 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610733 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:57:16.616057 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610737 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:57:16.616057 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610739 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:57:16.616057 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610743 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 13:57:16.616057 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610752 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:57:16.616057 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610755 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:57:16.616057 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610757 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:57:16.616057 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610762 2575 feature_gate.go:328] unrecognized 
feature gate: NewOLMPreflightPermissionChecks Apr 16 13:57:16.616057 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610765 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:57:16.616057 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610767 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:57:16.616057 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610770 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:57:16.616057 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610772 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:57:16.616057 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610775 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:57:16.616057 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610778 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:57:16.616057 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610780 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 13:57:16.616057 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610783 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:57:16.616057 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610787 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 13:57:16.616524 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610790 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:57:16.616524 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610792 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:57:16.616524 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610795 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:57:16.616524 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610798 2575 
feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:57:16.616524 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610801 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:57:16.616524 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610804 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:57:16.616524 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610806 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:57:16.616524 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610809 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:57:16.616524 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610811 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:57:16.616524 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610814 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:57:16.616524 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610816 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 13:57:16.616524 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610819 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:57:16.616524 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610822 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:57:16.616524 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610824 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:57:16.616524 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610828 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:57:16.616524 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610830 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:57:16.616524 ip-10-0-131-61 
kubenswrapper[2575]: W0416 13:57:16.610833 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:57:16.616524 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610837 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 13:57:16.616524 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610840 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:57:16.617038 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610843 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:57:16.617038 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610851 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:57:16.617038 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610854 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:57:16.617038 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610858 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:57:16.617038 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610861 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 13:57:16.617038 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610864 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:57:16.617038 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610866 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:57:16.617038 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610868 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:57:16.617038 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610871 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:57:16.617038 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610874 2575 
feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:57:16.617038 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610876 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:57:16.617038 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610879 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:57:16.617038 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610884 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:57:16.617038 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610886 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:57:16.617038 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610889 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:57:16.617038 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610907 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:57:16.617038 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610910 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:57:16.617038 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610912 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:57:16.617038 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610915 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 13:57:16.617527 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610917 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:57:16.617527 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610920 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:57:16.617527 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610923 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:57:16.617527 
ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610925 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 13:57:16.617527 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610928 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:57:16.617527 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610931 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:57:16.617527 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610934 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:57:16.617527 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610936 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:57:16.617527 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610939 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 13:57:16.617527 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610941 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:57:16.617527 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610944 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:57:16.617527 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610946 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:57:16.617527 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610949 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:57:16.617527 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610952 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:57:16.617527 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610954 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 13:57:16.617527 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610962 2575 feature_gate.go:328] unrecognized feature gate: 
ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:57:16.617527 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610966 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:57:16.617527 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610969 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:57:16.617527 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610972 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 13:57:16.617527 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610974 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 13:57:16.618148 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610977 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:57:16.618148 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610979 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:57:16.618148 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610982 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:57:16.618148 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610985 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 13:57:16.618148 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610987 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:57:16.618148 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610991 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:57:16.618148 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610994 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:57:16.618148 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.610998 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:57:16.618148 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.611851 2575 
feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 13:57:16.620040 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.619912 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 13:57:16.620040 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.620038 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 13:57:16.620176 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620091 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:57:16.620176 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620097 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:57:16.620176 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620100 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:57:16.620176 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620103 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:57:16.620176 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620106 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:57:16.620176 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620109 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:57:16.620176 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620112 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:57:16.620176 
ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620115 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:57:16.620176 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620119 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 13:57:16.620176 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620124 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:57:16.620176 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620127 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:57:16.620176 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620130 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:57:16.620176 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620133 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:57:16.620176 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620136 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:57:16.620176 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620138 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:57:16.620176 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620142 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:57:16.620176 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620144 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:57:16.620176 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620147 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:57:16.620176 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620150 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:57:16.620639 ip-10-0-131-61 kubenswrapper[2575]: W0416 
13:57:16.620152 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:57:16.620639 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620155 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:57:16.620639 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620158 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:57:16.620639 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620160 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:57:16.620639 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620163 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:57:16.620639 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620165 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:57:16.620639 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620168 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:57:16.620639 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620171 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:57:16.620639 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620173 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:57:16.620639 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620176 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:57:16.620639 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620179 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 13:57:16.620639 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620182 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:57:16.620639 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620185 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 
13:57:16.620639 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620188 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:57:16.620639 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620191 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:57:16.620639 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620195 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 13:57:16.620639 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620199 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:57:16.620639 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620201 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:57:16.620639 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620204 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:57:16.620639 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620207 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 13:57:16.621181 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620210 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 13:57:16.621181 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620212 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:57:16.621181 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620215 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 13:57:16.621181 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620217 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 13:57:16.621181 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620220 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:57:16.621181 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620223 2575 feature_gate.go:328] unrecognized 
feature gate: AWSDedicatedHosts Apr 16 13:57:16.621181 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620225 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:57:16.621181 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620228 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:57:16.621181 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620230 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:57:16.621181 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620233 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:57:16.621181 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620236 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 13:57:16.621181 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620238 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:57:16.621181 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620240 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:57:16.621181 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620244 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:57:16.621181 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620247 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:57:16.621181 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620250 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 13:57:16.621181 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620253 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:57:16.621181 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620255 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:57:16.621181 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620258 2575 feature_gate.go:328] unrecognized 
feature gate: MultiArchInstallAzure Apr 16 13:57:16.621181 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620261 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:57:16.621665 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620263 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:57:16.621665 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620266 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:57:16.621665 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620269 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:57:16.621665 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620271 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 13:57:16.621665 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620274 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:57:16.621665 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620276 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:57:16.621665 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620278 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:57:16.621665 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620281 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:57:16.621665 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620284 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:57:16.621665 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620286 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:57:16.621665 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620289 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:57:16.621665 ip-10-0-131-61 
kubenswrapper[2575]: W0416 13:57:16.620292 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:57:16.621665 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620294 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:57:16.621665 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620297 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:57:16.621665 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620299 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:57:16.621665 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620302 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:57:16.621665 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620304 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:57:16.621665 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620308 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:57:16.621665 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620310 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 13:57:16.621665 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620313 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 13:57:16.622172 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620316 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:57:16.622172 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620318 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:57:16.622172 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620321 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:57:16.622172 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620324 2575 feature_gate.go:328] unrecognized feature gate: 
RouteAdvertisements Apr 16 13:57:16.622172 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620326 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:57:16.622172 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620330 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:57:16.622172 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620333 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:57:16.622172 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.620338 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 13:57:16.622172 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620444 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:57:16.622172 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620449 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:57:16.622172 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620453 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:57:16.622172 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620455 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 13:57:16.622172 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620458 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:57:16.622172 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620461 2575 feature_gate.go:328] unrecognized feature gate: 
GCPClusterHostedDNSInstall Apr 16 13:57:16.622172 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620464 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:57:16.622172 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620467 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:57:16.622562 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620469 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:57:16.622562 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620472 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:57:16.622562 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620474 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:57:16.622562 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620479 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 13:57:16.622562 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620483 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:57:16.622562 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620486 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:57:16.622562 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620489 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:57:16.622562 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620491 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:57:16.622562 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620494 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 13:57:16.622562 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620496 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:57:16.622562 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620499 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:57:16.622562 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620503 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 13:57:16.622562 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620506 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 13:57:16.622562 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620509 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:57:16.622562 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620512 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:57:16.622562 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620515 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:57:16.622562 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620517 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:57:16.622562 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620520 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:57:16.622562 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620522 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:57:16.623035 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620525 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:57:16.623035 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620528 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:57:16.623035 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620531 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:57:16.623035 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620534 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:57:16.623035 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620537 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 13:57:16.623035 
ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620539 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 13:57:16.623035 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620542 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:57:16.623035 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620545 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:57:16.623035 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620547 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:57:16.623035 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620550 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:57:16.623035 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620552 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 13:57:16.623035 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620555 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:57:16.623035 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620557 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:57:16.623035 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620560 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 13:57:16.623035 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620562 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:57:16.623035 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620565 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:57:16.623035 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620567 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:57:16.623035 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620570 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:57:16.623035 
ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620572 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:57:16.623035 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620575 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:57:16.623526 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620577 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:57:16.623526 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620580 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:57:16.623526 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620582 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:57:16.623526 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620585 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 13:57:16.623526 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620587 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:57:16.623526 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620590 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:57:16.623526 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620592 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 13:57:16.623526 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620595 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:57:16.623526 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620598 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:57:16.623526 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620601 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:57:16.623526 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620603 2575 feature_gate.go:328] unrecognized 
feature gate: CPMSMachineNamePrefix Apr 16 13:57:16.623526 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620606 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:57:16.623526 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620608 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:57:16.623526 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620611 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:57:16.623526 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620613 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:57:16.623526 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620616 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:57:16.623526 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620619 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:57:16.623526 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620621 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:57:16.623526 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620624 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 13:57:16.623526 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620626 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:57:16.624020 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620629 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:57:16.624020 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620632 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:57:16.624020 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620634 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:57:16.624020 ip-10-0-131-61 kubenswrapper[2575]: 
W0416 13:57:16.620637 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 13:57:16.624020 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620639 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:57:16.624020 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620642 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:57:16.624020 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620645 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:57:16.624020 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620647 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:57:16.624020 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620650 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:57:16.624020 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620653 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:57:16.624020 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620655 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:57:16.624020 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620658 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:57:16.624020 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620660 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:57:16.624020 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620663 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:57:16.624020 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620665 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:57:16.624020 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620667 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 
13:57:16.624020 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620670 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:57:16.624020 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620673 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:57:16.624020 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:16.620676 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:57:16.624509 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.620681 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 13:57:16.624509 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.621390 2575 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 13:57:16.624509 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.624467 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 13:57:16.625354 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.625339 2575 server.go:1019] "Starting client certificate rotation" Apr 16 13:57:16.625464 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.625446 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 13:57:16.625500 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.625493 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 13:57:16.657108 ip-10-0-131-61 kubenswrapper[2575]: I0416 
13:57:16.657086 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 13:57:16.661856 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.661833 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 13:57:16.679079 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.679060 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 16 13:57:16.684682 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.684666 2575 log.go:25] "Validated CRI v1 image API"
Apr 16 13:57:16.686285 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.686267 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 13:57:16.689061 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.689044 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 13:57:16.690851 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.690830 2575 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 a38a2c54-b1bf-4bac-a392-4dcaa68ae9ef:/dev/nvme0n1p4 ec7175b4-79db-44f4-8c4f-51155cf1f44a:/dev/nvme0n1p3]
Apr 16 13:57:16.690947 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.690849 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 13:57:16.697941 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.697812 2575 manager.go:217] Machine: {Timestamp:2026-04-16 13:57:16.695586043 +0000 UTC m=+0.423639779 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100525 MemoryCapacity:33164480512 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec29748171962788b10a3b4cfad1e1d9 SystemUUID:ec297481-7196-2788-b10a-3b4cfad1e1d9 BootID:b5c41f75-1de6-4dbc-95db-5ba1b3c81d3f Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582238208 Type:vfs Inodes:4048398 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:94:c7:33:77:83 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:94:c7:33:77:83 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:9a:25:eb:d8:a0:16 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164480512 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 13:57:16.697941 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.697929 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 13:57:16.698083 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.698062 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 13:57:16.699033 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.699005 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 13:57:16.699173 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.699036 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-61.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 13:57:16.699220 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.699183 2575 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 13:57:16.699220 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.699190 2575 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 13:57:16.699220 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.699203 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 13:57:16.699922 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.699911 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 13:57:16.701654 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.701642 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 13:57:16.701762 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.701753 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 13:57:16.704069 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.704059 2575 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 13:57:16.704485 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.704474 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 13:57:16.704527 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.704493 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 13:57:16.704527 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.704503 2575 kubelet.go:397] "Adding apiserver pod source"
Apr 16 13:57:16.704527 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.704513 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 13:57:16.706732 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.706717 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 13:57:16.706791 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.706737 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 13:57:16.709059 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.709037 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mhkkh"
Apr 16 13:57:16.710012 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.709995 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 13:57:16.711793 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.711777 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 13:57:16.713556 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.713541 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 13:57:16.713641 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.713561 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 13:57:16.713641 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.713571 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 13:57:16.713641 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.713579 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 13:57:16.713641 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.713586 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 13:57:16.713641 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.713596 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 13:57:16.713641 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.713605 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 13:57:16.713641 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.713613 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 13:57:16.713641 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.713632 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 13:57:16.713641 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.713642 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 13:57:16.713941 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.713661 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 13:57:16.713941 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.713675 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 13:57:16.714478 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:16.714459 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 13:57:16.714521 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:16.714463 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-61.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 13:57:16.714763 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.714751 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 13:57:16.714811 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.714768 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 13:57:16.716789 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.716767 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mhkkh"
Apr 16 13:57:16.718315 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.718299 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 13:57:16.718391 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.718344 2575 server.go:1295] "Started kubelet"
Apr 16 13:57:16.718455 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.718427 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 13:57:16.719050 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.718991 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 13:57:16.719119 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.719082 2575 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 13:57:16.719391 ip-10-0-131-61 systemd[1]: Started Kubernetes Kubelet.
Apr 16 13:57:16.720162 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.720146 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 13:57:16.720396 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.720383 2575 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 13:57:16.724964 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.724946 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 13:57:16.725127 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.725106 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 13:57:16.725931 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.725910 2575 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 13:57:16.725931 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.725931 2575 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 13:57:16.726072 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.725952 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 13:57:16.726072 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.726013 2575 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 13:57:16.726072 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.726024 2575 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 13:57:16.726322 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:16.726297 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-61.ec2.internal\" not found"
Apr 16 13:57:16.726674 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.726657 2575 factory.go:153] Registering CRI-O factory
Apr 16 13:57:16.726757 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.726718 2575 factory.go:223] Registration of the crio container factory successfully
Apr 16 13:57:16.726808 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.726786 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 13:57:16.726808 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.726796 2575 factory.go:55] Registering systemd factory
Apr 16 13:57:16.726808 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.726805 2575 factory.go:223] Registration of the systemd container factory successfully
Apr 16 13:57:16.726964 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.726825 2575 factory.go:103] Registering Raw factory
Apr 16 13:57:16.726964 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.726835 2575 manager.go:1196] Started watching for new ooms in manager
Apr 16 13:57:16.727328 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.727312 2575 manager.go:319] Starting recovery of all containers
Apr 16 13:57:16.729156 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.729125 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:57:16.730050 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:16.730022 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 13:57:16.730517 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:16.730496 2575 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-131-61.ec2.internal\" not found" node="ip-10-0-131-61.ec2.internal"
Apr 16 13:57:16.732102 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.732084 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-61.ec2.internal" not found
Apr 16 13:57:16.739242 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.739225 2575 manager.go:324] Recovery completed
Apr 16 13:57:16.743373 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.743357 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:57:16.747665 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.747643 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-61.ec2.internal" not found
Apr 16 13:57:16.749758 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.749742 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-61.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:57:16.749828 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.749770 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-61.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:57:16.749828 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.749780 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-61.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:57:16.750259 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.750247 2575 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 13:57:16.750259 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.750258 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 13:57:16.750346 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.750275 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 13:57:16.752511 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.752499 2575 policy_none.go:49] "None policy: Start"
Apr 16 13:57:16.752559 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.752516 2575 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 13:57:16.752559 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.752526 2575 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 13:57:16.793488 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.793472 2575 manager.go:341] "Starting Device Plugin manager"
Apr 16 13:57:16.793611 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:16.793546 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 13:57:16.793611 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.793558 2575 server.go:85] "Starting device plugin registration server"
Apr 16 13:57:16.793833 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.793822 2575 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 13:57:16.793871 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.793836 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 13:57:16.793964 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.793950 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 13:57:16.794043 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.794028 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 13:57:16.794043 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.794040 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 13:57:16.794879 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:16.794860 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 13:57:16.794975 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:16.794915 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-61.ec2.internal\" not found"
Apr 16 13:57:16.803944 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.803928 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-61.ec2.internal" not found
Apr 16 13:57:16.848443 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.848404 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 13:57:16.849623 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.849591 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 13:57:16.849623 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.849620 2575 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 13:57:16.849794 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.849642 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 13:57:16.849794 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.849648 2575 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 13:57:16.849794 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:16.849734 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 13:57:16.852511 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.852457 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:57:16.894693 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.894663 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:57:16.896597 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.896576 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-61.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:57:16.896703 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.896612 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-61.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:57:16.896703 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.896627 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-61.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:57:16.896703 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.896656 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-61.ec2.internal"
Apr 16 13:57:16.906533 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.906513 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-61.ec2.internal"
Apr 16 13:57:16.906601 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:16.906539 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-61.ec2.internal\": node \"ip-10-0-131-61.ec2.internal\" not found"
Apr 16 13:57:16.923447 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:16.923428 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-61.ec2.internal\" not found"
Apr 16 13:57:16.950414 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.950359 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-61.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-61.ec2.internal"]
Apr 16 13:57:16.950525 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.950486 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:57:16.951446 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.951431 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-61.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:57:16.951532 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.951460 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-61.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:57:16.951532 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.951471 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-61.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:57:16.952864 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.952852 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:57:16.953021 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.953006 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-61.ec2.internal"
Apr 16 13:57:16.953066 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.953037 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:57:16.955002 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.954987 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-61.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:57:16.955065 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.955017 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-61.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:57:16.955065 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.955027 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-61.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:57:16.955133 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.954989 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-61.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:57:16.955133 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.955088 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-61.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:57:16.955133 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.955098 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-61.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:57:16.956643 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.956627 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-61.ec2.internal"
Apr 16 13:57:16.956740 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.956653 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:57:16.957432 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.957413 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-61.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:57:16.957513 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.957445 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-61.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:57:16.957513 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:16.957471 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-61.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:57:16.978604 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:16.978578 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-61.ec2.internal\" not found" node="ip-10-0-131-61.ec2.internal"
Apr 16 13:57:16.983234 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:16.983217 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-61.ec2.internal\" not found" node="ip-10-0-131-61.ec2.internal"
Apr 16 13:57:17.024171 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:17.024147 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-61.ec2.internal\" not found"
Apr 16 13:57:17.125236 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:17.125139 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-61.ec2.internal\" not found"
Apr 16 13:57:17.127527 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:17.127505 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/162c9ac6c39f8264db22914a3448ec16-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-61.ec2.internal\" (UID: \"162c9ac6c39f8264db22914a3448ec16\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-61.ec2.internal"
Apr 16 13:57:17.127638 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:17.127541 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/162c9ac6c39f8264db22914a3448ec16-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-61.ec2.internal\" (UID: \"162c9ac6c39f8264db22914a3448ec16\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-61.ec2.internal"
Apr 16 13:57:17.127638 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:17.127567 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/aeb1bbb91159af60fd67f2df28938806-config\") pod \"kube-apiserver-proxy-ip-10-0-131-61.ec2.internal\" (UID: \"aeb1bbb91159af60fd67f2df28938806\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-61.ec2.internal"
Apr 16 13:57:17.225616 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:17.225584 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-61.ec2.internal\" not found"
Apr 16 13:57:17.227790 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:17.227776 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/162c9ac6c39f8264db22914a3448ec16-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-61.ec2.internal\" (UID: \"162c9ac6c39f8264db22914a3448ec16\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-61.ec2.internal"
Apr 16 13:57:17.227845 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:17.227798 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/162c9ac6c39f8264db22914a3448ec16-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-61.ec2.internal\" (UID: \"162c9ac6c39f8264db22914a3448ec16\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-61.ec2.internal"
Apr 16 13:57:17.227845 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:17.227823 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/aeb1bbb91159af60fd67f2df28938806-config\") pod \"kube-apiserver-proxy-ip-10-0-131-61.ec2.internal\" (UID: \"aeb1bbb91159af60fd67f2df28938806\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-61.ec2.internal"
Apr 16 13:57:17.227920 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:17.227868 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/aeb1bbb91159af60fd67f2df28938806-config\") pod \"kube-apiserver-proxy-ip-10-0-131-61.ec2.internal\" (UID: \"aeb1bbb91159af60fd67f2df28938806\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-61.ec2.internal"
Apr 16 13:57:17.227920 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:17.227889 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/162c9ac6c39f8264db22914a3448ec16-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-61.ec2.internal\" (UID: \"162c9ac6c39f8264db22914a3448ec16\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-61.ec2.internal"
Apr 16 13:57:17.227987 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:17.227918 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/162c9ac6c39f8264db22914a3448ec16-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-61.ec2.internal\" (UID: \"162c9ac6c39f8264db22914a3448ec16\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-61.ec2.internal"
Apr 16 13:57:17.281989 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:17.281943 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-61.ec2.internal"
Apr 16 13:57:17.286055 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:17.286038 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-61.ec2.internal"
Apr 16 13:57:17.325844 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:17.325802 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-61.ec2.internal\" not found"
Apr 16 13:57:17.426265 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:17.426182 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-61.ec2.internal\" not found"
Apr 16 13:57:17.526622 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:17.526588 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-61.ec2.internal\" not found"
Apr 16 13:57:17.624875 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:17.624845 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 13:57:17.625609 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:17.625021 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 13:57:17.625609 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:17.625068 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 13:57:17.627042 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:17.627020 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-61.ec2.internal\" not found"
Apr 16 13:57:17.718952 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:17.718824 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 13:52:16 +0000 UTC" deadline="2027-11-09 12:19:04.814557961 +0000 UTC"
Apr 16 13:57:17.718952 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:17.718867 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13726h21m47.095694167s"
Apr 16 13:57:17.726000 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:17.725978 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 13:57:17.727963 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:17.727944 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-61.ec2.internal\" not found"
Apr 16 13:57:17.744853 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:17.744824 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 13:57:17.774137 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:17.774113 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-tzk6m"
Apr 16 13:57:17.781606 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:17.781588 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-tzk6m"
Apr 16 13:57:17.801182 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:17.801156 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:57:17.826171 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:17.826138 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-61.ec2.internal"
Apr 16 13:57:17.836290 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:17.836268 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 13:57:17.839343 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:17.839330 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-61.ec2.internal"
Apr 16 13:57:17.849496 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:17.849476 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 13:57:18.000384 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:18.000343 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod162c9ac6c39f8264db22914a3448ec16.slice/crio-c587e565d676f212e211cba1adfd280d3cfea936590e2ca2df415b8c3a1bf654 WatchSource:0}: Error finding container c587e565d676f212e211cba1adfd280d3cfea936590e2ca2df415b8c3a1bf654: Status 404 returned error can't find the container with id c587e565d676f212e211cba1adfd280d3cfea936590e2ca2df415b8c3a1bf654
Apr 16 13:57:18.002191 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:18.002153 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaeb1bbb91159af60fd67f2df28938806.slice/crio-92b5c4e5d663673e4bb8abc476faf4556cd9f330d5b3cf0d80d8bb246dbfcdc7 WatchSource:0}: Error finding container 92b5c4e5d663673e4bb8abc476faf4556cd9f330d5b3cf0d80d8bb246dbfcdc7: Status 404 returned error can't find the container with id 92b5c4e5d663673e4bb8abc476faf4556cd9f330d5b3cf0d80d8bb246dbfcdc7
Apr 16 13:57:18.009369 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.009349 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 13:57:18.221097 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.221069 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:57:18.705854 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.705818 2575 apiserver.go:52] "Watching apiserver"
Apr 16 13:57:18.712969 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.712927 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 13:57:18.713444 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.713416 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-131-61.ec2.internal","openshift-cluster-node-tuning-operator/tuned-z8t6x","openshift-dns/node-resolver-h7pm6","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-61.ec2.internal","openshift-multus/multus-additional-cni-plugins-mj78x","openshift-network-diagnostics/network-check-target-5zjkj","openshift-network-operator/iptables-alerter-jzfpd","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fnk","openshift-image-registry/node-ca-9tz2x","openshift-multus/multus-4wgkv","openshift-multus/network-metrics-daemon-l5zhh","openshift-ovn-kubernetes/ovnkube-node-w7c7s","kube-system/konnectivity-agent-6r2ft"]
Apr 16 13:57:18.715311 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.715285 2575 util.go:30]
"No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9tz2x" Apr 16 13:57:18.716489 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.716467 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-z8t6x" Apr 16 13:57:18.718055 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.717757 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-7vmbj\"" Apr 16 13:57:18.718171 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.717878 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 13:57:18.718171 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.717878 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 13:57:18.718171 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.717941 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 13:57:18.718612 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.718595 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:57:18.718829 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.718809 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l5zhh" Apr 16 13:57:18.718888 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.718847 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-rjz29\"" Apr 16 13:57:18.718969 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:18.718878 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l5zhh" podUID="9ab97e35-4539-405d-bb91-f30c906963c2" Apr 16 13:57:18.719027 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.719017 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 13:57:18.720461 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.720440 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6r2ft" Apr 16 13:57:18.722330 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.721828 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mj78x" Apr 16 13:57:18.722438 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.722424 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-dbr77\"" Apr 16 13:57:18.722504 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.722467 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 13:57:18.722649 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.722619 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 13:57:18.723302 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.723283 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zjkj" Apr 16 13:57:18.723388 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:18.723350 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5zjkj" podUID="6e53c142-6f4e-4358-a390-6d3c43558ef6" Apr 16 13:57:18.723860 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.723842 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 13:57:18.723978 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.723870 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 13:57:18.724078 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.724060 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 13:57:18.724153 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.724062 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 13:57:18.725439 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.725154 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 13:57:18.725439 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.725239 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-jzfpd" Apr 16 13:57:18.725691 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.724068 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-fjrgw\"" Apr 16 13:57:18.727998 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.725916 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fnk" Apr 16 13:57:18.728235 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.728063 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 13:57:18.728396 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.728378 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 13:57:18.729256 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.729234 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 13:57:18.729390 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.729354 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-2jf46\"" Apr 16 13:57:18.729390 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.729374 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 13:57:18.729562 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.729245 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:57:18.729783 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.729761 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-5s4x2\"" Apr 16 13:57:18.729985 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.729955 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.730062 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.730031 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 13:57:18.731962 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.731941 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-fb74c\"" Apr 16 13:57:18.732097 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.732078 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 13:57:18.732499 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.732480 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.733553 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.733529 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c64hh\" (UniqueName: \"kubernetes.io/projected/eea23aa3-1d75-42db-b520-0bce4b998c9c-kube-api-access-c64hh\") pod \"aws-ebs-csi-driver-node-j2fnk\" (UID: \"eea23aa3-1d75-42db-b520-0bce4b998c9c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fnk" Apr 16 13:57:18.733663 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.733567 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-multus-cni-dir\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.733663 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.733591 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-os-release\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.733663 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.733617 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs\") pod \"network-metrics-daemon-l5zhh\" (UID: \"9ab97e35-4539-405d-bb91-f30c906963c2\") " pod="openshift-multus/network-metrics-daemon-l5zhh" Apr 16 13:57:18.733663 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.733644 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-host-run-netns\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.733884 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.733666 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/03875a5c-704e-42af-9ea6-ba5a9f181d94-serviceca\") pod \"node-ca-9tz2x\" (UID: \"03875a5c-704e-42af-9ea6-ba5a9f181d94\") " pod="openshift-image-registry/node-ca-9tz2x" Apr 16 13:57:18.733884 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.733714 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9d11af49-8358-49ac-ac63-a39ae3da3f3c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mj78x\" (UID: \"9d11af49-8358-49ac-ac63-a39ae3da3f3c\") " pod="openshift-multus/multus-additional-cni-plugins-mj78x" Apr 16 13:57:18.733884 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.733751 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9d4j\" (UniqueName: \"kubernetes.io/projected/9d11af49-8358-49ac-ac63-a39ae3da3f3c-kube-api-access-l9d4j\") pod \"multus-additional-cni-plugins-mj78x\" (UID: \"9d11af49-8358-49ac-ac63-a39ae3da3f3c\") " pod="openshift-multus/multus-additional-cni-plugins-mj78x" Apr 16 13:57:18.733884 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.733778 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br4fr\" (UniqueName: \"kubernetes.io/projected/6e53c142-6f4e-4358-a390-6d3c43558ef6-kube-api-access-br4fr\") pod \"network-check-target-5zjkj\" (UID: \"6e53c142-6f4e-4358-a390-6d3c43558ef6\") " pod="openshift-network-diagnostics/network-check-target-5zjkj" Apr 16 13:57:18.733884 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.733803 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7e94555a-67da-4de5-ace1-024c7384ada8-agent-certs\") pod \"konnectivity-agent-6r2ft\" (UID: \"7e94555a-67da-4de5-ace1-024c7384ada8\") " pod="kube-system/konnectivity-agent-6r2ft" Apr 16 13:57:18.733884 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.733828 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-host-var-lib-cni-bin\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.733884 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.733841 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-h7pm6" Apr 16 13:57:18.733884 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.733852 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-host-var-lib-kubelet\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.733884 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.733877 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw7xj\" (UniqueName: \"kubernetes.io/projected/03875a5c-704e-42af-9ea6-ba5a9f181d94-kube-api-access-vw7xj\") pod \"node-ca-9tz2x\" (UID: \"03875a5c-704e-42af-9ea6-ba5a9f181d94\") " pod="openshift-image-registry/node-ca-9tz2x" Apr 16 13:57:18.734376 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.733917 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-etc-sysctl-conf\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x" Apr 16 13:57:18.734376 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.733942 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-var-lib-kubelet\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x" Apr 16 13:57:18.734376 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.733964 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g8gf\" (UniqueName: 
\"kubernetes.io/projected/985ce2bb-b7a7-4b0c-893a-1840235f7653-kube-api-access-7g8gf\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x" Apr 16 13:57:18.734376 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.733989 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/eea23aa3-1d75-42db-b520-0bce4b998c9c-socket-dir\") pod \"aws-ebs-csi-driver-node-j2fnk\" (UID: \"eea23aa3-1d75-42db-b520-0bce4b998c9c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fnk" Apr 16 13:57:18.734376 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734012 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-etc-systemd\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x" Apr 16 13:57:18.734376 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734036 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/985ce2bb-b7a7-4b0c-893a-1840235f7653-tmp\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x" Apr 16 13:57:18.734376 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734059 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-etc-kubernetes\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.734376 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734105 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f95jb\" (UniqueName: \"kubernetes.io/projected/0675f452-3368-4384-83f9-0c6a166ba947-kube-api-access-f95jb\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.734376 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734126 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/03875a5c-704e-42af-9ea6-ba5a9f181d94-host\") pod \"node-ca-9tz2x\" (UID: \"03875a5c-704e-42af-9ea6-ba5a9f181d94\") " pod="openshift-image-registry/node-ca-9tz2x" Apr 16 13:57:18.734376 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734144 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-multus-socket-dir-parent\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.734376 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734159 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-host-var-lib-cni-multus\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.734376 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734174 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9d11af49-8358-49ac-ac63-a39ae3da3f3c-cnibin\") pod \"multus-additional-cni-plugins-mj78x\" (UID: \"9d11af49-8358-49ac-ac63-a39ae3da3f3c\") " pod="openshift-multus/multus-additional-cni-plugins-mj78x" Apr 16 13:57:18.734376 
ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734190 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9d11af49-8358-49ac-ac63-a39ae3da3f3c-cni-binary-copy\") pod \"multus-additional-cni-plugins-mj78x\" (UID: \"9d11af49-8358-49ac-ac63-a39ae3da3f3c\") " pod="openshift-multus/multus-additional-cni-plugins-mj78x" Apr 16 13:57:18.734376 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734210 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/eea23aa3-1d75-42db-b520-0bce4b998c9c-registration-dir\") pod \"aws-ebs-csi-driver-node-j2fnk\" (UID: \"eea23aa3-1d75-42db-b520-0bce4b998c9c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fnk" Apr 16 13:57:18.734376 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734233 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9d11af49-8358-49ac-ac63-a39ae3da3f3c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mj78x\" (UID: \"9d11af49-8358-49ac-ac63-a39ae3da3f3c\") " pod="openshift-multus/multus-additional-cni-plugins-mj78x" Apr 16 13:57:18.734376 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734256 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-lib-modules\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x" Apr 16 13:57:18.735246 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734279 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/5837407c-436b-4756-9677-8c40ff8e9059-iptables-alerter-script\") pod \"iptables-alerter-jzfpd\" (UID: \"5837407c-436b-4756-9677-8c40ff8e9059\") " pod="openshift-network-operator/iptables-alerter-jzfpd" Apr 16 13:57:18.735246 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734295 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0675f452-3368-4384-83f9-0c6a166ba947-multus-daemon-config\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.735246 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734316 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2cg4\" (UniqueName: \"kubernetes.io/projected/9ab97e35-4539-405d-bb91-f30c906963c2-kube-api-access-l2cg4\") pod \"network-metrics-daemon-l5zhh\" (UID: \"9ab97e35-4539-405d-bb91-f30c906963c2\") " pod="openshift-multus/network-metrics-daemon-l5zhh" Apr 16 13:57:18.735246 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734338 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-run\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x" Apr 16 13:57:18.735246 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734353 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7e94555a-67da-4de5-ace1-024c7384ada8-konnectivity-ca\") pod \"konnectivity-agent-6r2ft\" (UID: \"7e94555a-67da-4de5-ace1-024c7384ada8\") " pod="kube-system/konnectivity-agent-6r2ft" Apr 16 13:57:18.735246 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734372 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f655v\" (UniqueName: \"kubernetes.io/projected/5837407c-436b-4756-9677-8c40ff8e9059-kube-api-access-f655v\") pod \"iptables-alerter-jzfpd\" (UID: \"5837407c-436b-4756-9677-8c40ff8e9059\") " pod="openshift-network-operator/iptables-alerter-jzfpd" Apr 16 13:57:18.735246 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734395 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-system-cni-dir\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.735246 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734419 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-multus-conf-dir\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.735246 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734452 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-host-run-multus-certs\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.735246 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734487 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/eea23aa3-1d75-42db-b520-0bce4b998c9c-device-dir\") pod \"aws-ebs-csi-driver-node-j2fnk\" (UID: \"eea23aa3-1d75-42db-b520-0bce4b998c9c\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fnk"
Apr 16 13:57:18.735246 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734568 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5837407c-436b-4756-9677-8c40ff8e9059-host-slash\") pod \"iptables-alerter-jzfpd\" (UID: \"5837407c-436b-4756-9677-8c40ff8e9059\") " pod="openshift-network-operator/iptables-alerter-jzfpd"
Apr 16 13:57:18.735246 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734596 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0675f452-3368-4384-83f9-0c6a166ba947-cni-binary-copy\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv"
Apr 16 13:57:18.735246 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734624 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-etc-kubernetes\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x"
Apr 16 13:57:18.735246 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734647 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-etc-sysctl-d\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x"
Apr 16 13:57:18.735246 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734684 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9d11af49-8358-49ac-ac63-a39ae3da3f3c-os-release\") pod \"multus-additional-cni-plugins-mj78x\" (UID: \"9d11af49-8358-49ac-ac63-a39ae3da3f3c\") " pod="openshift-multus/multus-additional-cni-plugins-mj78x"
Apr 16 13:57:18.735246 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734711 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eea23aa3-1d75-42db-b520-0bce4b998c9c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-j2fnk\" (UID: \"eea23aa3-1d75-42db-b520-0bce4b998c9c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fnk"
Apr 16 13:57:18.735246 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734722 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 13:57:18.736196 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734735 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/eea23aa3-1d75-42db-b520-0bce4b998c9c-sys-fs\") pod \"aws-ebs-csi-driver-node-j2fnk\" (UID: \"eea23aa3-1d75-42db-b520-0bce4b998c9c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fnk"
Apr 16 13:57:18.736196 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734759 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-etc-modprobe-d\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x"
Apr 16 13:57:18.736196 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734783 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-sys\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x"
Apr 16 13:57:18.736196 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734840 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-etc-sysconfig\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x"
Apr 16 13:57:18.736196 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734874 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9d11af49-8358-49ac-ac63-a39ae3da3f3c-system-cni-dir\") pod \"multus-additional-cni-plugins-mj78x\" (UID: \"9d11af49-8358-49ac-ac63-a39ae3da3f3c\") " pod="openshift-multus/multus-additional-cni-plugins-mj78x"
Apr 16 13:57:18.736196 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734918 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9d11af49-8358-49ac-ac63-a39ae3da3f3c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mj78x\" (UID: \"9d11af49-8358-49ac-ac63-a39ae3da3f3c\") " pod="openshift-multus/multus-additional-cni-plugins-mj78x"
Apr 16 13:57:18.736196 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734943 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/985ce2bb-b7a7-4b0c-893a-1840235f7653-etc-tuned\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x"
Apr 16 13:57:18.736196 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734967 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-host-run-k8s-cni-cncf-io\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv"
Apr 16 13:57:18.736196 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.734991 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-hostroot\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv"
Apr 16 13:57:18.736196 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.735016 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/eea23aa3-1d75-42db-b520-0bce4b998c9c-etc-selinux\") pod \"aws-ebs-csi-driver-node-j2fnk\" (UID: \"eea23aa3-1d75-42db-b520-0bce4b998c9c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fnk"
Apr 16 13:57:18.736196 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.735036 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-host\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x"
Apr 16 13:57:18.736196 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.735057 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-cnibin\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv"
Apr 16 13:57:18.736196 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.735146 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 13:57:18.736196 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.735336 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 13:57:18.736196 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.735501 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 13:57:18.736196 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.735593 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 13:57:18.737123 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.735870 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 13:57:18.737123 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.736666 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-9c8gz\""
Apr 16 13:57:18.737123 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.736698 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 13:57:18.737123 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.736717 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-k9nz4\""
Apr 16 13:57:18.737123 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.736648 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 13:57:18.783728 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.783693 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:52:17 +0000 UTC" deadline="2028-01-26 03:40:15.94768586 +0000 UTC"
Apr 16 13:57:18.783728 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.783724 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15589h42m57.163965701s"
Apr 16 13:57:18.827237 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.827195 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 13:57:18.836106 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836072 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9d11af49-8358-49ac-ac63-a39ae3da3f3c-os-release\") pod \"multus-additional-cni-plugins-mj78x\" (UID: \"9d11af49-8358-49ac-ac63-a39ae3da3f3c\") " pod="openshift-multus/multus-additional-cni-plugins-mj78x"
Apr 16 13:57:18.836282 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836123 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/af33c15c-6386-4fe7-9155-a5c6b6e05ec4-tmp-dir\") pod \"node-resolver-h7pm6\" (UID: \"af33c15c-6386-4fe7-9155-a5c6b6e05ec4\") " pod="openshift-dns/node-resolver-h7pm6"
Apr 16 13:57:18.836282 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836152 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-host-kubelet\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s"
Apr 16 13:57:18.836282 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836181 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eea23aa3-1d75-42db-b520-0bce4b998c9c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-j2fnk\" (UID: \"eea23aa3-1d75-42db-b520-0bce4b998c9c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fnk"
Apr 16 13:57:18.836282 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836206 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/eea23aa3-1d75-42db-b520-0bce4b998c9c-sys-fs\") pod \"aws-ebs-csi-driver-node-j2fnk\" (UID: \"eea23aa3-1d75-42db-b520-0bce4b998c9c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fnk"
Apr 16 13:57:18.836282 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836230 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-etc-modprobe-d\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x"
Apr 16 13:57:18.836282 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836256 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-sys\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x"
Apr 16 13:57:18.836282 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836283 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-run-systemd\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s"
Apr 16 13:57:18.836619 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836313 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-ovn-node-metrics-cert\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s"
Apr 16 13:57:18.836619 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836347 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh7mr\" (UniqueName: \"kubernetes.io/projected/af33c15c-6386-4fe7-9155-a5c6b6e05ec4-kube-api-access-xh7mr\") pod \"node-resolver-h7pm6\" (UID: \"af33c15c-6386-4fe7-9155-a5c6b6e05ec4\") " pod="openshift-dns/node-resolver-h7pm6"
Apr 16 13:57:18.836619 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836375 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-etc-sysconfig\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x"
Apr 16 13:57:18.836619 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836403 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9d11af49-8358-49ac-ac63-a39ae3da3f3c-system-cni-dir\") pod \"multus-additional-cni-plugins-mj78x\" (UID: \"9d11af49-8358-49ac-ac63-a39ae3da3f3c\") " pod="openshift-multus/multus-additional-cni-plugins-mj78x"
Apr 16 13:57:18.836619 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836430 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9d11af49-8358-49ac-ac63-a39ae3da3f3c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mj78x\" (UID: \"9d11af49-8358-49ac-ac63-a39ae3da3f3c\") " pod="openshift-multus/multus-additional-cni-plugins-mj78x"
Apr 16 13:57:18.836619 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836460 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/985ce2bb-b7a7-4b0c-893a-1840235f7653-etc-tuned\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x"
Apr 16 13:57:18.836619 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836483 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-run-ovn\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s"
Apr 16 13:57:18.836619 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836508 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-host-run-ovn-kubernetes\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s"
Apr 16 13:57:18.836619 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836530 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-host-cni-bin\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s"
Apr 16 13:57:18.836619 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836556 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-sys\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x"
Apr 16 13:57:18.836619 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836558 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-host-run-k8s-cni-cncf-io\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv"
Apr 16 13:57:18.836619 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836595 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-host-run-k8s-cni-cncf-io\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv"
Apr 16 13:57:18.836619 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836606 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-hostroot\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv"
Apr 16 13:57:18.836619 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836616 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9d11af49-8358-49ac-ac63-a39ae3da3f3c-system-cni-dir\") pod \"multus-additional-cni-plugins-mj78x\" (UID: \"9d11af49-8358-49ac-ac63-a39ae3da3f3c\") " pod="openshift-multus/multus-additional-cni-plugins-mj78x"
Apr 16 13:57:18.837339 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836633 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-etc-openvswitch\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s"
Apr 16 13:57:18.837339 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836660 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-run-openvswitch\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s"
Apr 16 13:57:18.837339 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836691 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-etc-sysconfig\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x"
Apr 16 13:57:18.837339 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836700 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-env-overrides\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s"
Apr 16 13:57:18.837339 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836717 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eea23aa3-1d75-42db-b520-0bce4b998c9c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-j2fnk\" (UID: \"eea23aa3-1d75-42db-b520-0bce4b998c9c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fnk"
Apr 16 13:57:18.837339 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836578 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9d11af49-8358-49ac-ac63-a39ae3da3f3c-os-release\") pod \"multus-additional-cni-plugins-mj78x\" (UID: \"9d11af49-8358-49ac-ac63-a39ae3da3f3c\") " pod="openshift-multus/multus-additional-cni-plugins-mj78x"
Apr 16 13:57:18.837339 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836751 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/eea23aa3-1d75-42db-b520-0bce4b998c9c-sys-fs\") pod \"aws-ebs-csi-driver-node-j2fnk\" (UID: \"eea23aa3-1d75-42db-b520-0bce4b998c9c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fnk"
Apr 16 13:57:18.837339 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836758 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/eea23aa3-1d75-42db-b520-0bce4b998c9c-etc-selinux\") pod \"aws-ebs-csi-driver-node-j2fnk\" (UID: \"eea23aa3-1d75-42db-b520-0bce4b998c9c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fnk"
Apr 16 13:57:18.837339 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836775 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-hostroot\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv"
Apr 16 13:57:18.837339 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836828 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/eea23aa3-1d75-42db-b520-0bce4b998c9c-etc-selinux\") pod \"aws-ebs-csi-driver-node-j2fnk\" (UID: \"eea23aa3-1d75-42db-b520-0bce4b998c9c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fnk"
Apr 16 13:57:18.837339 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836848 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-host\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x"
Apr 16 13:57:18.837339 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836873 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-cnibin\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv"
Apr 16 13:57:18.837339 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836937 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-host\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x"
Apr 16 13:57:18.837339 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836944 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-cnibin\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv"
Apr 16 13:57:18.837339 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836951 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c64hh\" (UniqueName: \"kubernetes.io/projected/eea23aa3-1d75-42db-b520-0bce4b998c9c-kube-api-access-c64hh\") pod \"aws-ebs-csi-driver-node-j2fnk\" (UID: \"eea23aa3-1d75-42db-b520-0bce4b998c9c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fnk"
Apr 16 13:57:18.837339 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836954 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-etc-modprobe-d\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x"
Apr 16 13:57:18.837339 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.836983 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-multus-cni-dir\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv"
Apr 16 13:57:18.837339 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.837008 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-os-release\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv"
Apr 16 13:57:18.838230 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.837037 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs\") pod \"network-metrics-daemon-l5zhh\" (UID: \"9ab97e35-4539-405d-bb91-f30c906963c2\") " pod="openshift-multus/network-metrics-daemon-l5zhh"
Apr 16 13:57:18.838230 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.837064 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-host-run-netns\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv"
Apr 16 13:57:18.838230 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.837085 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-multus-cni-dir\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv"
Apr 16 13:57:18.838230 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.837088 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/03875a5c-704e-42af-9ea6-ba5a9f181d94-serviceca\") pod \"node-ca-9tz2x\" (UID: \"03875a5c-704e-42af-9ea6-ba5a9f181d94\") " pod="openshift-image-registry/node-ca-9tz2x"
Apr 16 13:57:18.838230 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.837104 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-os-release\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv"
Apr 16 13:57:18.838230 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.837138 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s"
Apr 16 13:57:18.838230 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.837160 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-host-run-netns\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv"
Apr 16 13:57:18.838230 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.837159 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 13:57:18.838230 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.837168 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9d11af49-8358-49ac-ac63-a39ae3da3f3c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mj78x\" (UID: \"9d11af49-8358-49ac-ac63-a39ae3da3f3c\") " pod="openshift-multus/multus-additional-cni-plugins-mj78x"
Apr 16 13:57:18.838230 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:18.837178 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:57:18.838230 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.837192 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9d11af49-8358-49ac-ac63-a39ae3da3f3c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mj78x\" (UID: \"9d11af49-8358-49ac-ac63-a39ae3da3f3c\") " pod="openshift-multus/multus-additional-cni-plugins-mj78x"
Apr 16 13:57:18.838230 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:18.837259 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs podName:9ab97e35-4539-405d-bb91-f30c906963c2 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:19.337233477 +0000 UTC m=+3.065287186 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs") pod "network-metrics-daemon-l5zhh" (UID: "9ab97e35-4539-405d-bb91-f30c906963c2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:57:18.838230 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.837280 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9d4j\" (UniqueName: \"kubernetes.io/projected/9d11af49-8358-49ac-ac63-a39ae3da3f3c-kube-api-access-l9d4j\") pod \"multus-additional-cni-plugins-mj78x\" (UID: \"9d11af49-8358-49ac-ac63-a39ae3da3f3c\") " pod="openshift-multus/multus-additional-cni-plugins-mj78x"
Apr 16 13:57:18.838230 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.837307 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-br4fr\" (UniqueName: \"kubernetes.io/projected/6e53c142-6f4e-4358-a390-6d3c43558ef6-kube-api-access-br4fr\") pod \"network-check-target-5zjkj\" (UID: \"6e53c142-6f4e-4358-a390-6d3c43558ef6\") " pod="openshift-network-diagnostics/network-check-target-5zjkj"
Apr 16 13:57:18.838230 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.837587 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/03875a5c-704e-42af-9ea6-ba5a9f181d94-serviceca\") pod \"node-ca-9tz2x\" (UID: \"03875a5c-704e-42af-9ea6-ba5a9f181d94\") " pod="openshift-image-registry/node-ca-9tz2x"
Apr 16 13:57:18.838230 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.837886 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7e94555a-67da-4de5-ace1-024c7384ada8-agent-certs\") pod \"konnectivity-agent-6r2ft\" (UID: \"7e94555a-67da-4de5-ace1-024c7384ada8\") " pod="kube-system/konnectivity-agent-6r2ft"
Apr 16 13:57:18.838230 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.837934 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-host-var-lib-cni-bin\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv"
Apr 16 13:57:18.839018 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.837960 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-host-var-lib-kubelet\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv"
Apr 16 13:57:18.839018 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.837983 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vw7xj\" (UniqueName: \"kubernetes.io/projected/03875a5c-704e-42af-9ea6-ba5a9f181d94-kube-api-access-vw7xj\") pod \"node-ca-9tz2x\" (UID: \"03875a5c-704e-42af-9ea6-ba5a9f181d94\") " pod="openshift-image-registry/node-ca-9tz2x"
Apr 16 13:57:18.839018 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.838011 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-ovnkube-script-lib\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s"
Apr 16 13:57:18.839018 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.838037 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-etc-sysctl-conf\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x"
Apr 16 13:57:18.839018 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.838083 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-var-lib-kubelet\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x"
Apr 16 13:57:18.839018 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.838104 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7g8gf\" (UniqueName: \"kubernetes.io/projected/985ce2bb-b7a7-4b0c-893a-1840235f7653-kube-api-access-7g8gf\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x"
Apr 16 13:57:18.839018 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.838128 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-var-lib-openvswitch\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s"
Apr 16 13:57:18.839018 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.838157 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-log-socket\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s"
Apr 16 13:57:18.839018 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.838179 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-host-cni-netd\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s"
Apr 16 13:57:18.839018 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.838179 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9d11af49-8358-49ac-ac63-a39ae3da3f3c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mj78x\" (UID: \"9d11af49-8358-49ac-ac63-a39ae3da3f3c\") " pod="openshift-multus/multus-additional-cni-plugins-mj78x"
Apr 16 13:57:18.839018 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.838200 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-ovnkube-config\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s"
Apr 16 13:57:18.839018 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.838241 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/eea23aa3-1d75-42db-b520-0bce4b998c9c-socket-dir\") pod \"aws-ebs-csi-driver-node-j2fnk\" (UID: \"eea23aa3-1d75-42db-b520-0bce4b998c9c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fnk"
Apr 16 13:57:18.839018 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.838269 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-etc-systemd\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x"
Apr 16 13:57:18.839018 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.838274 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-host-var-lib-cni-bin\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv"
Apr 16 13:57:18.839018 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.838292 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/985ce2bb-b7a7-4b0c-893a-1840235f7653-tmp\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x"
Apr 16 13:57:18.840696 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.838308 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-var-lib-kubelet\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x"
Apr 16 13:57:18.840696 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.838316 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-etc-kubernetes\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv"
Apr 16 13:57:18.840696 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.840440 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f95jb\" (UniqueName: \"kubernetes.io/projected/0675f452-3368-4384-83f9-0c6a166ba947-kube-api-access-f95jb\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv"
Apr 16 13:57:18.840696 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.840467 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName:
\"kubernetes.io/host-path/03875a5c-704e-42af-9ea6-ba5a9f181d94-host\") pod \"node-ca-9tz2x\" (UID: \"03875a5c-704e-42af-9ea6-ba5a9f181d94\") " pod="openshift-image-registry/node-ca-9tz2x" Apr 16 13:57:18.840696 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.840497 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-multus-socket-dir-parent\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.840696 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.840528 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-host-var-lib-cni-multus\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.840696 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.840557 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9d11af49-8358-49ac-ac63-a39ae3da3f3c-cnibin\") pod \"multus-additional-cni-plugins-mj78x\" (UID: \"9d11af49-8358-49ac-ac63-a39ae3da3f3c\") " pod="openshift-multus/multus-additional-cni-plugins-mj78x" Apr 16 13:57:18.840696 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.840587 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9d11af49-8358-49ac-ac63-a39ae3da3f3c-cni-binary-copy\") pod \"multus-additional-cni-plugins-mj78x\" (UID: \"9d11af49-8358-49ac-ac63-a39ae3da3f3c\") " pod="openshift-multus/multus-additional-cni-plugins-mj78x" Apr 16 13:57:18.840696 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.840612 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"registration-dir\" (UniqueName: \"kubernetes.io/host-path/eea23aa3-1d75-42db-b520-0bce4b998c9c-registration-dir\") pod \"aws-ebs-csi-driver-node-j2fnk\" (UID: \"eea23aa3-1d75-42db-b520-0bce4b998c9c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fnk" Apr 16 13:57:18.840696 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.840644 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-systemd-units\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.840696 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.840674 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9d11af49-8358-49ac-ac63-a39ae3da3f3c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mj78x\" (UID: \"9d11af49-8358-49ac-ac63-a39ae3da3f3c\") " pod="openshift-multus/multus-additional-cni-plugins-mj78x" Apr 16 13:57:18.843922 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.841776 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-lib-modules\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x" Apr 16 13:57:18.843922 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.841051 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-host-var-lib-cni-multus\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.843922 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.841068 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-host-var-lib-kubelet\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.843922 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.841403 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/03875a5c-704e-42af-9ea6-ba5a9f181d94-host\") pod \"node-ca-9tz2x\" (UID: \"03875a5c-704e-42af-9ea6-ba5a9f181d94\") " pod="openshift-image-registry/node-ca-9tz2x" Apr 16 13:57:18.843922 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.841847 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9d11af49-8358-49ac-ac63-a39ae3da3f3c-cnibin\") pod \"multus-additional-cni-plugins-mj78x\" (UID: \"9d11af49-8358-49ac-ac63-a39ae3da3f3c\") " pod="openshift-multus/multus-additional-cni-plugins-mj78x" Apr 16 13:57:18.843922 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.841460 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-etc-systemd\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x" Apr 16 13:57:18.843922 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.841506 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-multus-socket-dir-parent\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.843922 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.842023 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-etc-sysctl-conf\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x" Apr 16 13:57:18.843922 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.840981 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-etc-kubernetes\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.843922 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.842103 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5837407c-436b-4756-9677-8c40ff8e9059-iptables-alerter-script\") pod \"iptables-alerter-jzfpd\" (UID: \"5837407c-436b-4756-9677-8c40ff8e9059\") " pod="openshift-network-operator/iptables-alerter-jzfpd" Apr 16 13:57:18.843922 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.842143 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0675f452-3368-4384-83f9-0c6a166ba947-multus-daemon-config\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.843922 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.842161 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9d11af49-8358-49ac-ac63-a39ae3da3f3c-cni-binary-copy\") pod \"multus-additional-cni-plugins-mj78x\" (UID: \"9d11af49-8358-49ac-ac63-a39ae3da3f3c\") " pod="openshift-multus/multus-additional-cni-plugins-mj78x" Apr 16 13:57:18.843922 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.842181 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-l2cg4\" (UniqueName: \"kubernetes.io/projected/9ab97e35-4539-405d-bb91-f30c906963c2-kube-api-access-l2cg4\") pod \"network-metrics-daemon-l5zhh\" (UID: \"9ab97e35-4539-405d-bb91-f30c906963c2\") " pod="openshift-multus/network-metrics-daemon-l5zhh" Apr 16 13:57:18.843922 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.842207 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-lib-modules\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x" Apr 16 13:57:18.843922 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.841405 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/eea23aa3-1d75-42db-b520-0bce4b998c9c-socket-dir\") pod \"aws-ebs-csi-driver-node-j2fnk\" (UID: \"eea23aa3-1d75-42db-b520-0bce4b998c9c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fnk" Apr 16 13:57:18.843922 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.842233 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/af33c15c-6386-4fe7-9155-a5c6b6e05ec4-hosts-file\") pod \"node-resolver-h7pm6\" (UID: \"af33c15c-6386-4fe7-9155-a5c6b6e05ec4\") " pod="openshift-dns/node-resolver-h7pm6" Apr 16 13:57:18.843922 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.842330 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/eea23aa3-1d75-42db-b520-0bce4b998c9c-registration-dir\") pod \"aws-ebs-csi-driver-node-j2fnk\" (UID: \"eea23aa3-1d75-42db-b520-0bce4b998c9c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fnk" Apr 16 13:57:18.844717 ip-10-0-131-61 
kubenswrapper[2575]: I0416 13:57:18.842439 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-host-run-netns\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.844717 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.842483 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-run\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x" Apr 16 13:57:18.844717 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.842512 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7e94555a-67da-4de5-ace1-024c7384ada8-konnectivity-ca\") pod \"konnectivity-agent-6r2ft\" (UID: \"7e94555a-67da-4de5-ace1-024c7384ada8\") " pod="kube-system/konnectivity-agent-6r2ft" Apr 16 13:57:18.844717 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.842546 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f655v\" (UniqueName: \"kubernetes.io/projected/5837407c-436b-4756-9677-8c40ff8e9059-kube-api-access-f655v\") pod \"iptables-alerter-jzfpd\" (UID: \"5837407c-436b-4756-9677-8c40ff8e9059\") " pod="openshift-network-operator/iptables-alerter-jzfpd" Apr 16 13:57:18.844717 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.842589 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-system-cni-dir\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.844717 
ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.842616 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-multus-conf-dir\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.844717 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.842653 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-host-run-multus-certs\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.844717 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.842684 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-host-slash\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.844717 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.842711 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-node-log\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.844717 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.842755 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/eea23aa3-1d75-42db-b520-0bce4b998c9c-device-dir\") pod \"aws-ebs-csi-driver-node-j2fnk\" (UID: \"eea23aa3-1d75-42db-b520-0bce4b998c9c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fnk" 
Apr 16 13:57:18.844717 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.842787 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5837407c-436b-4756-9677-8c40ff8e9059-host-slash\") pod \"iptables-alerter-jzfpd\" (UID: \"5837407c-436b-4756-9677-8c40ff8e9059\") " pod="openshift-network-operator/iptables-alerter-jzfpd" Apr 16 13:57:18.844717 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.842789 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0675f452-3368-4384-83f9-0c6a166ba947-multus-daemon-config\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.844717 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.842833 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5837407c-436b-4756-9677-8c40ff8e9059-iptables-alerter-script\") pod \"iptables-alerter-jzfpd\" (UID: \"5837407c-436b-4756-9677-8c40ff8e9059\") " pod="openshift-network-operator/iptables-alerter-jzfpd" Apr 16 13:57:18.844717 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.842852 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0675f452-3368-4384-83f9-0c6a166ba947-cni-binary-copy\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.844717 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.842882 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-run\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x" Apr 16 13:57:18.844717 
ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.842912 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhv89\" (UniqueName: \"kubernetes.io/projected/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-kube-api-access-bhv89\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.844717 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.842957 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9d11af49-8358-49ac-ac63-a39ae3da3f3c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mj78x\" (UID: \"9d11af49-8358-49ac-ac63-a39ae3da3f3c\") " pod="openshift-multus/multus-additional-cni-plugins-mj78x" Apr 16 13:57:18.845450 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.843026 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5837407c-436b-4756-9677-8c40ff8e9059-host-slash\") pod \"iptables-alerter-jzfpd\" (UID: \"5837407c-436b-4756-9677-8c40ff8e9059\") " pod="openshift-network-operator/iptables-alerter-jzfpd" Apr 16 13:57:18.845450 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.843268 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-etc-kubernetes\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x" Apr 16 13:57:18.845450 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.843329 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-etc-sysctl-d\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " 
pod="openshift-cluster-node-tuning-operator/tuned-z8t6x" Apr 16 13:57:18.845450 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.843522 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-etc-sysctl-d\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x" Apr 16 13:57:18.845450 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.843735 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-system-cni-dir\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.845450 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.843802 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-multus-conf-dir\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.845450 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.843851 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0675f452-3368-4384-83f9-0c6a166ba947-host-run-multus-certs\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.845450 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.843971 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/eea23aa3-1d75-42db-b520-0bce4b998c9c-device-dir\") pod \"aws-ebs-csi-driver-node-j2fnk\" (UID: \"eea23aa3-1d75-42db-b520-0bce4b998c9c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fnk" 
Apr 16 13:57:18.845450 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.844037 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/985ce2bb-b7a7-4b0c-893a-1840235f7653-etc-kubernetes\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x" Apr 16 13:57:18.845450 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.844053 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0675f452-3368-4384-83f9-0c6a166ba947-cni-binary-copy\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.845450 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.844541 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7e94555a-67da-4de5-ace1-024c7384ada8-agent-certs\") pod \"konnectivity-agent-6r2ft\" (UID: \"7e94555a-67da-4de5-ace1-024c7384ada8\") " pod="kube-system/konnectivity-agent-6r2ft" Apr 16 13:57:18.846304 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.846150 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7e94555a-67da-4de5-ace1-024c7384ada8-konnectivity-ca\") pod \"konnectivity-agent-6r2ft\" (UID: \"7e94555a-67da-4de5-ace1-024c7384ada8\") " pod="kube-system/konnectivity-agent-6r2ft" Apr 16 13:57:18.848721 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:18.848439 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:57:18.848721 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:18.848463 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:57:18.848721 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:18.848478 2575 projected.go:194] Error preparing data for projected volume kube-api-access-br4fr for pod openshift-network-diagnostics/network-check-target-5zjkj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:57:18.848721 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:18.848551 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e53c142-6f4e-4358-a390-6d3c43558ef6-kube-api-access-br4fr podName:6e53c142-6f4e-4358-a390-6d3c43558ef6 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:19.348531967 +0000 UTC m=+3.076585684 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-br4fr" (UniqueName: "kubernetes.io/projected/6e53c142-6f4e-4358-a390-6d3c43558ef6-kube-api-access-br4fr") pod "network-check-target-5zjkj" (UID: "6e53c142-6f4e-4358-a390-6d3c43558ef6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:57:18.849467 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.849423 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/985ce2bb-b7a7-4b0c-893a-1840235f7653-etc-tuned\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x" Apr 16 13:57:18.850724 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.850597 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/985ce2bb-b7a7-4b0c-893a-1840235f7653-tmp\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " 
pod="openshift-cluster-node-tuning-operator/tuned-z8t6x" Apr 16 13:57:18.852673 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.852643 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c64hh\" (UniqueName: \"kubernetes.io/projected/eea23aa3-1d75-42db-b520-0bce4b998c9c-kube-api-access-c64hh\") pod \"aws-ebs-csi-driver-node-j2fnk\" (UID: \"eea23aa3-1d75-42db-b520-0bce4b998c9c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fnk" Apr 16 13:57:18.854221 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.854202 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2cg4\" (UniqueName: \"kubernetes.io/projected/9ab97e35-4539-405d-bb91-f30c906963c2-kube-api-access-l2cg4\") pod \"network-metrics-daemon-l5zhh\" (UID: \"9ab97e35-4539-405d-bb91-f30c906963c2\") " pod="openshift-multus/network-metrics-daemon-l5zhh" Apr 16 13:57:18.855351 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.855326 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9d4j\" (UniqueName: \"kubernetes.io/projected/9d11af49-8358-49ac-ac63-a39ae3da3f3c-kube-api-access-l9d4j\") pod \"multus-additional-cni-plugins-mj78x\" (UID: \"9d11af49-8358-49ac-ac63-a39ae3da3f3c\") " pod="openshift-multus/multus-additional-cni-plugins-mj78x" Apr 16 13:57:18.855668 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.855637 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g8gf\" (UniqueName: \"kubernetes.io/projected/985ce2bb-b7a7-4b0c-893a-1840235f7653-kube-api-access-7g8gf\") pod \"tuned-z8t6x\" (UID: \"985ce2bb-b7a7-4b0c-893a-1840235f7653\") " pod="openshift-cluster-node-tuning-operator/tuned-z8t6x" Apr 16 13:57:18.857011 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.856797 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f655v\" (UniqueName: 
\"kubernetes.io/projected/5837407c-436b-4756-9677-8c40ff8e9059-kube-api-access-f655v\") pod \"iptables-alerter-jzfpd\" (UID: \"5837407c-436b-4756-9677-8c40ff8e9059\") " pod="openshift-network-operator/iptables-alerter-jzfpd" Apr 16 13:57:18.857011 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.856930 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-61.ec2.internal" event={"ID":"aeb1bbb91159af60fd67f2df28938806","Type":"ContainerStarted","Data":"92b5c4e5d663673e4bb8abc476faf4556cd9f330d5b3cf0d80d8bb246dbfcdc7"} Apr 16 13:57:18.857491 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.857462 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw7xj\" (UniqueName: \"kubernetes.io/projected/03875a5c-704e-42af-9ea6-ba5a9f181d94-kube-api-access-vw7xj\") pod \"node-ca-9tz2x\" (UID: \"03875a5c-704e-42af-9ea6-ba5a9f181d94\") " pod="openshift-image-registry/node-ca-9tz2x" Apr 16 13:57:18.857662 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.857645 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f95jb\" (UniqueName: \"kubernetes.io/projected/0675f452-3368-4384-83f9-0c6a166ba947-kube-api-access-f95jb\") pod \"multus-4wgkv\" (UID: \"0675f452-3368-4384-83f9-0c6a166ba947\") " pod="openshift-multus/multus-4wgkv" Apr 16 13:57:18.857883 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.857861 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-61.ec2.internal" event={"ID":"162c9ac6c39f8264db22914a3448ec16","Type":"ContainerStarted","Data":"c587e565d676f212e211cba1adfd280d3cfea936590e2ca2df415b8c3a1bf654"} Apr 16 13:57:18.944419 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.944378 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.944583 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.944440 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-ovnkube-script-lib\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.944583 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.944469 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-var-lib-openvswitch\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.944583 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.944493 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-log-socket\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.944583 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.944516 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-host-cni-netd\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.944583 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.944541 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-ovnkube-config\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.944583 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.944538 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.944583 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.944577 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-systemd-units\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.944951 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.944607 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-log-socket\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.944951 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.944605 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/af33c15c-6386-4fe7-9155-a5c6b6e05ec4-hosts-file\") pod \"node-resolver-h7pm6\" (UID: \"af33c15c-6386-4fe7-9155-a5c6b6e05ec4\") " pod="openshift-dns/node-resolver-h7pm6" Apr 16 13:57:18.944951 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.944646 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-var-lib-openvswitch\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.944951 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.944650 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-host-run-netns\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.944951 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.944694 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-host-slash\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.944951 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.944699 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-host-run-netns\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.944951 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.944722 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-node-log\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.944951 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.944747 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-host-cni-netd\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.944951 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.944750 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bhv89\" (UniqueName: \"kubernetes.io/projected/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-kube-api-access-bhv89\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.944951 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.944778 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/af33c15c-6386-4fe7-9155-a5c6b6e05ec4-tmp-dir\") pod \"node-resolver-h7pm6\" (UID: \"af33c15c-6386-4fe7-9155-a5c6b6e05ec4\") " pod="openshift-dns/node-resolver-h7pm6" Apr 16 13:57:18.944951 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.944801 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-host-kubelet\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.944951 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.944827 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-run-systemd\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.944951 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.944850 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-ovn-node-metrics-cert\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.944951 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.944874 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xh7mr\" (UniqueName: \"kubernetes.io/projected/af33c15c-6386-4fe7-9155-a5c6b6e05ec4-kube-api-access-xh7mr\") pod \"node-resolver-h7pm6\" (UID: \"af33c15c-6386-4fe7-9155-a5c6b6e05ec4\") " pod="openshift-dns/node-resolver-h7pm6" Apr 16 13:57:18.944951 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.944933 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-run-ovn\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.944951 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.944959 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-host-run-ovn-kubernetes\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.945658 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.944985 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-host-cni-bin\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.945658 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.945017 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-etc-openvswitch\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.945658 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.945055 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-run-openvswitch\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.945658 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.945081 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-env-overrides\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.945658 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.945162 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-ovnkube-script-lib\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.945658 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.945276 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-host-run-ovn-kubernetes\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.945658 ip-10-0-131-61 kubenswrapper[2575]: 
I0416 13:57:18.945354 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/af33c15c-6386-4fe7-9155-a5c6b6e05ec4-hosts-file\") pod \"node-resolver-h7pm6\" (UID: \"af33c15c-6386-4fe7-9155-a5c6b6e05ec4\") " pod="openshift-dns/node-resolver-h7pm6" Apr 16 13:57:18.945658 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.945498 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-env-overrides\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.945658 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.945549 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-run-ovn\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.945658 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.945292 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-ovnkube-config\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.945658 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.945613 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-systemd-units\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.945658 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.945656 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-host-cni-bin\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.946321 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.945678 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-node-log\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.946321 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.945725 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-run-systemd\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.946321 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.945711 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-etc-openvswitch\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.946321 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.945662 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-host-slash\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.946321 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.945765 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-run-openvswitch\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.946321 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.945831 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-host-kubelet\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.946321 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.945969 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/af33c15c-6386-4fe7-9155-a5c6b6e05ec4-tmp-dir\") pod \"node-resolver-h7pm6\" (UID: \"af33c15c-6386-4fe7-9155-a5c6b6e05ec4\") " pod="openshift-dns/node-resolver-h7pm6" Apr 16 13:57:18.948276 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.948241 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-ovn-node-metrics-cert\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.955816 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.955779 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhv89\" (UniqueName: \"kubernetes.io/projected/ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff-kube-api-access-bhv89\") pod \"ovnkube-node-w7c7s\" (UID: \"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff\") " pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:18.956646 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:18.956511 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh7mr\" 
(UniqueName: \"kubernetes.io/projected/af33c15c-6386-4fe7-9155-a5c6b6e05ec4-kube-api-access-xh7mr\") pod \"node-resolver-h7pm6\" (UID: \"af33c15c-6386-4fe7-9155-a5c6b6e05ec4\") " pod="openshift-dns/node-resolver-h7pm6" Apr 16 13:57:19.031515 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:19.031476 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9tz2x" Apr 16 13:57:19.040413 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:19.040381 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-z8t6x" Apr 16 13:57:19.050437 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:19.050344 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6r2ft" Apr 16 13:57:19.057026 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:19.056997 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mj78x" Apr 16 13:57:19.064760 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:19.064733 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fnk" Apr 16 13:57:19.073657 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:19.073628 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-jzfpd" Apr 16 13:57:19.081405 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:19.081375 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4wgkv" Apr 16 13:57:19.090173 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:19.090142 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:57:19.094598 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:19.094575 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:57:19.097444 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:19.097420 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-h7pm6" Apr 16 13:57:19.113209 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:19.113187 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:57:19.348479 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:19.348441 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs\") pod \"network-metrics-daemon-l5zhh\" (UID: \"9ab97e35-4539-405d-bb91-f30c906963c2\") " pod="openshift-multus/network-metrics-daemon-l5zhh" Apr 16 13:57:19.348644 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:19.348621 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:57:19.348722 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:19.348708 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs podName:9ab97e35-4539-405d-bb91-f30c906963c2 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:20.348686864 +0000 UTC m=+4.076740570 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs") pod "network-metrics-daemon-l5zhh" (UID: "9ab97e35-4539-405d-bb91-f30c906963c2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:57:19.449178 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:19.449138 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-br4fr\" (UniqueName: \"kubernetes.io/projected/6e53c142-6f4e-4358-a390-6d3c43558ef6-kube-api-access-br4fr\") pod \"network-check-target-5zjkj\" (UID: \"6e53c142-6f4e-4358-a390-6d3c43558ef6\") " pod="openshift-network-diagnostics/network-check-target-5zjkj" Apr 16 13:57:19.449330 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:19.449312 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:57:19.449384 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:19.449338 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:57:19.449384 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:19.449355 2575 projected.go:194] Error preparing data for projected volume kube-api-access-br4fr for pod openshift-network-diagnostics/network-check-target-5zjkj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:57:19.449450 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:19.449413 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e53c142-6f4e-4358-a390-6d3c43558ef6-kube-api-access-br4fr podName:6e53c142-6f4e-4358-a390-6d3c43558ef6 nodeName:}" failed. 
No retries permitted until 2026-04-16 13:57:20.449398645 +0000 UTC m=+4.177452347 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-br4fr" (UniqueName: "kubernetes.io/projected/6e53c142-6f4e-4358-a390-6d3c43558ef6-kube-api-access-br4fr") pod "network-check-target-5zjkj" (UID: "6e53c142-6f4e-4358-a390-6d3c43558ef6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:57:19.784038 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:19.783989 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:52:17 +0000 UTC" deadline="2027-09-14 07:43:51.690954587 +0000 UTC" Apr 16 13:57:19.784038 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:19.784038 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12377h46m31.906931027s" Apr 16 13:57:19.855789 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:19.855760 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0675f452_3368_4384_83f9_0c6a166ba947.slice/crio-9e2d1fb6700f98a80e3dc84ceb76025f0fa189df4d311dd7a8a1646839effa69 WatchSource:0}: Error finding container 9e2d1fb6700f98a80e3dc84ceb76025f0fa189df4d311dd7a8a1646839effa69: Status 404 returned error can't find the container with id 9e2d1fb6700f98a80e3dc84ceb76025f0fa189df4d311dd7a8a1646839effa69 Apr 16 13:57:19.857025 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:19.856999 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded8d29e9_b9f2_47f1_9d73_e5c3d0d2cdff.slice/crio-34486492c9e232f8a0015733c70021ee89b81f360827a1705de22e3856ab454e WatchSource:0}: Error finding container 
34486492c9e232f8a0015733c70021ee89b81f360827a1705de22e3856ab454e: Status 404 returned error can't find the container with id 34486492c9e232f8a0015733c70021ee89b81f360827a1705de22e3856ab454e Apr 16 13:57:19.858392 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:19.858365 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5837407c_436b_4756_9677_8c40ff8e9059.slice/crio-4a36519f677976ef1cc54e562c41e33933e33717a0d7b58d2175727415c9a76c WatchSource:0}: Error finding container 4a36519f677976ef1cc54e562c41e33933e33717a0d7b58d2175727415c9a76c: Status 404 returned error can't find the container with id 4a36519f677976ef1cc54e562c41e33933e33717a0d7b58d2175727415c9a76c Apr 16 13:57:19.860075 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:19.860050 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e94555a_67da_4de5_ace1_024c7384ada8.slice/crio-c9c2830cb2375ad7c7956f41a4e56e28f439b33969de6ba9af863cbb8a0a856a WatchSource:0}: Error finding container c9c2830cb2375ad7c7956f41a4e56e28f439b33969de6ba9af863cbb8a0a856a: Status 404 returned error can't find the container with id c9c2830cb2375ad7c7956f41a4e56e28f439b33969de6ba9af863cbb8a0a856a Apr 16 13:57:19.860320 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:19.860225 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" event={"ID":"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff","Type":"ContainerStarted","Data":"34486492c9e232f8a0015733c70021ee89b81f360827a1705de22e3856ab454e"} Apr 16 13:57:19.861589 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:19.861560 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4wgkv" event={"ID":"0675f452-3368-4384-83f9-0c6a166ba947","Type":"ContainerStarted","Data":"9e2d1fb6700f98a80e3dc84ceb76025f0fa189df4d311dd7a8a1646839effa69"} Apr 16 13:57:19.863613 
ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:19.863295 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d11af49_8358_49ac_ac63_a39ae3da3f3c.slice/crio-dd02f54aeb88fdaca1de298c18e4f92a67fa10c948eb6a4f3c1573e4debe718f WatchSource:0}: Error finding container dd02f54aeb88fdaca1de298c18e4f92a67fa10c948eb6a4f3c1573e4debe718f: Status 404 returned error can't find the container with id dd02f54aeb88fdaca1de298c18e4f92a67fa10c948eb6a4f3c1573e4debe718f Apr 16 13:57:19.863925 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:19.863867 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeea23aa3_1d75_42db_b520_0bce4b998c9c.slice/crio-49843a19c77c34ebbd29604be3b51c455d433044a5320bff5678900a57b8c640 WatchSource:0}: Error finding container 49843a19c77c34ebbd29604be3b51c455d433044a5320bff5678900a57b8c640: Status 404 returned error can't find the container with id 49843a19c77c34ebbd29604be3b51c455d433044a5320bff5678900a57b8c640 Apr 16 13:57:19.864694 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:19.864589 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf33c15c_6386_4fe7_9155_a5c6b6e05ec4.slice/crio-9d75f7cbb5fb0422a659221266b09a49aa12c679ce24f6aaa4a8ed3b859da998 WatchSource:0}: Error finding container 9d75f7cbb5fb0422a659221266b09a49aa12c679ce24f6aaa4a8ed3b859da998: Status 404 returned error can't find the container with id 9d75f7cbb5fb0422a659221266b09a49aa12c679ce24f6aaa4a8ed3b859da998 Apr 16 13:57:19.866416 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:19.866263 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03875a5c_704e_42af_9ea6_ba5a9f181d94.slice/crio-09a56e15f93adb78ed4cc4337b366ae925a2f93f00c1c33c32a753e1d58b84b2 WatchSource:0}: Error 
finding container 09a56e15f93adb78ed4cc4337b366ae925a2f93f00c1c33c32a753e1d58b84b2: Status 404 returned error can't find the container with id 09a56e15f93adb78ed4cc4337b366ae925a2f93f00c1c33c32a753e1d58b84b2 Apr 16 13:57:19.867538 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:57:19.867509 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod985ce2bb_b7a7_4b0c_893a_1840235f7653.slice/crio-d60cf79e92739c165d352c7810b993ecc8a28e3527d985240666927336b1e7bd WatchSource:0}: Error finding container d60cf79e92739c165d352c7810b993ecc8a28e3527d985240666927336b1e7bd: Status 404 returned error can't find the container with id d60cf79e92739c165d352c7810b993ecc8a28e3527d985240666927336b1e7bd Apr 16 13:57:20.354795 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:20.354732 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs\") pod \"network-metrics-daemon-l5zhh\" (UID: \"9ab97e35-4539-405d-bb91-f30c906963c2\") " pod="openshift-multus/network-metrics-daemon-l5zhh" Apr 16 13:57:20.354976 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:20.354874 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:57:20.354976 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:20.354954 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs podName:9ab97e35-4539-405d-bb91-f30c906963c2 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:22.354937614 +0000 UTC m=+6.082991317 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs") pod "network-metrics-daemon-l5zhh" (UID: "9ab97e35-4539-405d-bb91-f30c906963c2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:57:20.455769 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:20.455714 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-br4fr\" (UniqueName: \"kubernetes.io/projected/6e53c142-6f4e-4358-a390-6d3c43558ef6-kube-api-access-br4fr\") pod \"network-check-target-5zjkj\" (UID: \"6e53c142-6f4e-4358-a390-6d3c43558ef6\") " pod="openshift-network-diagnostics/network-check-target-5zjkj" Apr 16 13:57:20.455951 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:20.455859 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:57:20.455951 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:20.455905 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:57:20.455951 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:20.455918 2575 projected.go:194] Error preparing data for projected volume kube-api-access-br4fr for pod openshift-network-diagnostics/network-check-target-5zjkj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:57:20.456106 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:20.455973 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e53c142-6f4e-4358-a390-6d3c43558ef6-kube-api-access-br4fr podName:6e53c142-6f4e-4358-a390-6d3c43558ef6 nodeName:}" failed. 
No retries permitted until 2026-04-16 13:57:22.455953297 +0000 UTC m=+6.184007000 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-br4fr" (UniqueName: "kubernetes.io/projected/6e53c142-6f4e-4358-a390-6d3c43558ef6-kube-api-access-br4fr") pod "network-check-target-5zjkj" (UID: "6e53c142-6f4e-4358-a390-6d3c43558ef6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:57:20.850935 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:20.850874 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l5zhh" Apr 16 13:57:20.851404 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:20.851033 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l5zhh" podUID="9ab97e35-4539-405d-bb91-f30c906963c2" Apr 16 13:57:20.851733 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:20.851578 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zjkj" Apr 16 13:57:20.851733 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:20.851685 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5zjkj" podUID="6e53c142-6f4e-4358-a390-6d3c43558ef6" Apr 16 13:57:20.865087 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:20.865056 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mj78x" event={"ID":"9d11af49-8358-49ac-ac63-a39ae3da3f3c","Type":"ContainerStarted","Data":"dd02f54aeb88fdaca1de298c18e4f92a67fa10c948eb6a4f3c1573e4debe718f"} Apr 16 13:57:20.867750 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:20.867718 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6r2ft" event={"ID":"7e94555a-67da-4de5-ace1-024c7384ada8","Type":"ContainerStarted","Data":"c9c2830cb2375ad7c7956f41a4e56e28f439b33969de6ba9af863cbb8a0a856a"} Apr 16 13:57:20.869713 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:20.869684 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jzfpd" event={"ID":"5837407c-436b-4756-9677-8c40ff8e9059","Type":"ContainerStarted","Data":"4a36519f677976ef1cc54e562c41e33933e33717a0d7b58d2175727415c9a76c"} Apr 16 13:57:20.873578 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:20.873530 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-61.ec2.internal" event={"ID":"aeb1bbb91159af60fd67f2df28938806","Type":"ContainerStarted","Data":"df213cfdf9bf08d7be857fa8d041d777a92ee957fa86a70fb36300c74791e03a"} Apr 16 13:57:20.875917 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:20.875883 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9tz2x" event={"ID":"03875a5c-704e-42af-9ea6-ba5a9f181d94","Type":"ContainerStarted","Data":"09a56e15f93adb78ed4cc4337b366ae925a2f93f00c1c33c32a753e1d58b84b2"} Apr 16 13:57:20.877026 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:20.877004 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-h7pm6" 
event={"ID":"af33c15c-6386-4fe7-9155-a5c6b6e05ec4","Type":"ContainerStarted","Data":"9d75f7cbb5fb0422a659221266b09a49aa12c679ce24f6aaa4a8ed3b859da998"} Apr 16 13:57:20.879787 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:20.879766 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fnk" event={"ID":"eea23aa3-1d75-42db-b520-0bce4b998c9c","Type":"ContainerStarted","Data":"49843a19c77c34ebbd29604be3b51c455d433044a5320bff5678900a57b8c640"} Apr 16 13:57:20.881733 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:20.881707 2575 generic.go:358] "Generic (PLEG): container finished" podID="162c9ac6c39f8264db22914a3448ec16" containerID="8def03a17c86d3bc27bb28c147d1ea2d4ed1049ffbd81cd7570aa71cbd4f7c6c" exitCode=0 Apr 16 13:57:20.881825 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:20.881775 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-61.ec2.internal" event={"ID":"162c9ac6c39f8264db22914a3448ec16","Type":"ContainerDied","Data":"8def03a17c86d3bc27bb28c147d1ea2d4ed1049ffbd81cd7570aa71cbd4f7c6c"} Apr 16 13:57:20.884247 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:20.884227 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-z8t6x" event={"ID":"985ce2bb-b7a7-4b0c-893a-1840235f7653","Type":"ContainerStarted","Data":"d60cf79e92739c165d352c7810b993ecc8a28e3527d985240666927336b1e7bd"} Apr 16 13:57:20.888935 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:20.888375 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-61.ec2.internal" podStartSLOduration=3.888361531 podStartE2EDuration="3.888361531s" podCreationTimestamp="2026-04-16 13:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:57:20.887751554 +0000 UTC 
m=+4.615805281" watchObservedRunningTime="2026-04-16 13:57:20.888361531 +0000 UTC m=+4.616415258" Apr 16 13:57:21.892769 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:21.892734 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-61.ec2.internal" event={"ID":"162c9ac6c39f8264db22914a3448ec16","Type":"ContainerStarted","Data":"4008467e7c688cac6c471f43c9bd5fb9d290b031e50d6763967885f81769bc99"} Apr 16 13:57:22.372163 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:22.372123 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs\") pod \"network-metrics-daemon-l5zhh\" (UID: \"9ab97e35-4539-405d-bb91-f30c906963c2\") " pod="openshift-multus/network-metrics-daemon-l5zhh" Apr 16 13:57:22.372389 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:22.372286 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:57:22.372478 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:22.372467 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs podName:9ab97e35-4539-405d-bb91-f30c906963c2 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:26.372443829 +0000 UTC m=+10.100497544 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs") pod "network-metrics-daemon-l5zhh" (UID: "9ab97e35-4539-405d-bb91-f30c906963c2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:57:22.472640 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:22.472599 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-br4fr\" (UniqueName: \"kubernetes.io/projected/6e53c142-6f4e-4358-a390-6d3c43558ef6-kube-api-access-br4fr\") pod \"network-check-target-5zjkj\" (UID: \"6e53c142-6f4e-4358-a390-6d3c43558ef6\") " pod="openshift-network-diagnostics/network-check-target-5zjkj" Apr 16 13:57:22.472832 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:22.472777 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:57:22.472832 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:22.472798 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:57:22.472832 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:22.472813 2575 projected.go:194] Error preparing data for projected volume kube-api-access-br4fr for pod openshift-network-diagnostics/network-check-target-5zjkj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:57:22.473035 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:22.472879 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e53c142-6f4e-4358-a390-6d3c43558ef6-kube-api-access-br4fr podName:6e53c142-6f4e-4358-a390-6d3c43558ef6 nodeName:}" failed. 
No retries permitted until 2026-04-16 13:57:26.472857778 +0000 UTC m=+10.200911481 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-br4fr" (UniqueName: "kubernetes.io/projected/6e53c142-6f4e-4358-a390-6d3c43558ef6-kube-api-access-br4fr") pod "network-check-target-5zjkj" (UID: "6e53c142-6f4e-4358-a390-6d3c43558ef6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:57:22.854777 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:22.854749 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zjkj" Apr 16 13:57:22.854987 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:22.854870 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5zjkj" podUID="6e53c142-6f4e-4358-a390-6d3c43558ef6" Apr 16 13:57:22.854987 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:22.854967 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l5zhh" Apr 16 13:57:22.855109 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:22.855053 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l5zhh" podUID="9ab97e35-4539-405d-bb91-f30c906963c2" Apr 16 13:57:24.850954 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:24.850912 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zjkj" Apr 16 13:57:24.850954 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:24.850955 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l5zhh" Apr 16 13:57:24.851492 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:24.851057 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5zjkj" podUID="6e53c142-6f4e-4358-a390-6d3c43558ef6" Apr 16 13:57:24.851559 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:24.851514 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l5zhh" podUID="9ab97e35-4539-405d-bb91-f30c906963c2" Apr 16 13:57:26.408137 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:26.408095 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs\") pod \"network-metrics-daemon-l5zhh\" (UID: \"9ab97e35-4539-405d-bb91-f30c906963c2\") " pod="openshift-multus/network-metrics-daemon-l5zhh" Apr 16 13:57:26.408610 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:26.408257 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:57:26.408610 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:26.408335 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs podName:9ab97e35-4539-405d-bb91-f30c906963c2 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:34.40831101 +0000 UTC m=+18.136364714 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs") pod "network-metrics-daemon-l5zhh" (UID: "9ab97e35-4539-405d-bb91-f30c906963c2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:57:26.508923 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:26.508863 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-br4fr\" (UniqueName: \"kubernetes.io/projected/6e53c142-6f4e-4358-a390-6d3c43558ef6-kube-api-access-br4fr\") pod \"network-check-target-5zjkj\" (UID: \"6e53c142-6f4e-4358-a390-6d3c43558ef6\") " pod="openshift-network-diagnostics/network-check-target-5zjkj" Apr 16 13:57:26.509113 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:26.509060 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:57:26.509113 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:26.509089 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:57:26.509113 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:26.509103 2575 projected.go:194] Error preparing data for projected volume kube-api-access-br4fr for pod openshift-network-diagnostics/network-check-target-5zjkj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:57:26.509260 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:26.509168 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e53c142-6f4e-4358-a390-6d3c43558ef6-kube-api-access-br4fr podName:6e53c142-6f4e-4358-a390-6d3c43558ef6 nodeName:}" failed. 
No retries permitted until 2026-04-16 13:57:34.509148298 +0000 UTC m=+18.237202001 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-br4fr" (UniqueName: "kubernetes.io/projected/6e53c142-6f4e-4358-a390-6d3c43558ef6-kube-api-access-br4fr") pod "network-check-target-5zjkj" (UID: "6e53c142-6f4e-4358-a390-6d3c43558ef6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:57:26.852306 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:26.851698 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l5zhh" Apr 16 13:57:26.852306 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:26.851820 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l5zhh" podUID="9ab97e35-4539-405d-bb91-f30c906963c2" Apr 16 13:57:26.852306 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:26.852217 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zjkj" Apr 16 13:57:26.852306 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:26.852308 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5zjkj" podUID="6e53c142-6f4e-4358-a390-6d3c43558ef6" Apr 16 13:57:28.853665 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:28.853630 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zjkj" Apr 16 13:57:28.854123 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:28.853690 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l5zhh" Apr 16 13:57:28.854123 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:28.853789 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l5zhh" podUID="9ab97e35-4539-405d-bb91-f30c906963c2" Apr 16 13:57:28.854123 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:28.853931 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5zjkj" podUID="6e53c142-6f4e-4358-a390-6d3c43558ef6" Apr 16 13:57:30.853585 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:30.853557 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zjkj" Apr 16 13:57:30.854117 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:30.853564 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l5zhh" Apr 16 13:57:30.854117 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:30.853677 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5zjkj" podUID="6e53c142-6f4e-4358-a390-6d3c43558ef6" Apr 16 13:57:30.854117 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:30.853761 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l5zhh" podUID="9ab97e35-4539-405d-bb91-f30c906963c2" Apr 16 13:57:32.852751 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:32.852717 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zjkj" Apr 16 13:57:32.853226 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:32.852717 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l5zhh" Apr 16 13:57:32.853226 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:32.852840 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5zjkj" podUID="6e53c142-6f4e-4358-a390-6d3c43558ef6" Apr 16 13:57:32.853226 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:32.852951 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l5zhh" podUID="9ab97e35-4539-405d-bb91-f30c906963c2" Apr 16 13:57:34.468603 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:34.468563 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs\") pod \"network-metrics-daemon-l5zhh\" (UID: \"9ab97e35-4539-405d-bb91-f30c906963c2\") " pod="openshift-multus/network-metrics-daemon-l5zhh" Apr 16 13:57:34.469043 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:34.468750 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:57:34.469043 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:34.468822 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs podName:9ab97e35-4539-405d-bb91-f30c906963c2 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:50.468803318 +0000 UTC m=+34.196857024 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs") pod "network-metrics-daemon-l5zhh" (UID: "9ab97e35-4539-405d-bb91-f30c906963c2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:57:34.569723 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:34.569683 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-br4fr\" (UniqueName: \"kubernetes.io/projected/6e53c142-6f4e-4358-a390-6d3c43558ef6-kube-api-access-br4fr\") pod \"network-check-target-5zjkj\" (UID: \"6e53c142-6f4e-4358-a390-6d3c43558ef6\") " pod="openshift-network-diagnostics/network-check-target-5zjkj" Apr 16 13:57:34.569940 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:34.569867 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:57:34.569940 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:34.569908 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:57:34.569940 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:34.569922 2575 projected.go:194] Error preparing data for projected volume kube-api-access-br4fr for pod openshift-network-diagnostics/network-check-target-5zjkj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:57:34.570107 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:34.569990 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e53c142-6f4e-4358-a390-6d3c43558ef6-kube-api-access-br4fr podName:6e53c142-6f4e-4358-a390-6d3c43558ef6 nodeName:}" failed. 
No retries permitted until 2026-04-16 13:57:50.569968927 +0000 UTC m=+34.298022637 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-br4fr" (UniqueName: "kubernetes.io/projected/6e53c142-6f4e-4358-a390-6d3c43558ef6-kube-api-access-br4fr") pod "network-check-target-5zjkj" (UID: "6e53c142-6f4e-4358-a390-6d3c43558ef6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:57:34.850527 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:34.850488 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zjkj" Apr 16 13:57:34.850709 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:34.850488 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l5zhh" Apr 16 13:57:34.850709 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:34.850623 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5zjkj" podUID="6e53c142-6f4e-4358-a390-6d3c43558ef6" Apr 16 13:57:34.850839 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:34.850728 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l5zhh" podUID="9ab97e35-4539-405d-bb91-f30c906963c2" Apr 16 13:57:36.852377 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:36.852347 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l5zhh" Apr 16 13:57:36.852748 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:36.852345 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zjkj" Apr 16 13:57:36.852748 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:36.852453 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l5zhh" podUID="9ab97e35-4539-405d-bb91-f30c906963c2" Apr 16 13:57:36.852748 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:36.852503 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5zjkj" podUID="6e53c142-6f4e-4358-a390-6d3c43558ef6" Apr 16 13:57:37.922280 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:37.921935 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7c7s_ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff/ovn-acl-logging/0.log" Apr 16 13:57:37.922996 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:37.922498 2575 generic.go:358] "Generic (PLEG): container finished" podID="ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff" containerID="d612b4dcc44e69e1fd6fcd8686b25057fd38bf29e5ad1a4eef08d709ff92489f" exitCode=1 Apr 16 13:57:37.922996 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:37.922565 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" event={"ID":"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff","Type":"ContainerStarted","Data":"7839af3c112619950f8069c6c8ac40cc59943e6216c5cdd8b140df2765086242"} Apr 16 13:57:37.922996 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:37.922601 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" event={"ID":"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff","Type":"ContainerStarted","Data":"2a4662db3da008dff0219f7712a2cab4772ae19c9da77a03f43c30f9974d0eb7"} Apr 16 13:57:37.922996 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:37.922611 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" event={"ID":"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff","Type":"ContainerStarted","Data":"684e0e53ee42172d40701d735d775d915d0224cecd40cfe77785ee6234b7893a"} Apr 16 13:57:37.922996 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:37.922619 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" event={"ID":"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff","Type":"ContainerStarted","Data":"7424648b5b8b751740575124fa553dbcfe4dc497e218efac28bfb08e1b83c426"} Apr 16 
13:57:37.922996 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:37.922641 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" event={"ID":"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff","Type":"ContainerDied","Data":"d612b4dcc44e69e1fd6fcd8686b25057fd38bf29e5ad1a4eef08d709ff92489f"} Apr 16 13:57:37.922996 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:37.922667 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" event={"ID":"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff","Type":"ContainerStarted","Data":"ea706b1bdfad79b91f6058c195821bc03c24a8fd6cc3108efc86b6243a5b5e44"} Apr 16 13:57:37.923874 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:37.923850 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4wgkv" event={"ID":"0675f452-3368-4384-83f9-0c6a166ba947","Type":"ContainerStarted","Data":"0ded07b31c9e93ff19cb06053384c48873306cbbe82535986f7a80c9d6cb32e4"} Apr 16 13:57:37.925268 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:37.925240 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-z8t6x" event={"ID":"985ce2bb-b7a7-4b0c-893a-1840235f7653","Type":"ContainerStarted","Data":"f3a3c41c0fbb1a72dbfc8a2d86e4dd49ad39b1be81a60fb08d782ec5b1900543"} Apr 16 13:57:37.926755 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:37.926723 2575 generic.go:358] "Generic (PLEG): container finished" podID="9d11af49-8358-49ac-ac63-a39ae3da3f3c" containerID="57553b269e6d944b361ecfeb60015d426c07210703e669d80d7e1582e82134c1" exitCode=0 Apr 16 13:57:37.926859 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:37.926800 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mj78x" event={"ID":"9d11af49-8358-49ac-ac63-a39ae3da3f3c","Type":"ContainerDied","Data":"57553b269e6d944b361ecfeb60015d426c07210703e669d80d7e1582e82134c1"} Apr 16 13:57:37.928388 ip-10-0-131-61 
kubenswrapper[2575]: I0416 13:57:37.928347 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6r2ft" event={"ID":"7e94555a-67da-4de5-ace1-024c7384ada8","Type":"ContainerStarted","Data":"0f2cfcb0bca66651a7592d5252af4d844d184c466c8ae96978dc070ce52ac9f9"}
Apr 16 13:57:37.929798 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:37.929765 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9tz2x" event={"ID":"03875a5c-704e-42af-9ea6-ba5a9f181d94","Type":"ContainerStarted","Data":"07e70fc2eff9e5c043f5b74c38877a498e39e3af28408eac775b83324575236b"}
Apr 16 13:57:37.931132 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:37.931108 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-h7pm6" event={"ID":"af33c15c-6386-4fe7-9155-a5c6b6e05ec4","Type":"ContainerStarted","Data":"4af3a8d7f5f5d94851f8d2129cff0ce8944c0f448eb93754a2e9587f6fb2d1d5"}
Apr 16 13:57:37.932406 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:37.932377 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fnk" event={"ID":"eea23aa3-1d75-42db-b520-0bce4b998c9c","Type":"ContainerStarted","Data":"a2bc7a0ed721aace378dd929f20f19c5fc87f01253c3263a22e3909ee6aae8e2"}
Apr 16 13:57:37.940333 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:37.940296 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-61.ec2.internal" podStartSLOduration=20.94028668 podStartE2EDuration="20.94028668s" podCreationTimestamp="2026-04-16 13:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:57:21.909306665 +0000 UTC m=+5.637360392" watchObservedRunningTime="2026-04-16 13:57:37.94028668 +0000 UTC m=+21.668340452"
Apr 16 13:57:37.940631 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:37.940608 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-4wgkv" podStartSLOduration=3.857450216 podStartE2EDuration="20.940602635s" podCreationTimestamp="2026-04-16 13:57:17 +0000 UTC" firstStartedPulling="2026-04-16 13:57:19.85766679 +0000 UTC m=+3.585720498" lastFinishedPulling="2026-04-16 13:57:36.9408192 +0000 UTC m=+20.668872917" observedRunningTime="2026-04-16 13:57:37.940156844 +0000 UTC m=+21.668210569" watchObservedRunningTime="2026-04-16 13:57:37.940602635 +0000 UTC m=+21.668656372"
Apr 16 13:57:37.960121 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:37.960066 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-z8t6x" podStartSLOduration=4.966172188 podStartE2EDuration="21.960044588s" podCreationTimestamp="2026-04-16 13:57:16 +0000 UTC" firstStartedPulling="2026-04-16 13:57:19.881476355 +0000 UTC m=+3.609530071" lastFinishedPulling="2026-04-16 13:57:36.875348762 +0000 UTC m=+20.603402471" observedRunningTime="2026-04-16 13:57:37.959481789 +0000 UTC m=+21.687535530" watchObservedRunningTime="2026-04-16 13:57:37.960044588 +0000 UTC m=+21.688098313"
Apr 16 13:57:38.028965 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:38.028887 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9tz2x" podStartSLOduration=5.035077074 podStartE2EDuration="22.028868158s" podCreationTimestamp="2026-04-16 13:57:16 +0000 UTC" firstStartedPulling="2026-04-16 13:57:19.881533087 +0000 UTC m=+3.609586796" lastFinishedPulling="2026-04-16 13:57:36.87532416 +0000 UTC m=+20.603377880" observedRunningTime="2026-04-16 13:57:38.028827118 +0000 UTC m=+21.756880843" watchObservedRunningTime="2026-04-16 13:57:38.028868158 +0000 UTC m=+21.756921865"
Apr 16 13:57:38.046261 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:38.046197 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-6r2ft" podStartSLOduration=9.591094132 podStartE2EDuration="22.04617638s" podCreationTimestamp="2026-04-16 13:57:16 +0000 UTC" firstStartedPulling="2026-04-16 13:57:19.861750997 +0000 UTC m=+3.589804703" lastFinishedPulling="2026-04-16 13:57:32.316833232 +0000 UTC m=+16.044886951" observedRunningTime="2026-04-16 13:57:38.045672071 +0000 UTC m=+21.773725804" watchObservedRunningTime="2026-04-16 13:57:38.04617638 +0000 UTC m=+21.774230106"
Apr 16 13:57:38.474911 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:38.474730 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 13:57:38.804075 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:38.803961 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T13:57:38.474885061Z","UUID":"1ad5cf3a-c597-4280-860f-5f5dfbe042ce","Handler":null,"Name":"","Endpoint":""}
Apr 16 13:57:38.805782 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:38.805759 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 13:57:38.805908 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:38.805789 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 13:57:38.853616 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:38.853585 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zjkj"
Apr 16 13:57:38.853616 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:38.853599 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l5zhh"
Apr 16 13:57:38.853852 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:38.853686 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5zjkj" podUID="6e53c142-6f4e-4358-a390-6d3c43558ef6"
Apr 16 13:57:38.853920 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:38.853839 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l5zhh" podUID="9ab97e35-4539-405d-bb91-f30c906963c2"
Apr 16 13:57:38.936126 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:38.936091 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jzfpd" event={"ID":"5837407c-436b-4756-9677-8c40ff8e9059","Type":"ContainerStarted","Data":"65650e30d748c11b59439d48c99eda1e34d4547afb702269b6bcf72b01e33792"}
Apr 16 13:57:38.937858 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:38.937824 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fnk" event={"ID":"eea23aa3-1d75-42db-b520-0bce4b998c9c","Type":"ContainerStarted","Data":"ab0cacccfb754f001f61f2d0f01bd30c967c73a0e8ed279e1666b5928d0d9808"}
Apr 16 13:57:38.951335 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:38.951282 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-h7pm6" podStartSLOduration=4.931302898 podStartE2EDuration="21.951267536s" podCreationTimestamp="2026-04-16 13:57:17 +0000 UTC" firstStartedPulling="2026-04-16 13:57:19.881594655 +0000 UTC m=+3.609648373" lastFinishedPulling="2026-04-16 13:57:36.901559308 +0000 UTC m=+20.629613011" observedRunningTime="2026-04-16 13:57:38.064591294 +0000 UTC m=+21.792645010" watchObservedRunningTime="2026-04-16 13:57:38.951267536 +0000 UTC m=+22.679321261"
Apr 16 13:57:40.853618 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:40.853583 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zjkj"
Apr 16 13:57:40.854263 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:40.853592 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l5zhh"
Apr 16 13:57:40.854263 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:40.853695 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5zjkj" podUID="6e53c142-6f4e-4358-a390-6d3c43558ef6"
Apr 16 13:57:40.854263 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:40.853802 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l5zhh" podUID="9ab97e35-4539-405d-bb91-f30c906963c2"
Apr 16 13:57:40.943792 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:40.943759 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fnk" event={"ID":"eea23aa3-1d75-42db-b520-0bce4b998c9c","Type":"ContainerStarted","Data":"a7fc794216ee6cfe62fdcbdb50d93bc444fd14de484c1fae692f3f7ce3ffc798"}
Apr 16 13:57:40.947778 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:40.947289 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7c7s_ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff/ovn-acl-logging/0.log"
Apr 16 13:57:40.947946 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:40.947914 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" event={"ID":"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff","Type":"ContainerStarted","Data":"01795088db9b14e5ee511cdb1ad39719afdf53a4fee782bcc041deeebee3a2b8"}
Apr 16 13:57:40.962570 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:40.962524 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-jzfpd" podStartSLOduration=7.94817022 podStartE2EDuration="24.96251056s" podCreationTimestamp="2026-04-16 13:57:16 +0000 UTC" firstStartedPulling="2026-04-16 13:57:19.861048307 +0000 UTC m=+3.589102011" lastFinishedPulling="2026-04-16 13:57:36.875388645 +0000 UTC m=+20.603442351" observedRunningTime="2026-04-16 13:57:38.95157276 +0000 UTC m=+22.679626486" watchObservedRunningTime="2026-04-16 13:57:40.96251056 +0000 UTC m=+24.690564329"
Apr 16 13:57:40.962725 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:40.962596 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fnk" podStartSLOduration=4.8032028570000005 podStartE2EDuration="24.962591716s" podCreationTimestamp="2026-04-16 13:57:16 +0000 UTC" firstStartedPulling="2026-04-16 13:57:19.866015614 +0000 UTC m=+3.594069332" lastFinishedPulling="2026-04-16 13:57:40.025404484 +0000 UTC m=+23.753458191" observedRunningTime="2026-04-16 13:57:40.961995011 +0000 UTC m=+24.690048739" watchObservedRunningTime="2026-04-16 13:57:40.962591716 +0000 UTC m=+24.690645441"
Apr 16 13:57:41.926222 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:41.926182 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-6r2ft"
Apr 16 13:57:41.927160 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:41.927140 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-6r2ft"
Apr 16 13:57:41.950435 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:41.950378 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-6r2ft"
Apr 16 13:57:41.950786 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:41.950770 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-6r2ft"
Apr 16 13:57:42.852909 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:42.852715 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l5zhh"
Apr 16 13:57:42.853078 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:42.852715 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zjkj"
Apr 16 13:57:42.853078 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:42.852997 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l5zhh" podUID="9ab97e35-4539-405d-bb91-f30c906963c2"
Apr 16 13:57:42.853078 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:42.853052 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5zjkj" podUID="6e53c142-6f4e-4358-a390-6d3c43558ef6"
Apr 16 13:57:42.954532 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:42.954506 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7c7s_ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff/ovn-acl-logging/0.log"
Apr 16 13:57:42.955086 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:42.954842 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" event={"ID":"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff","Type":"ContainerStarted","Data":"494e3d0a4afec1f055a5195b76a6005781e45a2229a82ca9a66fc46396bc810b"}
Apr 16 13:57:42.955205 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:42.955182 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s"
Apr 16 13:57:42.955432 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:42.955413 2575 scope.go:117] "RemoveContainer" containerID="d612b4dcc44e69e1fd6fcd8686b25057fd38bf29e5ad1a4eef08d709ff92489f"
Apr 16 13:57:42.956571 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:42.956540 2575 generic.go:358] "Generic (PLEG): container finished" podID="9d11af49-8358-49ac-ac63-a39ae3da3f3c" containerID="e76da007a8f0a301b6caa60bbfc5952a39ea84adcb29fce2242d50d0ae5bc8c3" exitCode=0
Apr 16 13:57:42.956665 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:42.956629 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mj78x" event={"ID":"9d11af49-8358-49ac-ac63-a39ae3da3f3c","Type":"ContainerDied","Data":"e76da007a8f0a301b6caa60bbfc5952a39ea84adcb29fce2242d50d0ae5bc8c3"}
Apr 16 13:57:42.971235 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:42.971216 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s"
Apr 16 13:57:43.963697 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:43.963670 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7c7s_ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff/ovn-acl-logging/0.log"
Apr 16 13:57:43.964166 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:43.964122 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" event={"ID":"ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff","Type":"ContainerStarted","Data":"7f9a4acc05b29511f1858410e9e592671abfc99bc5a95b32363d102eddf5d994"}
Apr 16 13:57:43.964632 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:43.964607 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s"
Apr 16 13:57:43.964715 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:43.964658 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s"
Apr 16 13:57:43.969434 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:43.969121 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mj78x" event={"ID":"9d11af49-8358-49ac-ac63-a39ae3da3f3c","Type":"ContainerStarted","Data":"d2f694740e41b4462933923f43c5d685562a433f81f1ec88a2a52e78d8715b27"}
Apr 16 13:57:43.982621 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:43.982594 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s"
Apr 16 13:57:43.990162 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:43.990109 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" podStartSLOduration=9.851539928 podStartE2EDuration="26.990089867s" podCreationTimestamp="2026-04-16 13:57:17 +0000 UTC" firstStartedPulling="2026-04-16 13:57:19.859620838 +0000 UTC m=+3.587674554" lastFinishedPulling="2026-04-16 13:57:36.998170787 +0000 UTC m=+20.726224493" observedRunningTime="2026-04-16 13:57:43.988635638 +0000 UTC m=+27.716689398" watchObservedRunningTime="2026-04-16 13:57:43.990089867 +0000 UTC m=+27.718143598"
Apr 16 13:57:44.211946 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:44.211249 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5zjkj"]
Apr 16 13:57:44.211946 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:44.211633 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zjkj"
Apr 16 13:57:44.211946 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:44.211769 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5zjkj" podUID="6e53c142-6f4e-4358-a390-6d3c43558ef6"
Apr 16 13:57:44.212667 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:44.212642 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-l5zhh"]
Apr 16 13:57:44.213340 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:44.213128 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l5zhh"
Apr 16 13:57:44.213340 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:44.213243 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l5zhh" podUID="9ab97e35-4539-405d-bb91-f30c906963c2"
Apr 16 13:57:44.973095 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:44.973059 2575 generic.go:358] "Generic (PLEG): container finished" podID="9d11af49-8358-49ac-ac63-a39ae3da3f3c" containerID="d2f694740e41b4462933923f43c5d685562a433f81f1ec88a2a52e78d8715b27" exitCode=0
Apr 16 13:57:44.973543 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:44.973138 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mj78x" event={"ID":"9d11af49-8358-49ac-ac63-a39ae3da3f3c","Type":"ContainerDied","Data":"d2f694740e41b4462933923f43c5d685562a433f81f1ec88a2a52e78d8715b27"}
Apr 16 13:57:45.850270 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:45.850231 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zjkj"
Apr 16 13:57:45.850467 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:45.850231 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l5zhh"
Apr 16 13:57:45.850467 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:45.850346 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5zjkj" podUID="6e53c142-6f4e-4358-a390-6d3c43558ef6"
Apr 16 13:57:45.850467 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:45.850446 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l5zhh" podUID="9ab97e35-4539-405d-bb91-f30c906963c2"
Apr 16 13:57:45.976375 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:45.976295 2575 generic.go:358] "Generic (PLEG): container finished" podID="9d11af49-8358-49ac-ac63-a39ae3da3f3c" containerID="26ac05c9d558eead205fced4e238c5a7fcc168d1d91ed7f57bb3d84ff75e76aa" exitCode=0
Apr 16 13:57:45.976833 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:45.976380 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mj78x" event={"ID":"9d11af49-8358-49ac-ac63-a39ae3da3f3c","Type":"ContainerDied","Data":"26ac05c9d558eead205fced4e238c5a7fcc168d1d91ed7f57bb3d84ff75e76aa"}
Apr 16 13:57:47.850886 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:47.850844 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l5zhh"
Apr 16 13:57:47.851558 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:47.850844 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zjkj"
Apr 16 13:57:47.851558 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:47.851006 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l5zhh" podUID="9ab97e35-4539-405d-bb91-f30c906963c2"
Apr 16 13:57:47.851558 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:47.851066 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5zjkj" podUID="6e53c142-6f4e-4358-a390-6d3c43558ef6"
Apr 16 13:57:49.850112 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:49.850084 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zjkj"
Apr 16 13:57:49.850698 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:49.850161 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l5zhh"
Apr 16 13:57:49.850698 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:49.850323 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l5zhh" podUID="9ab97e35-4539-405d-bb91-f30c906963c2"
Apr 16 13:57:49.850698 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:49.850470 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5zjkj" podUID="6e53c142-6f4e-4358-a390-6d3c43558ef6"
Apr 16 13:57:50.133594 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.133519 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-61.ec2.internal" event="NodeReady"
Apr 16 13:57:50.133747 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.133653 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 13:57:50.175194 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.175146 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6fksw"]
Apr 16 13:57:50.181064 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.181029 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6fksw"
Apr 16 13:57:50.181668 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.181627 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-xdwfc"]
Apr 16 13:57:50.183307 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.183286 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 13:57:50.183765 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.183623 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ftzrr\""
Apr 16 13:57:50.183765 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.183639 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 13:57:50.185293 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.185274 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xdwfc"
Apr 16 13:57:50.187233 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.187183 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 13:57:50.187368 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.187247 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 13:57:50.187368 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.187322 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mv45m\""
Apr 16 13:57:50.187513 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.187420 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 13:57:50.190412 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.190391 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6fksw"]
Apr 16 13:57:50.192403 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.192381 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xdwfc"]
Apr 16 13:57:50.285105 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.285071 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-cert\") pod \"ingress-canary-xdwfc\" (UID: \"adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d\") " pod="openshift-ingress-canary/ingress-canary-xdwfc"
Apr 16 13:57:50.285105 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.285109 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54v56\" (UniqueName: \"kubernetes.io/projected/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-kube-api-access-54v56\") pod \"ingress-canary-xdwfc\" (UID: \"adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d\") " pod="openshift-ingress-canary/ingress-canary-xdwfc"
Apr 16 13:57:50.285329 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.285137 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/69050bd1-ea1f-49d8-8ca8-617d41938670-tmp-dir\") pod \"dns-default-6fksw\" (UID: \"69050bd1-ea1f-49d8-8ca8-617d41938670\") " pod="openshift-dns/dns-default-6fksw"
Apr 16 13:57:50.285329 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.285167 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69050bd1-ea1f-49d8-8ca8-617d41938670-config-volume\") pod \"dns-default-6fksw\" (UID: \"69050bd1-ea1f-49d8-8ca8-617d41938670\") " pod="openshift-dns/dns-default-6fksw"
Apr 16 13:57:50.285329 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.285295 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/69050bd1-ea1f-49d8-8ca8-617d41938670-metrics-tls\") pod \"dns-default-6fksw\" (UID: \"69050bd1-ea1f-49d8-8ca8-617d41938670\") " pod="openshift-dns/dns-default-6fksw"
Apr 16 13:57:50.285499 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.285335 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmzkf\" (UniqueName: \"kubernetes.io/projected/69050bd1-ea1f-49d8-8ca8-617d41938670-kube-api-access-vmzkf\") pod \"dns-default-6fksw\" (UID: \"69050bd1-ea1f-49d8-8ca8-617d41938670\") " pod="openshift-dns/dns-default-6fksw"
Apr 16 13:57:50.386416 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.386321 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69050bd1-ea1f-49d8-8ca8-617d41938670-config-volume\") pod \"dns-default-6fksw\" (UID: \"69050bd1-ea1f-49d8-8ca8-617d41938670\") " pod="openshift-dns/dns-default-6fksw"
Apr 16 13:57:50.386416 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.386391 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/69050bd1-ea1f-49d8-8ca8-617d41938670-metrics-tls\") pod \"dns-default-6fksw\" (UID: \"69050bd1-ea1f-49d8-8ca8-617d41938670\") " pod="openshift-dns/dns-default-6fksw"
Apr 16 13:57:50.386649 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.386419 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vmzkf\" (UniqueName: \"kubernetes.io/projected/69050bd1-ea1f-49d8-8ca8-617d41938670-kube-api-access-vmzkf\") pod \"dns-default-6fksw\" (UID: \"69050bd1-ea1f-49d8-8ca8-617d41938670\") " pod="openshift-dns/dns-default-6fksw"
Apr 16 13:57:50.386649 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:50.386553 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:57:50.386649 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.386584 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-cert\") pod \"ingress-canary-xdwfc\" (UID: \"adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d\") " pod="openshift-ingress-canary/ingress-canary-xdwfc"
Apr 16 13:57:50.386649 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:50.386625 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69050bd1-ea1f-49d8-8ca8-617d41938670-metrics-tls podName:69050bd1-ea1f-49d8-8ca8-617d41938670 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:50.886602408 +0000 UTC m=+34.614656116 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/69050bd1-ea1f-49d8-8ca8-617d41938670-metrics-tls") pod "dns-default-6fksw" (UID: "69050bd1-ea1f-49d8-8ca8-617d41938670") : secret "dns-default-metrics-tls" not found
Apr 16 13:57:50.386883 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.386667 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54v56\" (UniqueName: \"kubernetes.io/projected/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-kube-api-access-54v56\") pod \"ingress-canary-xdwfc\" (UID: \"adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d\") " pod="openshift-ingress-canary/ingress-canary-xdwfc"
Apr 16 13:57:50.386883 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:50.386674 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:57:50.386883 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.386702 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/69050bd1-ea1f-49d8-8ca8-617d41938670-tmp-dir\") pod \"dns-default-6fksw\" (UID: \"69050bd1-ea1f-49d8-8ca8-617d41938670\") " pod="openshift-dns/dns-default-6fksw"
Apr 16 13:57:50.386883 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:50.386728 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-cert podName:adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d nodeName:}" failed. No retries permitted until 2026-04-16 13:57:50.886709115 +0000 UTC m=+34.614762834 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-cert") pod "ingress-canary-xdwfc" (UID: "adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d") : secret "canary-serving-cert" not found
Apr 16 13:57:50.387087 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.387013 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69050bd1-ea1f-49d8-8ca8-617d41938670-config-volume\") pod \"dns-default-6fksw\" (UID: \"69050bd1-ea1f-49d8-8ca8-617d41938670\") " pod="openshift-dns/dns-default-6fksw"
Apr 16 13:57:50.387543 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.387511 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/69050bd1-ea1f-49d8-8ca8-617d41938670-tmp-dir\") pod \"dns-default-6fksw\" (UID: \"69050bd1-ea1f-49d8-8ca8-617d41938670\") " pod="openshift-dns/dns-default-6fksw"
Apr 16 13:57:50.398541 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.398383 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmzkf\" (UniqueName: \"kubernetes.io/projected/69050bd1-ea1f-49d8-8ca8-617d41938670-kube-api-access-vmzkf\") pod \"dns-default-6fksw\" (UID: \"69050bd1-ea1f-49d8-8ca8-617d41938670\") " pod="openshift-dns/dns-default-6fksw"
Apr 16 13:57:50.398705 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.398468 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-54v56\" (UniqueName: \"kubernetes.io/projected/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-kube-api-access-54v56\") pod \"ingress-canary-xdwfc\" (UID: \"adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d\") " pod="openshift-ingress-canary/ingress-canary-xdwfc"
Apr 16 13:57:50.487431 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.487391 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs\") pod \"network-metrics-daemon-l5zhh\" (UID: \"9ab97e35-4539-405d-bb91-f30c906963c2\") " pod="openshift-multus/network-metrics-daemon-l5zhh"
Apr 16 13:57:50.487618 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:50.487573 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:57:50.487685 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:50.487673 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs podName:9ab97e35-4539-405d-bb91-f30c906963c2 nodeName:}" failed. No retries permitted until 2026-04-16 13:58:22.487651017 +0000 UTC m=+66.215704750 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs") pod "network-metrics-daemon-l5zhh" (UID: "9ab97e35-4539-405d-bb91-f30c906963c2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:57:50.588034 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.587992 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-br4fr\" (UniqueName: \"kubernetes.io/projected/6e53c142-6f4e-4358-a390-6d3c43558ef6-kube-api-access-br4fr\") pod \"network-check-target-5zjkj\" (UID: \"6e53c142-6f4e-4358-a390-6d3c43558ef6\") " pod="openshift-network-diagnostics/network-check-target-5zjkj"
Apr 16 13:57:50.588231 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:50.588187 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:57:50.588231 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:50.588214 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:57:50.588231 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:50.588226 2575 projected.go:194] Error preparing data for projected volume kube-api-access-br4fr for pod openshift-network-diagnostics/network-check-target-5zjkj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:57:50.588378 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:50.588304 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e53c142-6f4e-4358-a390-6d3c43558ef6-kube-api-access-br4fr podName:6e53c142-6f4e-4358-a390-6d3c43558ef6 nodeName:}" failed. No retries permitted until 2026-04-16 13:58:22.588283896 +0000 UTC m=+66.316337620 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-br4fr" (UniqueName: "kubernetes.io/projected/6e53c142-6f4e-4358-a390-6d3c43558ef6-kube-api-access-br4fr") pod "network-check-target-5zjkj" (UID: "6e53c142-6f4e-4358-a390-6d3c43558ef6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:57:50.890912 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.890861 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-cert\") pod \"ingress-canary-xdwfc\" (UID: \"adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d\") " pod="openshift-ingress-canary/ingress-canary-xdwfc"
Apr 16 13:57:50.891355 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:50.890973 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/69050bd1-ea1f-49d8-8ca8-617d41938670-metrics-tls\") pod \"dns-default-6fksw\" (UID:
\"69050bd1-ea1f-49d8-8ca8-617d41938670\") " pod="openshift-dns/dns-default-6fksw" Apr 16 13:57:50.891355 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:50.891026 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:57:50.891355 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:50.891088 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:57:50.891355 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:50.891099 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-cert podName:adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d nodeName:}" failed. No retries permitted until 2026-04-16 13:57:51.891078323 +0000 UTC m=+35.619132028 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-cert") pod "ingress-canary-xdwfc" (UID: "adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d") : secret "canary-serving-cert" not found Apr 16 13:57:50.891355 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:50.891138 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69050bd1-ea1f-49d8-8ca8-617d41938670-metrics-tls podName:69050bd1-ea1f-49d8-8ca8-617d41938670 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:51.89112135 +0000 UTC m=+35.619175057 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/69050bd1-ea1f-49d8-8ca8-617d41938670-metrics-tls") pod "dns-default-6fksw" (UID: "69050bd1-ea1f-49d8-8ca8-617d41938670") : secret "dns-default-metrics-tls" not found Apr 16 13:57:51.850499 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:51.850458 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zjkj" Apr 16 13:57:51.850695 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:51.850466 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l5zhh" Apr 16 13:57:51.853273 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:51.853242 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 13:57:51.854018 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:51.854001 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 13:57:51.854122 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:51.854088 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-7n6t4\"" Apr 16 13:57:51.854206 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:51.854191 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jz268\"" Apr 16 13:57:51.854267 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:51.854257 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 13:57:51.899660 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:51.899628 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-cert\") pod \"ingress-canary-xdwfc\" (UID: \"adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d\") " pod="openshift-ingress-canary/ingress-canary-xdwfc" Apr 16 13:57:51.900117 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:51.899697 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/69050bd1-ea1f-49d8-8ca8-617d41938670-metrics-tls\") pod \"dns-default-6fksw\" (UID: \"69050bd1-ea1f-49d8-8ca8-617d41938670\") " pod="openshift-dns/dns-default-6fksw" Apr 16 13:57:51.900117 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:51.899803 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:57:51.900117 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:51.899865 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-cert podName:adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d nodeName:}" failed. No retries permitted until 2026-04-16 13:57:53.899843646 +0000 UTC m=+37.627897350 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-cert") pod "ingress-canary-xdwfc" (UID: "adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d") : secret "canary-serving-cert" not found Apr 16 13:57:51.900117 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:51.899807 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:57:51.900117 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:51.899921 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69050bd1-ea1f-49d8-8ca8-617d41938670-metrics-tls podName:69050bd1-ea1f-49d8-8ca8-617d41938670 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:53.899911826 +0000 UTC m=+37.627965529 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/69050bd1-ea1f-49d8-8ca8-617d41938670-metrics-tls") pod "dns-default-6fksw" (UID: "69050bd1-ea1f-49d8-8ca8-617d41938670") : secret "dns-default-metrics-tls" not found Apr 16 13:57:52.992152 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:52.992117 2575 generic.go:358] "Generic (PLEG): container finished" podID="9d11af49-8358-49ac-ac63-a39ae3da3f3c" containerID="a430cd87ce35f0f7d0cc67d0998c95be681b5041e35436142b55488ae30be404" exitCode=0 Apr 16 13:57:52.992520 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:52.992171 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mj78x" event={"ID":"9d11af49-8358-49ac-ac63-a39ae3da3f3c","Type":"ContainerDied","Data":"a430cd87ce35f0f7d0cc67d0998c95be681b5041e35436142b55488ae30be404"} Apr 16 13:57:53.911874 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:53.911835 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-cert\") pod \"ingress-canary-xdwfc\" (UID: \"adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d\") " pod="openshift-ingress-canary/ingress-canary-xdwfc" Apr 16 13:57:53.912041 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:53.911922 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/69050bd1-ea1f-49d8-8ca8-617d41938670-metrics-tls\") pod \"dns-default-6fksw\" (UID: \"69050bd1-ea1f-49d8-8ca8-617d41938670\") " pod="openshift-dns/dns-default-6fksw" Apr 16 13:57:53.912041 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:53.912005 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:57:53.912113 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:53.912058 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/69050bd1-ea1f-49d8-8ca8-617d41938670-metrics-tls podName:69050bd1-ea1f-49d8-8ca8-617d41938670 nodeName:}" failed. No retries permitted until 2026-04-16 13:57:57.912043835 +0000 UTC m=+41.640097543 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/69050bd1-ea1f-49d8-8ca8-617d41938670-metrics-tls") pod "dns-default-6fksw" (UID: "69050bd1-ea1f-49d8-8ca8-617d41938670") : secret "dns-default-metrics-tls" not found Apr 16 13:57:53.912113 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:53.912005 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:57:53.912184 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:53.912126 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-cert podName:adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d nodeName:}" failed. No retries permitted until 2026-04-16 13:57:57.912114098 +0000 UTC m=+41.640167815 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-cert") pod "ingress-canary-xdwfc" (UID: "adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d") : secret "canary-serving-cert" not found Apr 16 13:57:53.996852 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:53.996764 2575 generic.go:358] "Generic (PLEG): container finished" podID="9d11af49-8358-49ac-ac63-a39ae3da3f3c" containerID="0ada18ecf840e02543333fb2dfe2a27fc5466bdc5420335e2c07809324cb7159" exitCode=0 Apr 16 13:57:53.997219 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:53.996838 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mj78x" event={"ID":"9d11af49-8358-49ac-ac63-a39ae3da3f3c","Type":"ContainerDied","Data":"0ada18ecf840e02543333fb2dfe2a27fc5466bdc5420335e2c07809324cb7159"} Apr 16 13:57:55.001143 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:55.001111 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mj78x" event={"ID":"9d11af49-8358-49ac-ac63-a39ae3da3f3c","Type":"ContainerStarted","Data":"c58fb9fffd1a3e2a85fc9257111f99ade259d5e538fc683a25f45f71f3402ae4"} Apr 16 13:57:55.022954 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:55.022886 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-mj78x" podStartSLOduration=6.289833455 podStartE2EDuration="39.022871839s" podCreationTimestamp="2026-04-16 13:57:16 +0000 UTC" firstStartedPulling="2026-04-16 13:57:19.865132274 +0000 UTC m=+3.593185977" lastFinishedPulling="2026-04-16 13:57:52.598170655 +0000 UTC m=+36.326224361" observedRunningTime="2026-04-16 13:57:55.02175074 +0000 UTC m=+38.749804465" watchObservedRunningTime="2026-04-16 13:57:55.022871839 +0000 UTC m=+38.750925572" Apr 16 13:57:57.941851 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:57.941812 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/69050bd1-ea1f-49d8-8ca8-617d41938670-metrics-tls\") pod \"dns-default-6fksw\" (UID: \"69050bd1-ea1f-49d8-8ca8-617d41938670\") " pod="openshift-dns/dns-default-6fksw" Apr 16 13:57:57.942241 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:57:57.941876 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-cert\") pod \"ingress-canary-xdwfc\" (UID: \"adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d\") " pod="openshift-ingress-canary/ingress-canary-xdwfc" Apr 16 13:57:57.942241 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:57.942007 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:57:57.942241 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:57.942080 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-cert podName:adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d nodeName:}" failed. No retries permitted until 2026-04-16 13:58:05.942062755 +0000 UTC m=+49.670116458 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-cert") pod "ingress-canary-xdwfc" (UID: "adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d") : secret "canary-serving-cert" not found Apr 16 13:57:57.942241 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:57.942008 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:57:57.942241 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:57:57.942162 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69050bd1-ea1f-49d8-8ca8-617d41938670-metrics-tls podName:69050bd1-ea1f-49d8-8ca8-617d41938670 nodeName:}" failed. 
No retries permitted until 2026-04-16 13:58:05.942149267 +0000 UTC m=+49.670202976 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/69050bd1-ea1f-49d8-8ca8-617d41938670-metrics-tls") pod "dns-default-6fksw" (UID: "69050bd1-ea1f-49d8-8ca8-617d41938670") : secret "dns-default-metrics-tls" not found Apr 16 13:58:06.002478 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:06.002431 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-cert\") pod \"ingress-canary-xdwfc\" (UID: \"adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d\") " pod="openshift-ingress-canary/ingress-canary-xdwfc" Apr 16 13:58:06.003058 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:06.002495 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/69050bd1-ea1f-49d8-8ca8-617d41938670-metrics-tls\") pod \"dns-default-6fksw\" (UID: \"69050bd1-ea1f-49d8-8ca8-617d41938670\") " pod="openshift-dns/dns-default-6fksw" Apr 16 13:58:06.003058 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:58:06.002586 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:58:06.003058 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:58:06.002592 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:58:06.003058 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:58:06.002649 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69050bd1-ea1f-49d8-8ca8-617d41938670-metrics-tls podName:69050bd1-ea1f-49d8-8ca8-617d41938670 nodeName:}" failed. No retries permitted until 2026-04-16 13:58:22.002633372 +0000 UTC m=+65.730687075 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/69050bd1-ea1f-49d8-8ca8-617d41938670-metrics-tls") pod "dns-default-6fksw" (UID: "69050bd1-ea1f-49d8-8ca8-617d41938670") : secret "dns-default-metrics-tls" not found Apr 16 13:58:06.003058 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:58:06.002662 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-cert podName:adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d nodeName:}" failed. No retries permitted until 2026-04-16 13:58:22.002656335 +0000 UTC m=+65.730710039 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-cert") pod "ingress-canary-xdwfc" (UID: "adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d") : secret "canary-serving-cert" not found Apr 16 13:58:15.987017 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:15.986985 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w7c7s" Apr 16 13:58:22.012983 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.012938 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/69050bd1-ea1f-49d8-8ca8-617d41938670-metrics-tls\") pod \"dns-default-6fksw\" (UID: \"69050bd1-ea1f-49d8-8ca8-617d41938670\") " pod="openshift-dns/dns-default-6fksw" Apr 16 13:58:22.013495 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.013104 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-cert\") pod \"ingress-canary-xdwfc\" (UID: \"adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d\") " pod="openshift-ingress-canary/ingress-canary-xdwfc" Apr 16 13:58:22.013495 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:58:22.013110 2575 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:58:22.013495 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:58:22.013175 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69050bd1-ea1f-49d8-8ca8-617d41938670-metrics-tls podName:69050bd1-ea1f-49d8-8ca8-617d41938670 nodeName:}" failed. No retries permitted until 2026-04-16 13:58:54.013158302 +0000 UTC m=+97.741212005 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/69050bd1-ea1f-49d8-8ca8-617d41938670-metrics-tls") pod "dns-default-6fksw" (UID: "69050bd1-ea1f-49d8-8ca8-617d41938670") : secret "dns-default-metrics-tls" not found Apr 16 13:58:22.013495 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:58:22.013225 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:58:22.013495 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:58:22.013276 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-cert podName:adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d nodeName:}" failed. No retries permitted until 2026-04-16 13:58:54.013263082 +0000 UTC m=+97.741316784 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-cert") pod "ingress-canary-xdwfc" (UID: "adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d") : secret "canary-serving-cert" not found Apr 16 13:58:22.308956 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.308847 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb8c86bc7-8lx8j"] Apr 16 13:58:22.330974 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.330941 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb8c86bc7-8lx8j"] Apr 16 13:58:22.331137 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.331054 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb8c86bc7-8lx8j" Apr 16 13:58:22.334167 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.334137 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-dm77q\"" Apr 16 13:58:22.334319 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.334168 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 13:58:22.334319 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.334182 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 13:58:22.334319 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.334174 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 13:58:22.334319 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.334137 2575 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 13:58:22.337432 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.337178 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n"] Apr 16 13:58:22.348216 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.348196 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n" Apr 16 13:58:22.351335 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.351319 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 13:58:22.351451 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.351433 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 13:58:22.351676 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.351662 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 13:58:22.351952 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.351939 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 13:58:22.358775 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.358751 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n"] Apr 16 13:58:22.415133 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.415099 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/43b7960d-42b3-4eb4-8770-162bf7a1bd87-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-645c6d6557-jzr5n\" (UID: \"43b7960d-42b3-4eb4-8770-162bf7a1bd87\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n" Apr 16 13:58:22.415133 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.415131 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e197c3c6-619a-465a-9a4f-dde06c9947a9-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7fb8c86bc7-8lx8j\" (UID: \"e197c3c6-619a-465a-9a4f-dde06c9947a9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb8c86bc7-8lx8j" Apr 16 13:58:22.415344 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.415192 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/43b7960d-42b3-4eb4-8770-162bf7a1bd87-ca\") pod \"cluster-proxy-proxy-agent-645c6d6557-jzr5n\" (UID: \"43b7960d-42b3-4eb4-8770-162bf7a1bd87\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n" Apr 16 13:58:22.415344 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.415216 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/43b7960d-42b3-4eb4-8770-162bf7a1bd87-hub\") pod \"cluster-proxy-proxy-agent-645c6d6557-jzr5n\" (UID: \"43b7960d-42b3-4eb4-8770-162bf7a1bd87\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n" Apr 16 13:58:22.415344 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.415233 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/43b7960d-42b3-4eb4-8770-162bf7a1bd87-ocpservice-ca\") 
pod \"cluster-proxy-proxy-agent-645c6d6557-jzr5n\" (UID: \"43b7960d-42b3-4eb4-8770-162bf7a1bd87\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n" Apr 16 13:58:22.415344 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.415259 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2dmq\" (UniqueName: \"kubernetes.io/projected/e197c3c6-619a-465a-9a4f-dde06c9947a9-kube-api-access-l2dmq\") pod \"managed-serviceaccount-addon-agent-7fb8c86bc7-8lx8j\" (UID: \"e197c3c6-619a-465a-9a4f-dde06c9947a9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb8c86bc7-8lx8j" Apr 16 13:58:22.415344 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.415296 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwqxz\" (UniqueName: \"kubernetes.io/projected/43b7960d-42b3-4eb4-8770-162bf7a1bd87-kube-api-access-kwqxz\") pod \"cluster-proxy-proxy-agent-645c6d6557-jzr5n\" (UID: \"43b7960d-42b3-4eb4-8770-162bf7a1bd87\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n" Apr 16 13:58:22.415344 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.415323 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/43b7960d-42b3-4eb4-8770-162bf7a1bd87-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-645c6d6557-jzr5n\" (UID: \"43b7960d-42b3-4eb4-8770-162bf7a1bd87\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n" Apr 16 13:58:22.516584 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.516552 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kwqxz\" (UniqueName: \"kubernetes.io/projected/43b7960d-42b3-4eb4-8770-162bf7a1bd87-kube-api-access-kwqxz\") pod 
\"cluster-proxy-proxy-agent-645c6d6557-jzr5n\" (UID: \"43b7960d-42b3-4eb4-8770-162bf7a1bd87\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n"
Apr 16 13:58:22.516584 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.516589 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/43b7960d-42b3-4eb4-8770-162bf7a1bd87-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-645c6d6557-jzr5n\" (UID: \"43b7960d-42b3-4eb4-8770-162bf7a1bd87\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n"
Apr 16 13:58:22.516795 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.516626 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/43b7960d-42b3-4eb4-8770-162bf7a1bd87-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-645c6d6557-jzr5n\" (UID: \"43b7960d-42b3-4eb4-8770-162bf7a1bd87\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n"
Apr 16 13:58:22.516795 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.516655 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e197c3c6-619a-465a-9a4f-dde06c9947a9-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7fb8c86bc7-8lx8j\" (UID: \"e197c3c6-619a-465a-9a4f-dde06c9947a9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb8c86bc7-8lx8j"
Apr 16 13:58:22.516795 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.516685 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs\") pod \"network-metrics-daemon-l5zhh\" (UID: \"9ab97e35-4539-405d-bb91-f30c906963c2\") " pod="openshift-multus/network-metrics-daemon-l5zhh"
Apr 16 13:58:22.516795 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.516751 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/43b7960d-42b3-4eb4-8770-162bf7a1bd87-ca\") pod \"cluster-proxy-proxy-agent-645c6d6557-jzr5n\" (UID: \"43b7960d-42b3-4eb4-8770-162bf7a1bd87\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n"
Apr 16 13:58:22.516795 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.516785 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/43b7960d-42b3-4eb4-8770-162bf7a1bd87-hub\") pod \"cluster-proxy-proxy-agent-645c6d6557-jzr5n\" (UID: \"43b7960d-42b3-4eb4-8770-162bf7a1bd87\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n"
Apr 16 13:58:22.517055 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.516806 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/43b7960d-42b3-4eb4-8770-162bf7a1bd87-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-645c6d6557-jzr5n\" (UID: \"43b7960d-42b3-4eb4-8770-162bf7a1bd87\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n"
Apr 16 13:58:22.517055 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.516835 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l2dmq\" (UniqueName: \"kubernetes.io/projected/e197c3c6-619a-465a-9a4f-dde06c9947a9-kube-api-access-l2dmq\") pod \"managed-serviceaccount-addon-agent-7fb8c86bc7-8lx8j\" (UID: \"e197c3c6-619a-465a-9a4f-dde06c9947a9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb8c86bc7-8lx8j"
Apr 16 13:58:22.517746 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.517718 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/43b7960d-42b3-4eb4-8770-162bf7a1bd87-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-645c6d6557-jzr5n\" (UID: \"43b7960d-42b3-4eb4-8770-162bf7a1bd87\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n"
Apr 16 13:58:22.519287 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.519259 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 13:58:22.520320 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.520298 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/43b7960d-42b3-4eb4-8770-162bf7a1bd87-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-645c6d6557-jzr5n\" (UID: \"43b7960d-42b3-4eb4-8770-162bf7a1bd87\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n"
Apr 16 13:58:22.520424 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.520353 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/43b7960d-42b3-4eb4-8770-162bf7a1bd87-ca\") pod \"cluster-proxy-proxy-agent-645c6d6557-jzr5n\" (UID: \"43b7960d-42b3-4eb4-8770-162bf7a1bd87\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n"
Apr 16 13:58:22.520424 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.520415 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/43b7960d-42b3-4eb4-8770-162bf7a1bd87-hub\") pod \"cluster-proxy-proxy-agent-645c6d6557-jzr5n\" (UID: \"43b7960d-42b3-4eb4-8770-162bf7a1bd87\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n"
Apr 16 13:58:22.520531 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.520419 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/43b7960d-42b3-4eb4-8770-162bf7a1bd87-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-645c6d6557-jzr5n\" (UID: \"43b7960d-42b3-4eb4-8770-162bf7a1bd87\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n"
Apr 16 13:58:22.524956 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.524934 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e197c3c6-619a-465a-9a4f-dde06c9947a9-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7fb8c86bc7-8lx8j\" (UID: \"e197c3c6-619a-465a-9a4f-dde06c9947a9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb8c86bc7-8lx8j"
Apr 16 13:58:22.525267 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.525252 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2dmq\" (UniqueName: \"kubernetes.io/projected/e197c3c6-619a-465a-9a4f-dde06c9947a9-kube-api-access-l2dmq\") pod \"managed-serviceaccount-addon-agent-7fb8c86bc7-8lx8j\" (UID: \"e197c3c6-619a-465a-9a4f-dde06c9947a9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb8c86bc7-8lx8j"
Apr 16 13:58:22.527146 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:58:22.527126 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 13:58:22.527246 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:58:22.527189 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs podName:9ab97e35-4539-405d-bb91-f30c906963c2 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:26.527170056 +0000 UTC m=+130.255223759 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs") pod "network-metrics-daemon-l5zhh" (UID: "9ab97e35-4539-405d-bb91-f30c906963c2") : secret "metrics-daemon-secret" not found
Apr 16 13:58:22.529512 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.529490 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwqxz\" (UniqueName: \"kubernetes.io/projected/43b7960d-42b3-4eb4-8770-162bf7a1bd87-kube-api-access-kwqxz\") pod \"cluster-proxy-proxy-agent-645c6d6557-jzr5n\" (UID: \"43b7960d-42b3-4eb4-8770-162bf7a1bd87\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n"
Apr 16 13:58:22.617632 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.617545 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-br4fr\" (UniqueName: \"kubernetes.io/projected/6e53c142-6f4e-4358-a390-6d3c43558ef6-kube-api-access-br4fr\") pod \"network-check-target-5zjkj\" (UID: \"6e53c142-6f4e-4358-a390-6d3c43558ef6\") " pod="openshift-network-diagnostics/network-check-target-5zjkj"
Apr 16 13:58:22.620157 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.620138 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 13:58:22.629830 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.629802 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 13:58:22.641007 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.640979 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-br4fr\" (UniqueName: \"kubernetes.io/projected/6e53c142-6f4e-4358-a390-6d3c43558ef6-kube-api-access-br4fr\") pod \"network-check-target-5zjkj\" (UID: \"6e53c142-6f4e-4358-a390-6d3c43558ef6\") " pod="openshift-network-diagnostics/network-check-target-5zjkj"
Apr 16 13:58:22.646736 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.646706 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb8c86bc7-8lx8j"
Apr 16 13:58:22.656651 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.656621 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n"
Apr 16 13:58:22.765667 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.765455 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-7n6t4\""
Apr 16 13:58:22.773634 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.773613 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5zjkj"
Apr 16 13:58:22.839718 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.836622 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb8c86bc7-8lx8j"]
Apr 16 13:58:22.858782 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.853431 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n"]
Apr 16 13:58:22.862766 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:58:22.862727 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode197c3c6_619a_465a_9a4f_dde06c9947a9.slice/crio-659906aee78b3976068896dd4af4ce13e88e44978e4de304d472e8f0dcfa9550 WatchSource:0}: Error finding container 659906aee78b3976068896dd4af4ce13e88e44978e4de304d472e8f0dcfa9550: Status 404 returned error can't find the container with id 659906aee78b3976068896dd4af4ce13e88e44978e4de304d472e8f0dcfa9550
Apr 16 13:58:22.863826 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:58:22.863802 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43b7960d_42b3_4eb4_8770_162bf7a1bd87.slice/crio-af9ef0dd3bae4d36178a1e7cf00a7a9c54787ddffb1d1e90fbd14cdef708dfb4 WatchSource:0}: Error finding container af9ef0dd3bae4d36178a1e7cf00a7a9c54787ddffb1d1e90fbd14cdef708dfb4: Status 404 returned error can't find the container with id af9ef0dd3bae4d36178a1e7cf00a7a9c54787ddffb1d1e90fbd14cdef708dfb4
Apr 16 13:58:22.921270 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:22.921241 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5zjkj"]
Apr 16 13:58:22.924700 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:58:22.924675 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e53c142_6f4e_4358_a390_6d3c43558ef6.slice/crio-5cf21d06f0d5803faa0b099021b85155434f341fb834dfa5c262b5460c02bbfe WatchSource:0}: Error finding container 5cf21d06f0d5803faa0b099021b85155434f341fb834dfa5c262b5460c02bbfe: Status 404 returned error can't find the container with id 5cf21d06f0d5803faa0b099021b85155434f341fb834dfa5c262b5460c02bbfe
Apr 16 13:58:23.056800 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:23.056764 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5zjkj" event={"ID":"6e53c142-6f4e-4358-a390-6d3c43558ef6","Type":"ContainerStarted","Data":"5cf21d06f0d5803faa0b099021b85155434f341fb834dfa5c262b5460c02bbfe"}
Apr 16 13:58:23.057647 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:23.057628 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb8c86bc7-8lx8j" event={"ID":"e197c3c6-619a-465a-9a4f-dde06c9947a9","Type":"ContainerStarted","Data":"659906aee78b3976068896dd4af4ce13e88e44978e4de304d472e8f0dcfa9550"}
Apr 16 13:58:23.058537 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:23.058521 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n" event={"ID":"43b7960d-42b3-4eb4-8770-162bf7a1bd87","Type":"ContainerStarted","Data":"af9ef0dd3bae4d36178a1e7cf00a7a9c54787ddffb1d1e90fbd14cdef708dfb4"}
Apr 16 13:58:28.072840 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:28.072798 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5zjkj" event={"ID":"6e53c142-6f4e-4358-a390-6d3c43558ef6","Type":"ContainerStarted","Data":"d792a44bb347847316af570ae8ddbf413c1ee47651edf38d8cacb5820942107e"}
Apr 16 13:58:28.073345 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:28.072934 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-5zjkj"
Apr 16 13:58:28.074225 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:28.074198 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb8c86bc7-8lx8j" event={"ID":"e197c3c6-619a-465a-9a4f-dde06c9947a9","Type":"ContainerStarted","Data":"0b985af6944ce1e99aab3c81d16cc6064e483786926817c1c608e19307a73740"}
Apr 16 13:58:28.075359 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:28.075337 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n" event={"ID":"43b7960d-42b3-4eb4-8770-162bf7a1bd87","Type":"ContainerStarted","Data":"e2c2921724ac9277a38756860b6615204bcdb6cc5f1c61a694e33cf84b55f6a7"}
Apr 16 13:58:28.087001 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:28.086952 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-5zjkj" podStartSLOduration=67.120557758 podStartE2EDuration="1m12.086938086s" podCreationTimestamp="2026-04-16 13:57:16 +0000 UTC" firstStartedPulling="2026-04-16 13:58:22.926491447 +0000 UTC m=+66.654545150" lastFinishedPulling="2026-04-16 13:58:27.892871774 +0000 UTC m=+71.620925478" observedRunningTime="2026-04-16 13:58:28.086649214 +0000 UTC m=+71.814702936" watchObservedRunningTime="2026-04-16 13:58:28.086938086 +0000 UTC m=+71.814991810"
Apr 16 13:58:28.100794 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:28.100738 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7fb8c86bc7-8lx8j" podStartSLOduration=1.077754368 podStartE2EDuration="6.100718778s" podCreationTimestamp="2026-04-16 13:58:22 +0000 UTC" firstStartedPulling="2026-04-16 13:58:22.864614268 +0000 UTC m=+66.592667978" lastFinishedPulling="2026-04-16 13:58:27.887578676 +0000 UTC m=+71.615632388" observedRunningTime="2026-04-16 13:58:28.100248121 +0000 UTC m=+71.828301862" watchObservedRunningTime="2026-04-16 13:58:28.100718778 +0000 UTC m=+71.828772504"
Apr 16 13:58:34.089865 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:34.089822 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n" event={"ID":"43b7960d-42b3-4eb4-8770-162bf7a1bd87","Type":"ContainerStarted","Data":"4431a1f84870d3ab63eb0b5e6fdc4a978436535249ce3f470220f39a62d417a2"}
Apr 16 13:58:34.089865 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:34.089863 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n" event={"ID":"43b7960d-42b3-4eb4-8770-162bf7a1bd87","Type":"ContainerStarted","Data":"41b0251ff0baf2849f432895041230328a75079ed86c91e655d67aee38c01a8b"}
Apr 16 13:58:34.107689 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:34.107638 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n" podStartSLOduration=1.442695501 podStartE2EDuration="12.107623693s" podCreationTimestamp="2026-04-16 13:58:22 +0000 UTC" firstStartedPulling="2026-04-16 13:58:22.86608281 +0000 UTC m=+66.594136513" lastFinishedPulling="2026-04-16 13:58:33.531010999 +0000 UTC m=+77.259064705" observedRunningTime="2026-04-16 13:58:34.106966259 +0000 UTC m=+77.835019983" watchObservedRunningTime="2026-04-16 13:58:34.107623693 +0000 UTC m=+77.835677408"
Apr 16 13:58:54.049628 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:54.049590 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-cert\") pod \"ingress-canary-xdwfc\" (UID: \"adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d\") " pod="openshift-ingress-canary/ingress-canary-xdwfc"
Apr 16 13:58:54.050047 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:54.049659 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/69050bd1-ea1f-49d8-8ca8-617d41938670-metrics-tls\") pod \"dns-default-6fksw\" (UID: \"69050bd1-ea1f-49d8-8ca8-617d41938670\") " pod="openshift-dns/dns-default-6fksw"
Apr 16 13:58:54.050047 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:58:54.049744 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:58:54.050047 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:58:54.049754 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:58:54.050047 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:58:54.049858 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-cert podName:adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d nodeName:}" failed. No retries permitted until 2026-04-16 13:59:58.049838833 +0000 UTC m=+161.777892535 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-cert") pod "ingress-canary-xdwfc" (UID: "adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d") : secret "canary-serving-cert" not found
Apr 16 13:58:54.050047 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:58:54.049924 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69050bd1-ea1f-49d8-8ca8-617d41938670-metrics-tls podName:69050bd1-ea1f-49d8-8ca8-617d41938670 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:58.049889933 +0000 UTC m=+161.777943635 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/69050bd1-ea1f-49d8-8ca8-617d41938670-metrics-tls") pod "dns-default-6fksw" (UID: "69050bd1-ea1f-49d8-8ca8-617d41938670") : secret "dns-default-metrics-tls" not found
Apr 16 13:58:59.080082 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:58:59.080054 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-5zjkj"
Apr 16 13:59:26.573729 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:59:26.573675 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs\") pod \"network-metrics-daemon-l5zhh\" (UID: \"9ab97e35-4539-405d-bb91-f30c906963c2\") " pod="openshift-multus/network-metrics-daemon-l5zhh"
Apr 16 13:59:26.574268 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:59:26.573823 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 13:59:26.574268 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:59:26.573886 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs podName:9ab97e35-4539-405d-bb91-f30c906963c2 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:28.57386715 +0000 UTC m=+252.301920870 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs") pod "network-metrics-daemon-l5zhh" (UID: "9ab97e35-4539-405d-bb91-f30c906963c2") : secret "metrics-daemon-secret" not found
Apr 16 13:59:50.082905 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:59:50.082876 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-h7pm6_af33c15c-6386-4fe7-9155-a5c6b6e05ec4/dns-node-resolver/0.log"
Apr 16 13:59:51.281775 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:59:51.281748 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9tz2x_03875a5c-704e-42af-9ea6-ba5a9f181d94/node-ca/0.log"
Apr 16 13:59:53.193729 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:59:53.193686 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-6fksw" podUID="69050bd1-ea1f-49d8-8ca8-617d41938670"
Apr 16 13:59:53.201813 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:59:53.201786 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-xdwfc" podUID="adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d"
Apr 16 13:59:53.266857 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:59:53.266830 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6fksw"
Apr 16 13:59:53.266960 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:59:53.266830 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xdwfc"
Apr 16 13:59:54.868153 ip-10-0-131-61 kubenswrapper[2575]: E0416 13:59:54.868116 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-l5zhh" podUID="9ab97e35-4539-405d-bb91-f30c906963c2"
Apr 16 13:59:58.099830 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:59:58.099787 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/69050bd1-ea1f-49d8-8ca8-617d41938670-metrics-tls\") pod \"dns-default-6fksw\" (UID: \"69050bd1-ea1f-49d8-8ca8-617d41938670\") " pod="openshift-dns/dns-default-6fksw"
Apr 16 13:59:58.100423 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:59:58.099845 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-cert\") pod \"ingress-canary-xdwfc\" (UID: \"adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d\") " pod="openshift-ingress-canary/ingress-canary-xdwfc"
Apr 16 13:59:58.102137 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:59:58.102113 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/69050bd1-ea1f-49d8-8ca8-617d41938670-metrics-tls\") pod \"dns-default-6fksw\" (UID: \"69050bd1-ea1f-49d8-8ca8-617d41938670\") " pod="openshift-dns/dns-default-6fksw"
Apr 16 13:59:58.102230 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:59:58.102198 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d-cert\") pod \"ingress-canary-xdwfc\" (UID: \"adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d\") " pod="openshift-ingress-canary/ingress-canary-xdwfc"
Apr 16 13:59:58.375132 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:59:58.375049 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ftzrr\""
Apr 16 13:59:58.375830 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:59:58.375812 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mv45m\""
Apr 16 13:59:58.378600 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:59:58.378581 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6fksw"
Apr 16 13:59:58.378734 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:59:58.378630 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xdwfc"
Apr 16 13:59:58.508287 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:59:58.508244 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xdwfc"]
Apr 16 13:59:58.512641 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:59:58.512596 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadf72fd1_6177_4cfa_a1c8_0b8bb22dba4d.slice/crio-05dfab9b2d04a605e19dfc58f4110342aa873bd0688116682df412f407c2e144 WatchSource:0}: Error finding container 05dfab9b2d04a605e19dfc58f4110342aa873bd0688116682df412f407c2e144: Status 404 returned error can't find the container with id 05dfab9b2d04a605e19dfc58f4110342aa873bd0688116682df412f407c2e144
Apr 16 13:59:58.521677 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:59:58.521652 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6fksw"]
Apr 16 13:59:58.525572 ip-10-0-131-61 kubenswrapper[2575]: W0416 13:59:58.525544 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69050bd1_ea1f_49d8_8ca8_617d41938670.slice/crio-5323060e3fc35d50ba1d34dcf1066515a8a862cadc587ef0c4c541bfae34d6fe WatchSource:0}: Error finding container 5323060e3fc35d50ba1d34dcf1066515a8a862cadc587ef0c4c541bfae34d6fe: Status 404 returned error can't find the container with id 5323060e3fc35d50ba1d34dcf1066515a8a862cadc587ef0c4c541bfae34d6fe
Apr 16 13:59:59.281199 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:59:59.281113 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xdwfc" event={"ID":"adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d","Type":"ContainerStarted","Data":"05dfab9b2d04a605e19dfc58f4110342aa873bd0688116682df412f407c2e144"}
Apr 16 13:59:59.282255 ip-10-0-131-61 kubenswrapper[2575]: I0416 13:59:59.282217 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6fksw" event={"ID":"69050bd1-ea1f-49d8-8ca8-617d41938670","Type":"ContainerStarted","Data":"5323060e3fc35d50ba1d34dcf1066515a8a862cadc587ef0c4c541bfae34d6fe"}
Apr 16 14:00:00.286774 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:00.286730 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6fksw" event={"ID":"69050bd1-ea1f-49d8-8ca8-617d41938670","Type":"ContainerStarted","Data":"e7e1d30e9128c52232c94049112bfc3f272fa9ae85976f11c70546deee39f12e"}
Apr 16 14:00:01.290823 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:01.290786 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6fksw" event={"ID":"69050bd1-ea1f-49d8-8ca8-617d41938670","Type":"ContainerStarted","Data":"3c78ed2cdf48cd18ec1087a153581a31f06406f8ea9c4c2d86ff6532d4ca484e"}
Apr 16 14:00:01.291288 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:01.290911 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-6fksw"
Apr 16 14:00:01.291989 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:01.291968 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xdwfc" event={"ID":"adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d","Type":"ContainerStarted","Data":"bc502d10d20c04381a8fb7791391c8cda5eb39db4b5ba5ec3e91e95832574651"}
Apr 16 14:00:01.311399 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:01.311350 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6fksw" podStartSLOduration=129.946407034 podStartE2EDuration="2m11.311334393s" podCreationTimestamp="2026-04-16 13:57:50 +0000 UTC" firstStartedPulling="2026-04-16 13:59:58.527912258 +0000 UTC m=+162.255965977" lastFinishedPulling="2026-04-16 13:59:59.892839629 +0000 UTC m=+163.620893336" observedRunningTime="2026-04-16 14:00:01.311142179 +0000 UTC m=+165.039195916" watchObservedRunningTime="2026-04-16 14:00:01.311334393 +0000 UTC m=+165.039388118"
Apr 16 14:00:01.331425 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:01.331379 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-xdwfc" podStartSLOduration=129.320985289 podStartE2EDuration="2m11.331361998s" podCreationTimestamp="2026-04-16 13:57:50 +0000 UTC" firstStartedPulling="2026-04-16 13:59:58.514723071 +0000 UTC m=+162.242776777" lastFinishedPulling="2026-04-16 14:00:00.525099778 +0000 UTC m=+164.253153486" observedRunningTime="2026-04-16 14:00:01.331287294 +0000 UTC m=+165.059341033" watchObservedRunningTime="2026-04-16 14:00:01.331361998 +0000 UTC m=+165.059415722"
Apr 16 14:00:08.850411 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:08.850377 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l5zhh"
Apr 16 14:00:11.298194 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.298167 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6fksw"
Apr 16 14:00:11.367444 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.367405 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-567fc49968-7hrgm"]
Apr 16 14:00:11.370963 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.370940 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-567fc49968-7hrgm"
Apr 16 14:00:11.374754 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.374732 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 14:00:11.374754 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.374745 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 14:00:11.375705 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.375674 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-qt6qg\""
Apr 16 14:00:11.375944 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.375930 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 14:00:11.379183 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.379165 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-slzqv"]
Apr 16 14:00:11.383169 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.383150 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-slzqv"
Apr 16 14:00:11.386644 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.386616 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 14:00:11.386822 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.386807 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 14:00:11.386982 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.386877 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 14:00:11.387092 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.386928 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vbnth\""
Apr 16 14:00:11.387418 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.387243 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 14:00:11.387952 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.387934 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-567fc49968-7hrgm"]
Apr 16 14:00:11.388909 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.388870 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b80ca789-264f-4eb3-8fea-ea24dbc639cd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-slzqv\" (UID: \"b80ca789-264f-4eb3-8fea-ea24dbc639cd\") " pod="openshift-insights/insights-runtime-extractor-slzqv"
Apr 16 14:00:11.389020 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.388924 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c6d6\" (UniqueName: \"kubernetes.io/projected/b80ca789-264f-4eb3-8fea-ea24dbc639cd-kube-api-access-9c6d6\") pod \"insights-runtime-extractor-slzqv\" (UID: \"b80ca789-264f-4eb3-8fea-ea24dbc639cd\") " pod="openshift-insights/insights-runtime-extractor-slzqv"
Apr 16 14:00:11.389020 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.388950 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6cc8cd19-467f-45a6-aedd-568b2ea53b3d-ca-trust-extracted\") pod \"image-registry-567fc49968-7hrgm\" (UID: \"6cc8cd19-467f-45a6-aedd-568b2ea53b3d\") " pod="openshift-image-registry/image-registry-567fc49968-7hrgm"
Apr 16 14:00:11.389020 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.389014 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6cc8cd19-467f-45a6-aedd-568b2ea53b3d-registry-certificates\") pod \"image-registry-567fc49968-7hrgm\" (UID: \"6cc8cd19-467f-45a6-aedd-568b2ea53b3d\") " pod="openshift-image-registry/image-registry-567fc49968-7hrgm"
Apr 16 14:00:11.389180 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.389037 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tvhf\" (UniqueName: \"kubernetes.io/projected/6cc8cd19-467f-45a6-aedd-568b2ea53b3d-kube-api-access-5tvhf\") pod \"image-registry-567fc49968-7hrgm\" (UID: \"6cc8cd19-467f-45a6-aedd-568b2ea53b3d\") " pod="openshift-image-registry/image-registry-567fc49968-7hrgm"
Apr 16 14:00:11.389180 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.389160 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6cc8cd19-467f-45a6-aedd-568b2ea53b3d-image-registry-private-configuration\") pod \"image-registry-567fc49968-7hrgm\" (UID: \"6cc8cd19-467f-45a6-aedd-568b2ea53b3d\") " pod="openshift-image-registry/image-registry-567fc49968-7hrgm"
Apr 16 14:00:11.389312 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.389199 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b80ca789-264f-4eb3-8fea-ea24dbc639cd-crio-socket\") pod \"insights-runtime-extractor-slzqv\" (UID: \"b80ca789-264f-4eb3-8fea-ea24dbc639cd\") " pod="openshift-insights/insights-runtime-extractor-slzqv"
Apr 16 14:00:11.389312 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.389227 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b80ca789-264f-4eb3-8fea-ea24dbc639cd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-slzqv\" (UID: \"b80ca789-264f-4eb3-8fea-ea24dbc639cd\") " pod="openshift-insights/insights-runtime-extractor-slzqv"
Apr 16 14:00:11.389312 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.389254 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6cc8cd19-467f-45a6-aedd-568b2ea53b3d-registry-tls\") pod \"image-registry-567fc49968-7hrgm\" (UID: \"6cc8cd19-467f-45a6-aedd-568b2ea53b3d\") " pod="openshift-image-registry/image-registry-567fc49968-7hrgm"
Apr 16 14:00:11.389466 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.389326 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6cc8cd19-467f-45a6-aedd-568b2ea53b3d-bound-sa-token\") pod \"image-registry-567fc49968-7hrgm\" (UID: \"6cc8cd19-467f-45a6-aedd-568b2ea53b3d\") " pod="openshift-image-registry/image-registry-567fc49968-7hrgm"
Apr 16 14:00:11.389466 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.389354 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b80ca789-264f-4eb3-8fea-ea24dbc639cd-data-volume\") pod \"insights-runtime-extractor-slzqv\" (UID: \"b80ca789-264f-4eb3-8fea-ea24dbc639cd\") " pod="openshift-insights/insights-runtime-extractor-slzqv"
Apr 16 14:00:11.389466 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.389382 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6cc8cd19-467f-45a6-aedd-568b2ea53b3d-installation-pull-secrets\") pod \"image-registry-567fc49968-7hrgm\" (UID: \"6cc8cd19-467f-45a6-aedd-568b2ea53b3d\") " pod="openshift-image-registry/image-registry-567fc49968-7hrgm"
Apr 16 14:00:11.389466 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.389410 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6cc8cd19-467f-45a6-aedd-568b2ea53b3d-trusted-ca\") pod \"image-registry-567fc49968-7hrgm\" (UID: \"6cc8cd19-467f-45a6-aedd-568b2ea53b3d\") " pod="openshift-image-registry/image-registry-567fc49968-7hrgm"
Apr 16 14:00:11.389714 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.389636 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 14:00:11.428952 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.428920 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-slzqv"]
Apr 16 14:00:11.489869 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.489835 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b80ca789-264f-4eb3-8fea-ea24dbc639cd-kube-rbac-proxy-cm\") pod
\"insights-runtime-extractor-slzqv\" (UID: \"b80ca789-264f-4eb3-8fea-ea24dbc639cd\") " pod="openshift-insights/insights-runtime-extractor-slzqv" Apr 16 14:00:11.489869 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.489872 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9c6d6\" (UniqueName: \"kubernetes.io/projected/b80ca789-264f-4eb3-8fea-ea24dbc639cd-kube-api-access-9c6d6\") pod \"insights-runtime-extractor-slzqv\" (UID: \"b80ca789-264f-4eb3-8fea-ea24dbc639cd\") " pod="openshift-insights/insights-runtime-extractor-slzqv" Apr 16 14:00:11.490100 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.489907 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6cc8cd19-467f-45a6-aedd-568b2ea53b3d-ca-trust-extracted\") pod \"image-registry-567fc49968-7hrgm\" (UID: \"6cc8cd19-467f-45a6-aedd-568b2ea53b3d\") " pod="openshift-image-registry/image-registry-567fc49968-7hrgm" Apr 16 14:00:11.490138 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.490118 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6cc8cd19-467f-45a6-aedd-568b2ea53b3d-registry-certificates\") pod \"image-registry-567fc49968-7hrgm\" (UID: \"6cc8cd19-467f-45a6-aedd-568b2ea53b3d\") " pod="openshift-image-registry/image-registry-567fc49968-7hrgm" Apr 16 14:00:11.490183 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.490156 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5tvhf\" (UniqueName: \"kubernetes.io/projected/6cc8cd19-467f-45a6-aedd-568b2ea53b3d-kube-api-access-5tvhf\") pod \"image-registry-567fc49968-7hrgm\" (UID: \"6cc8cd19-467f-45a6-aedd-568b2ea53b3d\") " pod="openshift-image-registry/image-registry-567fc49968-7hrgm" Apr 16 14:00:11.490262 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.490243 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6cc8cd19-467f-45a6-aedd-568b2ea53b3d-image-registry-private-configuration\") pod \"image-registry-567fc49968-7hrgm\" (UID: \"6cc8cd19-467f-45a6-aedd-568b2ea53b3d\") " pod="openshift-image-registry/image-registry-567fc49968-7hrgm" Apr 16 14:00:11.490300 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.490287 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b80ca789-264f-4eb3-8fea-ea24dbc639cd-crio-socket\") pod \"insights-runtime-extractor-slzqv\" (UID: \"b80ca789-264f-4eb3-8fea-ea24dbc639cd\") " pod="openshift-insights/insights-runtime-extractor-slzqv" Apr 16 14:00:11.490341 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.490295 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6cc8cd19-467f-45a6-aedd-568b2ea53b3d-ca-trust-extracted\") pod \"image-registry-567fc49968-7hrgm\" (UID: \"6cc8cd19-467f-45a6-aedd-568b2ea53b3d\") " pod="openshift-image-registry/image-registry-567fc49968-7hrgm" Apr 16 14:00:11.490341 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.490315 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b80ca789-264f-4eb3-8fea-ea24dbc639cd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-slzqv\" (UID: \"b80ca789-264f-4eb3-8fea-ea24dbc639cd\") " pod="openshift-insights/insights-runtime-extractor-slzqv" Apr 16 14:00:11.490431 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.490339 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6cc8cd19-467f-45a6-aedd-568b2ea53b3d-registry-tls\") pod \"image-registry-567fc49968-7hrgm\" (UID: 
\"6cc8cd19-467f-45a6-aedd-568b2ea53b3d\") " pod="openshift-image-registry/image-registry-567fc49968-7hrgm" Apr 16 14:00:11.490431 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.490388 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6cc8cd19-467f-45a6-aedd-568b2ea53b3d-bound-sa-token\") pod \"image-registry-567fc49968-7hrgm\" (UID: \"6cc8cd19-467f-45a6-aedd-568b2ea53b3d\") " pod="openshift-image-registry/image-registry-567fc49968-7hrgm" Apr 16 14:00:11.490678 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.490414 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b80ca789-264f-4eb3-8fea-ea24dbc639cd-data-volume\") pod \"insights-runtime-extractor-slzqv\" (UID: \"b80ca789-264f-4eb3-8fea-ea24dbc639cd\") " pod="openshift-insights/insights-runtime-extractor-slzqv" Apr 16 14:00:11.490780 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.490709 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6cc8cd19-467f-45a6-aedd-568b2ea53b3d-installation-pull-secrets\") pod \"image-registry-567fc49968-7hrgm\" (UID: \"6cc8cd19-467f-45a6-aedd-568b2ea53b3d\") " pod="openshift-image-registry/image-registry-567fc49968-7hrgm" Apr 16 14:00:11.490780 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.490737 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6cc8cd19-467f-45a6-aedd-568b2ea53b3d-trusted-ca\") pod \"image-registry-567fc49968-7hrgm\" (UID: \"6cc8cd19-467f-45a6-aedd-568b2ea53b3d\") " pod="openshift-image-registry/image-registry-567fc49968-7hrgm" Apr 16 14:00:11.490910 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.490837 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" 
(UniqueName: \"kubernetes.io/empty-dir/b80ca789-264f-4eb3-8fea-ea24dbc639cd-data-volume\") pod \"insights-runtime-extractor-slzqv\" (UID: \"b80ca789-264f-4eb3-8fea-ea24dbc639cd\") " pod="openshift-insights/insights-runtime-extractor-slzqv" Apr 16 14:00:11.490910 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.490507 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b80ca789-264f-4eb3-8fea-ea24dbc639cd-crio-socket\") pod \"insights-runtime-extractor-slzqv\" (UID: \"b80ca789-264f-4eb3-8fea-ea24dbc639cd\") " pod="openshift-insights/insights-runtime-extractor-slzqv" Apr 16 14:00:11.490910 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.490527 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b80ca789-264f-4eb3-8fea-ea24dbc639cd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-slzqv\" (UID: \"b80ca789-264f-4eb3-8fea-ea24dbc639cd\") " pod="openshift-insights/insights-runtime-extractor-slzqv" Apr 16 14:00:11.491046 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.490944 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6cc8cd19-467f-45a6-aedd-568b2ea53b3d-registry-certificates\") pod \"image-registry-567fc49968-7hrgm\" (UID: \"6cc8cd19-467f-45a6-aedd-568b2ea53b3d\") " pod="openshift-image-registry/image-registry-567fc49968-7hrgm" Apr 16 14:00:11.492169 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.492142 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6cc8cd19-467f-45a6-aedd-568b2ea53b3d-trusted-ca\") pod \"image-registry-567fc49968-7hrgm\" (UID: \"6cc8cd19-467f-45a6-aedd-568b2ea53b3d\") " pod="openshift-image-registry/image-registry-567fc49968-7hrgm" Apr 16 14:00:11.492859 ip-10-0-131-61 kubenswrapper[2575]: I0416 
14:00:11.492838 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b80ca789-264f-4eb3-8fea-ea24dbc639cd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-slzqv\" (UID: \"b80ca789-264f-4eb3-8fea-ea24dbc639cd\") " pod="openshift-insights/insights-runtime-extractor-slzqv" Apr 16 14:00:11.493050 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.493035 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6cc8cd19-467f-45a6-aedd-568b2ea53b3d-image-registry-private-configuration\") pod \"image-registry-567fc49968-7hrgm\" (UID: \"6cc8cd19-467f-45a6-aedd-568b2ea53b3d\") " pod="openshift-image-registry/image-registry-567fc49968-7hrgm" Apr 16 14:00:11.493097 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.493072 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6cc8cd19-467f-45a6-aedd-568b2ea53b3d-installation-pull-secrets\") pod \"image-registry-567fc49968-7hrgm\" (UID: \"6cc8cd19-467f-45a6-aedd-568b2ea53b3d\") " pod="openshift-image-registry/image-registry-567fc49968-7hrgm" Apr 16 14:00:11.498614 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.498593 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6cc8cd19-467f-45a6-aedd-568b2ea53b3d-bound-sa-token\") pod \"image-registry-567fc49968-7hrgm\" (UID: \"6cc8cd19-467f-45a6-aedd-568b2ea53b3d\") " pod="openshift-image-registry/image-registry-567fc49968-7hrgm" Apr 16 14:00:11.499184 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.499167 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tvhf\" (UniqueName: \"kubernetes.io/projected/6cc8cd19-467f-45a6-aedd-568b2ea53b3d-kube-api-access-5tvhf\") pod 
\"image-registry-567fc49968-7hrgm\" (UID: \"6cc8cd19-467f-45a6-aedd-568b2ea53b3d\") " pod="openshift-image-registry/image-registry-567fc49968-7hrgm" Apr 16 14:00:11.499508 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.499488 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c6d6\" (UniqueName: \"kubernetes.io/projected/b80ca789-264f-4eb3-8fea-ea24dbc639cd-kube-api-access-9c6d6\") pod \"insights-runtime-extractor-slzqv\" (UID: \"b80ca789-264f-4eb3-8fea-ea24dbc639cd\") " pod="openshift-insights/insights-runtime-extractor-slzqv" Apr 16 14:00:11.500218 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.500202 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6cc8cd19-467f-45a6-aedd-568b2ea53b3d-registry-tls\") pod \"image-registry-567fc49968-7hrgm\" (UID: \"6cc8cd19-467f-45a6-aedd-568b2ea53b3d\") " pod="openshift-image-registry/image-registry-567fc49968-7hrgm" Apr 16 14:00:11.681967 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.681845 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-567fc49968-7hrgm" Apr 16 14:00:11.692074 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.692052 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-slzqv" Apr 16 14:00:11.818960 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.818928 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-567fc49968-7hrgm"] Apr 16 14:00:11.822233 ip-10-0-131-61 kubenswrapper[2575]: W0416 14:00:11.822207 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cc8cd19_467f_45a6_aedd_568b2ea53b3d.slice/crio-b7b43dd1f70774543fe9a7a7741c14fc45b7e62c9e003ef07cfb740ae1c472b9 WatchSource:0}: Error finding container b7b43dd1f70774543fe9a7a7741c14fc45b7e62c9e003ef07cfb740ae1c472b9: Status 404 returned error can't find the container with id b7b43dd1f70774543fe9a7a7741c14fc45b7e62c9e003ef07cfb740ae1c472b9 Apr 16 14:00:11.832539 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:11.832516 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-slzqv"] Apr 16 14:00:11.835112 ip-10-0-131-61 kubenswrapper[2575]: W0416 14:00:11.835081 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb80ca789_264f_4eb3_8fea_ea24dbc639cd.slice/crio-bc25627a0bc98eccd5ca4516ac4f9004486ca004715c4642495f820a73721fbe WatchSource:0}: Error finding container bc25627a0bc98eccd5ca4516ac4f9004486ca004715c4642495f820a73721fbe: Status 404 returned error can't find the container with id bc25627a0bc98eccd5ca4516ac4f9004486ca004715c4642495f820a73721fbe Apr 16 14:00:12.325968 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:12.325925 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-567fc49968-7hrgm" event={"ID":"6cc8cd19-467f-45a6-aedd-568b2ea53b3d","Type":"ContainerStarted","Data":"8d2c6ac598f96426732bc218b622db9be4ffc82f6e2e851ffaa4347440a665e0"} Apr 16 14:00:12.325968 ip-10-0-131-61 
kubenswrapper[2575]: I0416 14:00:12.325977 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-567fc49968-7hrgm" event={"ID":"6cc8cd19-467f-45a6-aedd-568b2ea53b3d","Type":"ContainerStarted","Data":"b7b43dd1f70774543fe9a7a7741c14fc45b7e62c9e003ef07cfb740ae1c472b9"} Apr 16 14:00:12.326464 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:12.326051 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-567fc49968-7hrgm" Apr 16 14:00:12.327224 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:12.327203 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-slzqv" event={"ID":"b80ca789-264f-4eb3-8fea-ea24dbc639cd","Type":"ContainerStarted","Data":"ea6508ed7514242cc8a3c5559683bbcabb16d075877006ba132e26ce04391bfd"} Apr 16 14:00:12.327308 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:12.327229 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-slzqv" event={"ID":"b80ca789-264f-4eb3-8fea-ea24dbc639cd","Type":"ContainerStarted","Data":"bc25627a0bc98eccd5ca4516ac4f9004486ca004715c4642495f820a73721fbe"} Apr 16 14:00:13.331151 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:13.331111 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-slzqv" event={"ID":"b80ca789-264f-4eb3-8fea-ea24dbc639cd","Type":"ContainerStarted","Data":"e64c7182e3848e7f85929eecfdbaaa12959b806d583edff11420d3a595b6ae4b"} Apr 16 14:00:16.340199 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:16.340167 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-slzqv" event={"ID":"b80ca789-264f-4eb3-8fea-ea24dbc639cd","Type":"ContainerStarted","Data":"adecff7318afc8fcdb0e1424421fcdbd79cd811937b785dc7070feee211dea75"} Apr 16 14:00:16.357163 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:16.357113 
2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-567fc49968-7hrgm" podStartSLOduration=5.357096514 podStartE2EDuration="5.357096514s" podCreationTimestamp="2026-04-16 14:00:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:00:12.362926331 +0000 UTC m=+176.090980048" watchObservedRunningTime="2026-04-16 14:00:16.357096514 +0000 UTC m=+180.085150236" Apr 16 14:00:16.357862 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:16.357832 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-slzqv" podStartSLOduration=1.642559051 podStartE2EDuration="5.357823469s" podCreationTimestamp="2026-04-16 14:00:11 +0000 UTC" firstStartedPulling="2026-04-16 14:00:11.878978395 +0000 UTC m=+175.607032114" lastFinishedPulling="2026-04-16 14:00:15.594242826 +0000 UTC m=+179.322296532" observedRunningTime="2026-04-16 14:00:16.356674384 +0000 UTC m=+180.084728110" watchObservedRunningTime="2026-04-16 14:00:16.357823469 +0000 UTC m=+180.085877193" Apr 16 14:00:25.987209 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:25.987171 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-ppk6r"] Apr 16 14:00:25.992113 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:25.992089 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-ppk6r" Apr 16 14:00:25.994661 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:25.994634 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 14:00:25.995021 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:25.995007 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 14:00:25.995077 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:25.995045 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 14:00:25.995194 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:25.995175 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 14:00:25.995837 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:25.995819 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2gbx5\"" Apr 16 14:00:25.995954 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:25.995920 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 14:00:25.995954 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:25.995927 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 14:00:26.087591 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:26.087551 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1de51b5e-b280-4669-ae37-f4318fdfda79-node-exporter-wtmp\") pod \"node-exporter-ppk6r\" (UID: \"1de51b5e-b280-4669-ae37-f4318fdfda79\") " 
pod="openshift-monitoring/node-exporter-ppk6r" Apr 16 14:00:26.087591 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:26.087589 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1de51b5e-b280-4669-ae37-f4318fdfda79-root\") pod \"node-exporter-ppk6r\" (UID: \"1de51b5e-b280-4669-ae37-f4318fdfda79\") " pod="openshift-monitoring/node-exporter-ppk6r" Apr 16 14:00:26.087847 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:26.087611 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1de51b5e-b280-4669-ae37-f4318fdfda79-node-exporter-textfile\") pod \"node-exporter-ppk6r\" (UID: \"1de51b5e-b280-4669-ae37-f4318fdfda79\") " pod="openshift-monitoring/node-exporter-ppk6r" Apr 16 14:00:26.087847 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:26.087633 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1de51b5e-b280-4669-ae37-f4318fdfda79-node-exporter-accelerators-collector-config\") pod \"node-exporter-ppk6r\" (UID: \"1de51b5e-b280-4669-ae37-f4318fdfda79\") " pod="openshift-monitoring/node-exporter-ppk6r" Apr 16 14:00:26.087847 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:26.087663 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxnr8\" (UniqueName: \"kubernetes.io/projected/1de51b5e-b280-4669-ae37-f4318fdfda79-kube-api-access-vxnr8\") pod \"node-exporter-ppk6r\" (UID: \"1de51b5e-b280-4669-ae37-f4318fdfda79\") " pod="openshift-monitoring/node-exporter-ppk6r" Apr 16 14:00:26.087847 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:26.087694 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/1de51b5e-b280-4669-ae37-f4318fdfda79-sys\") pod \"node-exporter-ppk6r\" (UID: \"1de51b5e-b280-4669-ae37-f4318fdfda79\") " pod="openshift-monitoring/node-exporter-ppk6r" Apr 16 14:00:26.087847 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:26.087757 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1de51b5e-b280-4669-ae37-f4318fdfda79-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ppk6r\" (UID: \"1de51b5e-b280-4669-ae37-f4318fdfda79\") " pod="openshift-monitoring/node-exporter-ppk6r" Apr 16 14:00:26.087847 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:26.087808 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1de51b5e-b280-4669-ae37-f4318fdfda79-node-exporter-tls\") pod \"node-exporter-ppk6r\" (UID: \"1de51b5e-b280-4669-ae37-f4318fdfda79\") " pod="openshift-monitoring/node-exporter-ppk6r" Apr 16 14:00:26.087847 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:26.087834 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1de51b5e-b280-4669-ae37-f4318fdfda79-metrics-client-ca\") pod \"node-exporter-ppk6r\" (UID: \"1de51b5e-b280-4669-ae37-f4318fdfda79\") " pod="openshift-monitoring/node-exporter-ppk6r" Apr 16 14:00:26.188866 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:26.188833 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1de51b5e-b280-4669-ae37-f4318fdfda79-node-exporter-wtmp\") pod \"node-exporter-ppk6r\" (UID: \"1de51b5e-b280-4669-ae37-f4318fdfda79\") " pod="openshift-monitoring/node-exporter-ppk6r" Apr 16 14:00:26.189052 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:26.188873 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1de51b5e-b280-4669-ae37-f4318fdfda79-root\") pod \"node-exporter-ppk6r\" (UID: \"1de51b5e-b280-4669-ae37-f4318fdfda79\") " pod="openshift-monitoring/node-exporter-ppk6r" Apr 16 14:00:26.189052 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:26.188905 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1de51b5e-b280-4669-ae37-f4318fdfda79-node-exporter-textfile\") pod \"node-exporter-ppk6r\" (UID: \"1de51b5e-b280-4669-ae37-f4318fdfda79\") " pod="openshift-monitoring/node-exporter-ppk6r" Apr 16 14:00:26.189052 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:26.188928 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1de51b5e-b280-4669-ae37-f4318fdfda79-node-exporter-accelerators-collector-config\") pod \"node-exporter-ppk6r\" (UID: \"1de51b5e-b280-4669-ae37-f4318fdfda79\") " pod="openshift-monitoring/node-exporter-ppk6r" Apr 16 14:00:26.189052 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:26.188950 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxnr8\" (UniqueName: \"kubernetes.io/projected/1de51b5e-b280-4669-ae37-f4318fdfda79-kube-api-access-vxnr8\") pod \"node-exporter-ppk6r\" (UID: \"1de51b5e-b280-4669-ae37-f4318fdfda79\") " pod="openshift-monitoring/node-exporter-ppk6r" Apr 16 14:00:26.189052 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:26.188971 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1de51b5e-b280-4669-ae37-f4318fdfda79-sys\") pod \"node-exporter-ppk6r\" (UID: \"1de51b5e-b280-4669-ae37-f4318fdfda79\") " pod="openshift-monitoring/node-exporter-ppk6r" Apr 16 14:00:26.189052 ip-10-0-131-61 
kubenswrapper[2575]: I0416 14:00:26.188977 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1de51b5e-b280-4669-ae37-f4318fdfda79-root\") pod \"node-exporter-ppk6r\" (UID: \"1de51b5e-b280-4669-ae37-f4318fdfda79\") " pod="openshift-monitoring/node-exporter-ppk6r"
Apr 16 14:00:26.189052 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:26.189001 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1de51b5e-b280-4669-ae37-f4318fdfda79-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ppk6r\" (UID: \"1de51b5e-b280-4669-ae37-f4318fdfda79\") " pod="openshift-monitoring/node-exporter-ppk6r"
Apr 16 14:00:26.189052 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:26.189050 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1de51b5e-b280-4669-ae37-f4318fdfda79-node-exporter-wtmp\") pod \"node-exporter-ppk6r\" (UID: \"1de51b5e-b280-4669-ae37-f4318fdfda79\") " pod="openshift-monitoring/node-exporter-ppk6r"
Apr 16 14:00:26.189374 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:26.189087 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1de51b5e-b280-4669-ae37-f4318fdfda79-node-exporter-tls\") pod \"node-exporter-ppk6r\" (UID: \"1de51b5e-b280-4669-ae37-f4318fdfda79\") " pod="openshift-monitoring/node-exporter-ppk6r"
Apr 16 14:00:26.189374 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:26.189112 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1de51b5e-b280-4669-ae37-f4318fdfda79-sys\") pod \"node-exporter-ppk6r\" (UID: \"1de51b5e-b280-4669-ae37-f4318fdfda79\") " pod="openshift-monitoring/node-exporter-ppk6r"
Apr 16 14:00:26.189374 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:26.189122 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1de51b5e-b280-4669-ae37-f4318fdfda79-metrics-client-ca\") pod \"node-exporter-ppk6r\" (UID: \"1de51b5e-b280-4669-ae37-f4318fdfda79\") " pod="openshift-monitoring/node-exporter-ppk6r"
Apr 16 14:00:26.189374 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:26.189286 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1de51b5e-b280-4669-ae37-f4318fdfda79-node-exporter-textfile\") pod \"node-exporter-ppk6r\" (UID: \"1de51b5e-b280-4669-ae37-f4318fdfda79\") " pod="openshift-monitoring/node-exporter-ppk6r"
Apr 16 14:00:26.189596 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:26.189578 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1de51b5e-b280-4669-ae37-f4318fdfda79-node-exporter-accelerators-collector-config\") pod \"node-exporter-ppk6r\" (UID: \"1de51b5e-b280-4669-ae37-f4318fdfda79\") " pod="openshift-monitoring/node-exporter-ppk6r"
Apr 16 14:00:26.190226 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:26.190203 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1de51b5e-b280-4669-ae37-f4318fdfda79-metrics-client-ca\") pod \"node-exporter-ppk6r\" (UID: \"1de51b5e-b280-4669-ae37-f4318fdfda79\") " pod="openshift-monitoring/node-exporter-ppk6r"
Apr 16 14:00:26.191381 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:26.191348 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1de51b5e-b280-4669-ae37-f4318fdfda79-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ppk6r\" (UID: \"1de51b5e-b280-4669-ae37-f4318fdfda79\") " pod="openshift-monitoring/node-exporter-ppk6r"
Apr 16 14:00:26.191478 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:26.191424 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1de51b5e-b280-4669-ae37-f4318fdfda79-node-exporter-tls\") pod \"node-exporter-ppk6r\" (UID: \"1de51b5e-b280-4669-ae37-f4318fdfda79\") " pod="openshift-monitoring/node-exporter-ppk6r"
Apr 16 14:00:26.196244 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:26.196215 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxnr8\" (UniqueName: \"kubernetes.io/projected/1de51b5e-b280-4669-ae37-f4318fdfda79-kube-api-access-vxnr8\") pod \"node-exporter-ppk6r\" (UID: \"1de51b5e-b280-4669-ae37-f4318fdfda79\") " pod="openshift-monitoring/node-exporter-ppk6r"
Apr 16 14:00:26.301565 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:26.301532 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ppk6r"
Apr 16 14:00:26.310484 ip-10-0-131-61 kubenswrapper[2575]: W0416 14:00:26.310454 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1de51b5e_b280_4669_ae37_f4318fdfda79.slice/crio-50b8f9fce1cd2fab39e7a50ad6f0c0142d6cc7a9019f196e09de26332cd7d525 WatchSource:0}: Error finding container 50b8f9fce1cd2fab39e7a50ad6f0c0142d6cc7a9019f196e09de26332cd7d525: Status 404 returned error can't find the container with id 50b8f9fce1cd2fab39e7a50ad6f0c0142d6cc7a9019f196e09de26332cd7d525
Apr 16 14:00:26.367072 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:26.367032 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ppk6r" event={"ID":"1de51b5e-b280-4669-ae37-f4318fdfda79","Type":"ContainerStarted","Data":"50b8f9fce1cd2fab39e7a50ad6f0c0142d6cc7a9019f196e09de26332cd7d525"}
Apr 16 14:00:28.373980 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:28.373941 2575 generic.go:358] "Generic (PLEG): container finished" podID="1de51b5e-b280-4669-ae37-f4318fdfda79" containerID="f19f3197b478f5d65bccdc5134b0db3829daa38433acb18ada33edfaf08ae8c1" exitCode=0
Apr 16 14:00:28.373980 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:28.373984 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ppk6r" event={"ID":"1de51b5e-b280-4669-ae37-f4318fdfda79","Type":"ContainerDied","Data":"f19f3197b478f5d65bccdc5134b0db3829daa38433acb18ada33edfaf08ae8c1"}
Apr 16 14:00:29.378717 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:29.378681 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ppk6r" event={"ID":"1de51b5e-b280-4669-ae37-f4318fdfda79","Type":"ContainerStarted","Data":"be85bed6c7ff4ae400e7d370b308a0ffdb27e48466c09d89f2d43911f5600258"}
Apr 16 14:00:29.378717 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:29.378717 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ppk6r" event={"ID":"1de51b5e-b280-4669-ae37-f4318fdfda79","Type":"ContainerStarted","Data":"484c70a4322dff192bbb6d556cdaca025bb0ecba55c7eb639a38acb25783a7e7"}
Apr 16 14:00:29.398451 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:29.398400 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-ppk6r" podStartSLOduration=3.200021691 podStartE2EDuration="4.398386494s" podCreationTimestamp="2026-04-16 14:00:25 +0000 UTC" firstStartedPulling="2026-04-16 14:00:26.312418056 +0000 UTC m=+190.040471759" lastFinishedPulling="2026-04-16 14:00:27.510782844 +0000 UTC m=+191.238836562" observedRunningTime="2026-04-16 14:00:29.396851036 +0000 UTC m=+193.124904762" watchObservedRunningTime="2026-04-16 14:00:29.398386494 +0000 UTC m=+193.126440219"
Apr 16 14:00:30.408804 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.408762 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"]
Apr 16 14:00:30.411547 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.411531 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"
Apr 16 14:00:30.414130 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.414105 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 14:00:30.414266 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.414158 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 16 14:00:30.415269 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.415237 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 16 14:00:30.415394 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.415269 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-vkxd9\""
Apr 16 14:00:30.415394 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.415333 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-2hnpt0rlmerif\""
Apr 16 14:00:30.415394 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.415337 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 16 14:00:30.422844 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.422822 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"]
Apr 16 14:00:30.522789 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.522754 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3adc34d0-b3d2-4162-a395-be904e0f2cbe-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6dd6c4dfc8-pxgnk\" (UID: \"3adc34d0-b3d2-4162-a395-be904e0f2cbe\") " pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"
Apr 16 14:00:30.522996 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.522813 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw6lw\" (UniqueName: \"kubernetes.io/projected/3adc34d0-b3d2-4162-a395-be904e0f2cbe-kube-api-access-bw6lw\") pod \"metrics-server-6dd6c4dfc8-pxgnk\" (UID: \"3adc34d0-b3d2-4162-a395-be904e0f2cbe\") " pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"
Apr 16 14:00:30.522996 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.522847 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3adc34d0-b3d2-4162-a395-be904e0f2cbe-secret-metrics-server-client-certs\") pod \"metrics-server-6dd6c4dfc8-pxgnk\" (UID: \"3adc34d0-b3d2-4162-a395-be904e0f2cbe\") " pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"
Apr 16 14:00:30.522996 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.522865 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3adc34d0-b3d2-4162-a395-be904e0f2cbe-audit-log\") pod \"metrics-server-6dd6c4dfc8-pxgnk\" (UID: \"3adc34d0-b3d2-4162-a395-be904e0f2cbe\") " pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"
Apr 16 14:00:30.522996 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.522883 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3adc34d0-b3d2-4162-a395-be904e0f2cbe-client-ca-bundle\") pod \"metrics-server-6dd6c4dfc8-pxgnk\" (UID: \"3adc34d0-b3d2-4162-a395-be904e0f2cbe\") " pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"
Apr 16 14:00:30.522996 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.522956 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3adc34d0-b3d2-4162-a395-be904e0f2cbe-metrics-server-audit-profiles\") pod \"metrics-server-6dd6c4dfc8-pxgnk\" (UID: \"3adc34d0-b3d2-4162-a395-be904e0f2cbe\") " pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"
Apr 16 14:00:30.523159 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.523001 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3adc34d0-b3d2-4162-a395-be904e0f2cbe-secret-metrics-server-tls\") pod \"metrics-server-6dd6c4dfc8-pxgnk\" (UID: \"3adc34d0-b3d2-4162-a395-be904e0f2cbe\") " pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"
Apr 16 14:00:30.623743 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.623712 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3adc34d0-b3d2-4162-a395-be904e0f2cbe-secret-metrics-server-tls\") pod \"metrics-server-6dd6c4dfc8-pxgnk\" (UID: \"3adc34d0-b3d2-4162-a395-be904e0f2cbe\") " pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"
Apr 16 14:00:30.623850 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.623751 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3adc34d0-b3d2-4162-a395-be904e0f2cbe-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6dd6c4dfc8-pxgnk\" (UID: \"3adc34d0-b3d2-4162-a395-be904e0f2cbe\") " pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"
Apr 16 14:00:30.623850 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.623815 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bw6lw\" (UniqueName: \"kubernetes.io/projected/3adc34d0-b3d2-4162-a395-be904e0f2cbe-kube-api-access-bw6lw\") pod \"metrics-server-6dd6c4dfc8-pxgnk\" (UID: \"3adc34d0-b3d2-4162-a395-be904e0f2cbe\") " pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"
Apr 16 14:00:30.623850 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.623840 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3adc34d0-b3d2-4162-a395-be904e0f2cbe-secret-metrics-server-client-certs\") pod \"metrics-server-6dd6c4dfc8-pxgnk\" (UID: \"3adc34d0-b3d2-4162-a395-be904e0f2cbe\") " pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"
Apr 16 14:00:30.624026 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.623861 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3adc34d0-b3d2-4162-a395-be904e0f2cbe-audit-log\") pod \"metrics-server-6dd6c4dfc8-pxgnk\" (UID: \"3adc34d0-b3d2-4162-a395-be904e0f2cbe\") " pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"
Apr 16 14:00:30.624026 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.623907 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3adc34d0-b3d2-4162-a395-be904e0f2cbe-client-ca-bundle\") pod \"metrics-server-6dd6c4dfc8-pxgnk\" (UID: \"3adc34d0-b3d2-4162-a395-be904e0f2cbe\") " pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"
Apr 16 14:00:30.624026 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.623944 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3adc34d0-b3d2-4162-a395-be904e0f2cbe-metrics-server-audit-profiles\") pod \"metrics-server-6dd6c4dfc8-pxgnk\" (UID: \"3adc34d0-b3d2-4162-a395-be904e0f2cbe\") " pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"
Apr 16 14:00:30.624345 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.624315 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3adc34d0-b3d2-4162-a395-be904e0f2cbe-audit-log\") pod \"metrics-server-6dd6c4dfc8-pxgnk\" (UID: \"3adc34d0-b3d2-4162-a395-be904e0f2cbe\") " pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"
Apr 16 14:00:30.624693 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.624674 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3adc34d0-b3d2-4162-a395-be904e0f2cbe-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6dd6c4dfc8-pxgnk\" (UID: \"3adc34d0-b3d2-4162-a395-be904e0f2cbe\") " pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"
Apr 16 14:00:30.625084 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.625059 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3adc34d0-b3d2-4162-a395-be904e0f2cbe-metrics-server-audit-profiles\") pod \"metrics-server-6dd6c4dfc8-pxgnk\" (UID: \"3adc34d0-b3d2-4162-a395-be904e0f2cbe\") " pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"
Apr 16 14:00:30.626416 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.626393 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3adc34d0-b3d2-4162-a395-be904e0f2cbe-secret-metrics-server-tls\") pod \"metrics-server-6dd6c4dfc8-pxgnk\" (UID: \"3adc34d0-b3d2-4162-a395-be904e0f2cbe\") " pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"
Apr 16 14:00:30.626564 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.626543 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3adc34d0-b3d2-4162-a395-be904e0f2cbe-client-ca-bundle\") pod \"metrics-server-6dd6c4dfc8-pxgnk\" (UID: \"3adc34d0-b3d2-4162-a395-be904e0f2cbe\") " pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"
Apr 16 14:00:30.626608 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.626576 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3adc34d0-b3d2-4162-a395-be904e0f2cbe-secret-metrics-server-client-certs\") pod \"metrics-server-6dd6c4dfc8-pxgnk\" (UID: \"3adc34d0-b3d2-4162-a395-be904e0f2cbe\") " pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"
Apr 16 14:00:30.632329 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.632296 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw6lw\" (UniqueName: \"kubernetes.io/projected/3adc34d0-b3d2-4162-a395-be904e0f2cbe-kube-api-access-bw6lw\") pod \"metrics-server-6dd6c4dfc8-pxgnk\" (UID: \"3adc34d0-b3d2-4162-a395-be904e0f2cbe\") " pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"
Apr 16 14:00:30.721002 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.720907 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"
Apr 16 14:00:30.842049 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:30.842020 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"]
Apr 16 14:00:30.845504 ip-10-0-131-61 kubenswrapper[2575]: W0416 14:00:30.845476 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3adc34d0_b3d2_4162_a395_be904e0f2cbe.slice/crio-67d021fd84e4e3745cc57560e898ddc7ea5736c56f6ba2f6d1499d143af1f80d WatchSource:0}: Error finding container 67d021fd84e4e3745cc57560e898ddc7ea5736c56f6ba2f6d1499d143af1f80d: Status 404 returned error can't find the container with id 67d021fd84e4e3745cc57560e898ddc7ea5736c56f6ba2f6d1499d143af1f80d
Apr 16 14:00:31.385273 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:31.385234 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk" event={"ID":"3adc34d0-b3d2-4162-a395-be904e0f2cbe","Type":"ContainerStarted","Data":"67d021fd84e4e3745cc57560e898ddc7ea5736c56f6ba2f6d1499d143af1f80d"}
Apr 16 14:00:33.335876 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:33.335841 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-567fc49968-7hrgm"
Apr 16 14:00:34.395603 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:34.395521 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk" event={"ID":"3adc34d0-b3d2-4162-a395-be904e0f2cbe","Type":"ContainerStarted","Data":"698617095faf5c4b512c3a849616643adfa5cab20b9a809336c1226b91c362b6"}
Apr 16 14:00:34.438284 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:34.438232 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk" podStartSLOduration=1.146822316 podStartE2EDuration="4.438217066s" podCreationTimestamp="2026-04-16 14:00:30 +0000 UTC" firstStartedPulling="2026-04-16 14:00:30.847422151 +0000 UTC m=+194.575475854" lastFinishedPulling="2026-04-16 14:00:34.138816887 +0000 UTC m=+197.866870604" observedRunningTime="2026-04-16 14:00:34.431249748 +0000 UTC m=+198.159303473" watchObservedRunningTime="2026-04-16 14:00:34.438217066 +0000 UTC m=+198.166270791"
Apr 16 14:00:50.722038 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:50.722003 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"
Apr 16 14:00:50.722038 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:00:50.722048 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"
Apr 16 14:01:01.335171 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:01:01.335135 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-6dd6c4dfc8-pxgnk_3adc34d0-b3d2-4162-a395-be904e0f2cbe/metrics-server/0.log"
Apr 16 14:01:02.935632 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:01:02.935605 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ppk6r_1de51b5e-b280-4669-ae37-f4318fdfda79/init-textfile/0.log"
Apr 16 14:01:03.136329 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:01:03.136303 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ppk6r_1de51b5e-b280-4669-ae37-f4318fdfda79/node-exporter/0.log"
Apr 16 14:01:03.335281 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:01:03.335250 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ppk6r_1de51b5e-b280-4669-ae37-f4318fdfda79/kube-rbac-proxy/0.log"
Apr 16 14:01:08.535252 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:01:08.535220 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6fksw_69050bd1-ea1f-49d8-8ca8-617d41938670/dns/0.log"
Apr 16 14:01:08.735989 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:01:08.735956 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6fksw_69050bd1-ea1f-49d8-8ca8-617d41938670/kube-rbac-proxy/0.log"
Apr 16 14:01:09.735567 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:01:09.735545 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-h7pm6_af33c15c-6386-4fe7-9155-a5c6b6e05ec4/dns-node-resolver/0.log"
Apr 16 14:01:10.728470 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:01:10.728441 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"
Apr 16 14:01:10.733450 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:01:10.733427 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6dd6c4dfc8-pxgnk"
Apr 16 14:01:12.657599 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:01:12.657528 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n" podUID="43b7960d-42b3-4eb4-8770-162bf7a1bd87" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 14:01:22.658064 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:01:22.658024 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n" podUID="43b7960d-42b3-4eb4-8770-162bf7a1bd87" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 14:01:28.575453 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:01:28.575399 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs\") pod \"network-metrics-daemon-l5zhh\" (UID: \"9ab97e35-4539-405d-bb91-f30c906963c2\") " pod="openshift-multus/network-metrics-daemon-l5zhh"
Apr 16 14:01:28.577840 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:01:28.577814 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9ab97e35-4539-405d-bb91-f30c906963c2-metrics-certs\") pod \"network-metrics-daemon-l5zhh\" (UID: \"9ab97e35-4539-405d-bb91-f30c906963c2\") " pod="openshift-multus/network-metrics-daemon-l5zhh"
Apr 16 14:01:28.654121 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:01:28.654089 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jz268\""
Apr 16 14:01:28.662249 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:01:28.662228 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l5zhh"
Apr 16 14:01:28.783493 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:01:28.783461 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-l5zhh"]
Apr 16 14:01:28.786528 ip-10-0-131-61 kubenswrapper[2575]: W0416 14:01:28.786501 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ab97e35_4539_405d_bb91_f30c906963c2.slice/crio-bca096a39161a84226e654e3de7d7486d80882638446ccb1bffeb50d20f4b382 WatchSource:0}: Error finding container bca096a39161a84226e654e3de7d7486d80882638446ccb1bffeb50d20f4b382: Status 404 returned error can't find the container with id bca096a39161a84226e654e3de7d7486d80882638446ccb1bffeb50d20f4b382
Apr 16 14:01:29.536119 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:01:29.536023 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l5zhh" event={"ID":"9ab97e35-4539-405d-bb91-f30c906963c2","Type":"ContainerStarted","Data":"bca096a39161a84226e654e3de7d7486d80882638446ccb1bffeb50d20f4b382"}
Apr 16 14:01:30.540212 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:01:30.540174 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l5zhh" event={"ID":"9ab97e35-4539-405d-bb91-f30c906963c2","Type":"ContainerStarted","Data":"a6546f93996bfc489bfd59d004fded4b4878a452c041269b3acd991005edf3f0"}
Apr 16 14:01:30.540212 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:01:30.540216 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l5zhh" event={"ID":"9ab97e35-4539-405d-bb91-f30c906963c2","Type":"ContainerStarted","Data":"c33ac5882aa9742703d493d560f4c6671189211edd04b7bc12fb86b2200d1432"}
Apr 16 14:01:30.556134 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:01:30.556051 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-l5zhh" podStartSLOduration=253.428370045 podStartE2EDuration="4m14.556031803s" podCreationTimestamp="2026-04-16 13:57:16 +0000 UTC" firstStartedPulling="2026-04-16 14:01:28.788395545 +0000 UTC m=+252.516449249" lastFinishedPulling="2026-04-16 14:01:29.916057305 +0000 UTC m=+253.644111007" observedRunningTime="2026-04-16 14:01:30.55546123 +0000 UTC m=+254.283514955" watchObservedRunningTime="2026-04-16 14:01:30.556031803 +0000 UTC m=+254.284085529"
Apr 16 14:01:32.657812 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:01:32.657768 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n" podUID="43b7960d-42b3-4eb4-8770-162bf7a1bd87" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 14:01:32.658225 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:01:32.657833 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n"
Apr 16 14:01:32.658328 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:01:32.658300 2575 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"4431a1f84870d3ab63eb0b5e6fdc4a978436535249ce3f470220f39a62d417a2"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 16 14:01:32.658365 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:01:32.658348 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n" podUID="43b7960d-42b3-4eb4-8770-162bf7a1bd87" containerName="service-proxy" containerID="cri-o://4431a1f84870d3ab63eb0b5e6fdc4a978436535249ce3f470220f39a62d417a2" gracePeriod=30
Apr 16 14:01:33.551228 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:01:33.551189 2575 generic.go:358] "Generic (PLEG): container finished" podID="43b7960d-42b3-4eb4-8770-162bf7a1bd87" containerID="4431a1f84870d3ab63eb0b5e6fdc4a978436535249ce3f470220f39a62d417a2" exitCode=2
Apr 16 14:01:33.551396 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:01:33.551257 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n" event={"ID":"43b7960d-42b3-4eb4-8770-162bf7a1bd87","Type":"ContainerDied","Data":"4431a1f84870d3ab63eb0b5e6fdc4a978436535249ce3f470220f39a62d417a2"}
Apr 16 14:01:33.551396 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:01:33.551291 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-645c6d6557-jzr5n" event={"ID":"43b7960d-42b3-4eb4-8770-162bf7a1bd87","Type":"ContainerStarted","Data":"124217f0b4e3770186400149f95625b03fd37a5cb5b9ab3189c451b5f250bc68"}
Apr 16 14:02:16.744992 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:02:16.744960 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7c7s_ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff/ovn-acl-logging/0.log"
Apr 16 14:02:16.745557 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:02:16.745283 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7c7s_ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff/ovn-acl-logging/0.log"
Apr 16 14:02:16.747933 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:02:16.747915 2575 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 14:03:59.179078 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:03:59.179043 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-4xds7"]
Apr 16 14:03:59.181116 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:03:59.181098 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4xds7"
Apr 16 14:03:59.183186 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:03:59.183165 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 14:03:59.189009 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:03:59.188985 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-4xds7"]
Apr 16 14:03:59.236655 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:03:59.236612 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f5b00d31-4f15-45a9-8cf4-a8506d766be0-dbus\") pod \"global-pull-secret-syncer-4xds7\" (UID: \"f5b00d31-4f15-45a9-8cf4-a8506d766be0\") " pod="kube-system/global-pull-secret-syncer-4xds7"
Apr 16 14:03:59.236655 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:03:59.236656 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f5b00d31-4f15-45a9-8cf4-a8506d766be0-original-pull-secret\") pod \"global-pull-secret-syncer-4xds7\" (UID: \"f5b00d31-4f15-45a9-8cf4-a8506d766be0\") " pod="kube-system/global-pull-secret-syncer-4xds7"
Apr 16 14:03:59.236872 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:03:59.236722 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f5b00d31-4f15-45a9-8cf4-a8506d766be0-kubelet-config\") pod \"global-pull-secret-syncer-4xds7\" (UID: \"f5b00d31-4f15-45a9-8cf4-a8506d766be0\") " pod="kube-system/global-pull-secret-syncer-4xds7"
Apr 16 14:03:59.338029 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:03:59.337990 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f5b00d31-4f15-45a9-8cf4-a8506d766be0-kubelet-config\") pod \"global-pull-secret-syncer-4xds7\" (UID: \"f5b00d31-4f15-45a9-8cf4-a8506d766be0\") " pod="kube-system/global-pull-secret-syncer-4xds7"
Apr 16 14:03:59.338029 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:03:59.338032 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f5b00d31-4f15-45a9-8cf4-a8506d766be0-dbus\") pod \"global-pull-secret-syncer-4xds7\" (UID: \"f5b00d31-4f15-45a9-8cf4-a8506d766be0\") " pod="kube-system/global-pull-secret-syncer-4xds7"
Apr 16 14:03:59.338263 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:03:59.338052 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f5b00d31-4f15-45a9-8cf4-a8506d766be0-original-pull-secret\") pod \"global-pull-secret-syncer-4xds7\" (UID: \"f5b00d31-4f15-45a9-8cf4-a8506d766be0\") " pod="kube-system/global-pull-secret-syncer-4xds7"
Apr 16 14:03:59.338263 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:03:59.338131 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f5b00d31-4f15-45a9-8cf4-a8506d766be0-kubelet-config\") pod \"global-pull-secret-syncer-4xds7\" (UID: \"f5b00d31-4f15-45a9-8cf4-a8506d766be0\") " pod="kube-system/global-pull-secret-syncer-4xds7"
Apr 16 14:03:59.338263 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:03:59.338185 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f5b00d31-4f15-45a9-8cf4-a8506d766be0-dbus\") pod \"global-pull-secret-syncer-4xds7\" (UID: \"f5b00d31-4f15-45a9-8cf4-a8506d766be0\") " pod="kube-system/global-pull-secret-syncer-4xds7"
Apr 16 14:03:59.340390 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:03:59.340371 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f5b00d31-4f15-45a9-8cf4-a8506d766be0-original-pull-secret\") pod \"global-pull-secret-syncer-4xds7\" (UID: \"f5b00d31-4f15-45a9-8cf4-a8506d766be0\") " pod="kube-system/global-pull-secret-syncer-4xds7"
Apr 16 14:03:59.490171 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:03:59.490076 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4xds7"
Apr 16 14:03:59.607829 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:03:59.607793 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-4xds7"]
Apr 16 14:03:59.610689 ip-10-0-131-61 kubenswrapper[2575]: W0416 14:03:59.610658 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5b00d31_4f15_45a9_8cf4_a8506d766be0.slice/crio-ac24f3c805143c7c562344214d86433bc833b9d48361e39025cfe0b10d7865ff WatchSource:0}: Error finding container ac24f3c805143c7c562344214d86433bc833b9d48361e39025cfe0b10d7865ff: Status 404 returned error can't find the container with id ac24f3c805143c7c562344214d86433bc833b9d48361e39025cfe0b10d7865ff
Apr 16 14:03:59.612255 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:03:59.612236 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 14:03:59.920929 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:03:59.920879 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-4xds7" event={"ID":"f5b00d31-4f15-45a9-8cf4-a8506d766be0","Type":"ContainerStarted","Data":"ac24f3c805143c7c562344214d86433bc833b9d48361e39025cfe0b10d7865ff"}
Apr 16 14:04:03.931929 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:04:03.931803 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-4xds7" event={"ID":"f5b00d31-4f15-45a9-8cf4-a8506d766be0","Type":"ContainerStarted","Data":"1d2ea3caaae0ae713af830510ec9bb0ae8a5acf469abad3a8d6b58ecc863e297"}
Apr 16 14:04:03.946595 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:04:03.946536 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-4xds7" podStartSLOduration=0.97763288 podStartE2EDuration="4.946514527s" podCreationTimestamp="2026-04-16 14:03:59 +0000 UTC"
firstStartedPulling="2026-04-16 14:03:59.612364612 +0000 UTC m=+403.340418315" lastFinishedPulling="2026-04-16 14:04:03.581246256 +0000 UTC m=+407.309299962" observedRunningTime="2026-04-16 14:04:03.946110131 +0000 UTC m=+407.674163855" watchObservedRunningTime="2026-04-16 14:04:03.946514527 +0000 UTC m=+407.674568255" Apr 16 14:04:53.970157 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:04:53.970122 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm"] Apr 16 14:04:53.973247 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:04:53.973225 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm" Apr 16 14:04:53.975647 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:04:53.975618 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wh6lk\"" Apr 16 14:04:53.975778 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:04:53.975646 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 14:04:53.976635 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:04:53.976613 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 14:04:53.981276 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:04:53.981254 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm"] Apr 16 14:04:54.037946 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:04:54.037886 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7d99400-9823-42a2-b53c-2cbeaa2a0e43-bundle\") pod 
\"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm\" (UID: \"a7d99400-9823-42a2-b53c-2cbeaa2a0e43\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm" Apr 16 14:04:54.038121 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:04:54.037960 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7d99400-9823-42a2-b53c-2cbeaa2a0e43-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm\" (UID: \"a7d99400-9823-42a2-b53c-2cbeaa2a0e43\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm" Apr 16 14:04:54.038121 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:04:54.037984 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8prm\" (UniqueName: \"kubernetes.io/projected/a7d99400-9823-42a2-b53c-2cbeaa2a0e43-kube-api-access-d8prm\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm\" (UID: \"a7d99400-9823-42a2-b53c-2cbeaa2a0e43\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm" Apr 16 14:04:54.139050 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:04:54.138999 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7d99400-9823-42a2-b53c-2cbeaa2a0e43-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm\" (UID: \"a7d99400-9823-42a2-b53c-2cbeaa2a0e43\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm" Apr 16 14:04:54.139228 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:04:54.139071 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7d99400-9823-42a2-b53c-2cbeaa2a0e43-util\") pod 
\"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm\" (UID: \"a7d99400-9823-42a2-b53c-2cbeaa2a0e43\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm" Apr 16 14:04:54.139228 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:04:54.139109 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d8prm\" (UniqueName: \"kubernetes.io/projected/a7d99400-9823-42a2-b53c-2cbeaa2a0e43-kube-api-access-d8prm\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm\" (UID: \"a7d99400-9823-42a2-b53c-2cbeaa2a0e43\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm" Apr 16 14:04:54.139458 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:04:54.139439 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7d99400-9823-42a2-b53c-2cbeaa2a0e43-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm\" (UID: \"a7d99400-9823-42a2-b53c-2cbeaa2a0e43\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm" Apr 16 14:04:54.139494 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:04:54.139461 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7d99400-9823-42a2-b53c-2cbeaa2a0e43-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm\" (UID: \"a7d99400-9823-42a2-b53c-2cbeaa2a0e43\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm" Apr 16 14:04:54.147202 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:04:54.147180 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8prm\" (UniqueName: \"kubernetes.io/projected/a7d99400-9823-42a2-b53c-2cbeaa2a0e43-kube-api-access-d8prm\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm\" (UID: 
\"a7d99400-9823-42a2-b53c-2cbeaa2a0e43\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm" Apr 16 14:04:54.283049 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:04:54.283009 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm" Apr 16 14:04:54.407031 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:04:54.406979 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm"] Apr 16 14:04:54.409564 ip-10-0-131-61 kubenswrapper[2575]: W0416 14:04:54.409536 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7d99400_9823_42a2_b53c_2cbeaa2a0e43.slice/crio-c76ad0decf84a12d654d55c9d9a1de0bdad866d05d2c2c67dced54132a8ae6ce WatchSource:0}: Error finding container c76ad0decf84a12d654d55c9d9a1de0bdad866d05d2c2c67dced54132a8ae6ce: Status 404 returned error can't find the container with id c76ad0decf84a12d654d55c9d9a1de0bdad866d05d2c2c67dced54132a8ae6ce Apr 16 14:04:55.067357 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:04:55.067320 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm" event={"ID":"a7d99400-9823-42a2-b53c-2cbeaa2a0e43","Type":"ContainerStarted","Data":"c76ad0decf84a12d654d55c9d9a1de0bdad866d05d2c2c67dced54132a8ae6ce"} Apr 16 14:05:00.083253 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:00.083214 2575 generic.go:358] "Generic (PLEG): container finished" podID="a7d99400-9823-42a2-b53c-2cbeaa2a0e43" containerID="32d71dd5bb7e2b835e051206235c92caa8da3f54e3e8a112ad0749eb2990c70b" exitCode=0 Apr 16 14:05:00.083658 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:00.083290 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm" event={"ID":"a7d99400-9823-42a2-b53c-2cbeaa2a0e43","Type":"ContainerDied","Data":"32d71dd5bb7e2b835e051206235c92caa8da3f54e3e8a112ad0749eb2990c70b"} Apr 16 14:05:03.094549 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:03.094516 2575 generic.go:358] "Generic (PLEG): container finished" podID="a7d99400-9823-42a2-b53c-2cbeaa2a0e43" containerID="a9f79aa3e800d8b5f6d90a776a6644df5107eb206f6b6ab6a3496cf1ff94be2b" exitCode=0 Apr 16 14:05:03.094937 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:03.094556 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm" event={"ID":"a7d99400-9823-42a2-b53c-2cbeaa2a0e43","Type":"ContainerDied","Data":"a9f79aa3e800d8b5f6d90a776a6644df5107eb206f6b6ab6a3496cf1ff94be2b"} Apr 16 14:05:10.114132 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:10.114088 2575 generic.go:358] "Generic (PLEG): container finished" podID="a7d99400-9823-42a2-b53c-2cbeaa2a0e43" containerID="58a2f8bc0c5c0403cb731e95906c5197abc894955d367316559c36d92915497e" exitCode=0 Apr 16 14:05:10.114505 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:10.114158 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm" event={"ID":"a7d99400-9823-42a2-b53c-2cbeaa2a0e43","Type":"ContainerDied","Data":"58a2f8bc0c5c0403cb731e95906c5197abc894955d367316559c36d92915497e"} Apr 16 14:05:11.238969 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:11.238941 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm" Apr 16 14:05:11.279290 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:11.279253 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7d99400-9823-42a2-b53c-2cbeaa2a0e43-util\") pod \"a7d99400-9823-42a2-b53c-2cbeaa2a0e43\" (UID: \"a7d99400-9823-42a2-b53c-2cbeaa2a0e43\") " Apr 16 14:05:11.279478 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:11.279318 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7d99400-9823-42a2-b53c-2cbeaa2a0e43-bundle\") pod \"a7d99400-9823-42a2-b53c-2cbeaa2a0e43\" (UID: \"a7d99400-9823-42a2-b53c-2cbeaa2a0e43\") " Apr 16 14:05:11.279478 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:11.279342 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8prm\" (UniqueName: \"kubernetes.io/projected/a7d99400-9823-42a2-b53c-2cbeaa2a0e43-kube-api-access-d8prm\") pod \"a7d99400-9823-42a2-b53c-2cbeaa2a0e43\" (UID: \"a7d99400-9823-42a2-b53c-2cbeaa2a0e43\") " Apr 16 14:05:11.279905 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:11.279868 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7d99400-9823-42a2-b53c-2cbeaa2a0e43-bundle" (OuterVolumeSpecName: "bundle") pod "a7d99400-9823-42a2-b53c-2cbeaa2a0e43" (UID: "a7d99400-9823-42a2-b53c-2cbeaa2a0e43"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:05:11.281616 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:11.281588 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7d99400-9823-42a2-b53c-2cbeaa2a0e43-kube-api-access-d8prm" (OuterVolumeSpecName: "kube-api-access-d8prm") pod "a7d99400-9823-42a2-b53c-2cbeaa2a0e43" (UID: "a7d99400-9823-42a2-b53c-2cbeaa2a0e43"). InnerVolumeSpecName "kube-api-access-d8prm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:05:11.283418 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:11.283270 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7d99400-9823-42a2-b53c-2cbeaa2a0e43-util" (OuterVolumeSpecName: "util") pod "a7d99400-9823-42a2-b53c-2cbeaa2a0e43" (UID: "a7d99400-9823-42a2-b53c-2cbeaa2a0e43"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:05:11.380214 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:11.380123 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7d99400-9823-42a2-b53c-2cbeaa2a0e43-util\") on node \"ip-10-0-131-61.ec2.internal\" DevicePath \"\"" Apr 16 14:05:11.380214 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:11.380153 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7d99400-9823-42a2-b53c-2cbeaa2a0e43-bundle\") on node \"ip-10-0-131-61.ec2.internal\" DevicePath \"\"" Apr 16 14:05:11.380214 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:11.380180 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d8prm\" (UniqueName: \"kubernetes.io/projected/a7d99400-9823-42a2-b53c-2cbeaa2a0e43-kube-api-access-d8prm\") on node \"ip-10-0-131-61.ec2.internal\" DevicePath \"\"" Apr 16 14:05:12.121504 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:12.121465 2575 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm" event={"ID":"a7d99400-9823-42a2-b53c-2cbeaa2a0e43","Type":"ContainerDied","Data":"c76ad0decf84a12d654d55c9d9a1de0bdad866d05d2c2c67dced54132a8ae6ce"} Apr 16 14:05:12.121504 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:12.121504 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c76ad0decf84a12d654d55c9d9a1de0bdad866d05d2c2c67dced54132a8ae6ce" Apr 16 14:05:12.121707 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:12.121548 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmzxrm" Apr 16 14:05:21.369792 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:21.369754 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-kbbk2"] Apr 16 14:05:21.370191 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:21.370005 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7d99400-9823-42a2-b53c-2cbeaa2a0e43" containerName="util" Apr 16 14:05:21.370191 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:21.370017 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d99400-9823-42a2-b53c-2cbeaa2a0e43" containerName="util" Apr 16 14:05:21.370191 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:21.370032 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7d99400-9823-42a2-b53c-2cbeaa2a0e43" containerName="pull" Apr 16 14:05:21.370191 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:21.370037 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d99400-9823-42a2-b53c-2cbeaa2a0e43" containerName="pull" Apr 16 14:05:21.370191 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:21.370046 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7d99400-9823-42a2-b53c-2cbeaa2a0e43" 
containerName="extract" Apr 16 14:05:21.370191 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:21.370052 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d99400-9823-42a2-b53c-2cbeaa2a0e43" containerName="extract" Apr 16 14:05:21.370191 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:21.370092 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7d99400-9823-42a2-b53c-2cbeaa2a0e43" containerName="extract" Apr 16 14:05:21.372228 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:21.372212 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-kbbk2" Apr 16 14:05:21.374677 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:21.374653 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 14:05:21.374677 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:21.374676 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 16 14:05:21.374820 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:21.374724 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 14:05:21.375513 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:21.375494 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-xlfs9\"" Apr 16 14:05:21.375612 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:21.375527 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 14:05:21.386545 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:21.386517 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-kbbk2"] Apr 16 14:05:21.454889 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:21.454850 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/53149054-a885-4300-9e33-8bf71f8062dc-certificates\") pod \"keda-admission-cf49989db-kbbk2\" (UID: \"53149054-a885-4300-9e33-8bf71f8062dc\") " pod="openshift-keda/keda-admission-cf49989db-kbbk2" Apr 16 14:05:21.454889 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:21.454914 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh9jx\" (UniqueName: \"kubernetes.io/projected/53149054-a885-4300-9e33-8bf71f8062dc-kube-api-access-rh9jx\") pod \"keda-admission-cf49989db-kbbk2\" (UID: \"53149054-a885-4300-9e33-8bf71f8062dc\") " pod="openshift-keda/keda-admission-cf49989db-kbbk2" Apr 16 14:05:21.556152 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:21.556116 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/53149054-a885-4300-9e33-8bf71f8062dc-certificates\") pod \"keda-admission-cf49989db-kbbk2\" (UID: \"53149054-a885-4300-9e33-8bf71f8062dc\") " pod="openshift-keda/keda-admission-cf49989db-kbbk2" Apr 16 14:05:21.556152 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:21.556156 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rh9jx\" (UniqueName: \"kubernetes.io/projected/53149054-a885-4300-9e33-8bf71f8062dc-kube-api-access-rh9jx\") pod \"keda-admission-cf49989db-kbbk2\" (UID: \"53149054-a885-4300-9e33-8bf71f8062dc\") " pod="openshift-keda/keda-admission-cf49989db-kbbk2" Apr 16 14:05:21.558602 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:21.558578 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/53149054-a885-4300-9e33-8bf71f8062dc-certificates\") pod \"keda-admission-cf49989db-kbbk2\" (UID: \"53149054-a885-4300-9e33-8bf71f8062dc\") " 
pod="openshift-keda/keda-admission-cf49989db-kbbk2" Apr 16 14:05:21.564366 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:21.564340 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh9jx\" (UniqueName: \"kubernetes.io/projected/53149054-a885-4300-9e33-8bf71f8062dc-kube-api-access-rh9jx\") pod \"keda-admission-cf49989db-kbbk2\" (UID: \"53149054-a885-4300-9e33-8bf71f8062dc\") " pod="openshift-keda/keda-admission-cf49989db-kbbk2" Apr 16 14:05:21.682436 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:21.682332 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-kbbk2" Apr 16 14:05:21.804175 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:21.804135 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-kbbk2"] Apr 16 14:05:21.808233 ip-10-0-131-61 kubenswrapper[2575]: W0416 14:05:21.808203 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53149054_a885_4300_9e33_8bf71f8062dc.slice/crio-2544083efd5bbf45eb381c3092494c641d914f2d2ac86f9997bacde6ce7c6736 WatchSource:0}: Error finding container 2544083efd5bbf45eb381c3092494c641d914f2d2ac86f9997bacde6ce7c6736: Status 404 returned error can't find the container with id 2544083efd5bbf45eb381c3092494c641d914f2d2ac86f9997bacde6ce7c6736 Apr 16 14:05:22.148432 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:22.148393 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-kbbk2" event={"ID":"53149054-a885-4300-9e33-8bf71f8062dc","Type":"ContainerStarted","Data":"2544083efd5bbf45eb381c3092494c641d914f2d2ac86f9997bacde6ce7c6736"} Apr 16 14:05:24.155177 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:24.155137 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-kbbk2" 
event={"ID":"53149054-a885-4300-9e33-8bf71f8062dc","Type":"ContainerStarted","Data":"baebf67c8e7493ef6724ba11cdd2242ad44fe522c0f1437fabe3f3ce5bd246e4"} Apr 16 14:05:24.155578 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:24.155261 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-kbbk2" Apr 16 14:05:24.170974 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:24.170930 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-kbbk2" podStartSLOduration=1.710731262 podStartE2EDuration="3.170914821s" podCreationTimestamp="2026-04-16 14:05:21 +0000 UTC" firstStartedPulling="2026-04-16 14:05:21.809606183 +0000 UTC m=+485.537659886" lastFinishedPulling="2026-04-16 14:05:23.269789742 +0000 UTC m=+486.997843445" observedRunningTime="2026-04-16 14:05:24.170313709 +0000 UTC m=+487.898367435" watchObservedRunningTime="2026-04-16 14:05:24.170914821 +0000 UTC m=+487.898968537" Apr 16 14:05:45.159871 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:05:45.159837 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-kbbk2" Apr 16 14:06:27.790836 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:27.790799 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-9bbf58456-62gjk"] Apr 16 14:06:27.792655 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:27.792638 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-9bbf58456-62gjk" Apr 16 14:06:27.795021 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:27.794992 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 14:06:27.795173 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:27.795039 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-rjm99\"" Apr 16 14:06:27.795777 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:27.795758 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 16 14:06:27.795856 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:27.795838 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 14:06:27.801343 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:27.801320 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-8sn5x"] Apr 16 14:06:27.803170 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:27.803151 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8sn5x" Apr 16 14:06:27.805354 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:27.805333 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 14:06:27.805451 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:27.805380 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-7cx8q\"" Apr 16 14:06:27.811495 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:27.811468 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-9bbf58456-62gjk"] Apr 16 14:06:27.814644 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:27.814610 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-8sn5x"] Apr 16 14:06:27.835381 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:27.835345 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbqjv\" (UniqueName: \"kubernetes.io/projected/31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648-kube-api-access-kbqjv\") pod \"kserve-controller-manager-9bbf58456-62gjk\" (UID: \"31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648\") " pod="kserve/kserve-controller-manager-9bbf58456-62gjk" Apr 16 14:06:27.835381 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:27.835384 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6aa3bd24-ef02-4e1f-ad0a-aaa28507a7e1-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-8sn5x\" (UID: \"6aa3bd24-ef02-4e1f-ad0a-aaa28507a7e1\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8sn5x" Apr 16 14:06:27.835586 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:27.835402 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648-cert\") pod \"kserve-controller-manager-9bbf58456-62gjk\" (UID: \"31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648\") " pod="kserve/kserve-controller-manager-9bbf58456-62gjk" Apr 16 14:06:27.835586 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:27.835449 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj6lt\" (UniqueName: \"kubernetes.io/projected/6aa3bd24-ef02-4e1f-ad0a-aaa28507a7e1-kube-api-access-xj6lt\") pod \"llmisvc-controller-manager-68cc5db7c4-8sn5x\" (UID: \"6aa3bd24-ef02-4e1f-ad0a-aaa28507a7e1\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8sn5x" Apr 16 14:06:27.936592 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:27.936548 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbqjv\" (UniqueName: \"kubernetes.io/projected/31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648-kube-api-access-kbqjv\") pod \"kserve-controller-manager-9bbf58456-62gjk\" (UID: \"31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648\") " pod="kserve/kserve-controller-manager-9bbf58456-62gjk" Apr 16 14:06:27.936796 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:27.936608 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6aa3bd24-ef02-4e1f-ad0a-aaa28507a7e1-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-8sn5x\" (UID: \"6aa3bd24-ef02-4e1f-ad0a-aaa28507a7e1\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8sn5x" Apr 16 14:06:27.936796 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:27.936637 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648-cert\") pod \"kserve-controller-manager-9bbf58456-62gjk\" (UID: \"31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648\") " pod="kserve/kserve-controller-manager-9bbf58456-62gjk" Apr 16 14:06:27.936796 ip-10-0-131-61 
kubenswrapper[2575]: I0416 14:06:27.936665 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xj6lt\" (UniqueName: \"kubernetes.io/projected/6aa3bd24-ef02-4e1f-ad0a-aaa28507a7e1-kube-api-access-xj6lt\") pod \"llmisvc-controller-manager-68cc5db7c4-8sn5x\" (UID: \"6aa3bd24-ef02-4e1f-ad0a-aaa28507a7e1\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8sn5x" Apr 16 14:06:27.939142 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:27.939118 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648-cert\") pod \"kserve-controller-manager-9bbf58456-62gjk\" (UID: \"31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648\") " pod="kserve/kserve-controller-manager-9bbf58456-62gjk" Apr 16 14:06:27.939255 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:27.939209 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6aa3bd24-ef02-4e1f-ad0a-aaa28507a7e1-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-8sn5x\" (UID: \"6aa3bd24-ef02-4e1f-ad0a-aaa28507a7e1\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8sn5x" Apr 16 14:06:27.947861 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:27.947824 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbqjv\" (UniqueName: \"kubernetes.io/projected/31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648-kube-api-access-kbqjv\") pod \"kserve-controller-manager-9bbf58456-62gjk\" (UID: \"31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648\") " pod="kserve/kserve-controller-manager-9bbf58456-62gjk" Apr 16 14:06:27.948786 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:27.948758 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj6lt\" (UniqueName: \"kubernetes.io/projected/6aa3bd24-ef02-4e1f-ad0a-aaa28507a7e1-kube-api-access-xj6lt\") pod \"llmisvc-controller-manager-68cc5db7c4-8sn5x\" (UID: 
\"6aa3bd24-ef02-4e1f-ad0a-aaa28507a7e1\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8sn5x" Apr 16 14:06:28.104240 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:28.104139 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-9bbf58456-62gjk" Apr 16 14:06:28.113196 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:28.113170 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8sn5x" Apr 16 14:06:28.236610 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:28.236564 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-9bbf58456-62gjk"] Apr 16 14:06:28.240558 ip-10-0-131-61 kubenswrapper[2575]: W0416 14:06:28.240522 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31a2ee1f_25bf_4ec8_8dd9_ddca7a14c648.slice/crio-b9c2a470fd8f9be95c6ef27891d9016f37225b8c3bb66d09e1531fbebfe1ea9b WatchSource:0}: Error finding container b9c2a470fd8f9be95c6ef27891d9016f37225b8c3bb66d09e1531fbebfe1ea9b: Status 404 returned error can't find the container with id b9c2a470fd8f9be95c6ef27891d9016f37225b8c3bb66d09e1531fbebfe1ea9b Apr 16 14:06:28.257139 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:28.257108 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-8sn5x"] Apr 16 14:06:28.260254 ip-10-0-131-61 kubenswrapper[2575]: W0416 14:06:28.260223 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6aa3bd24_ef02_4e1f_ad0a_aaa28507a7e1.slice/crio-a72eacd2d748e8f5c10f2bee8dc0fe1efd68d0fac32e3f9a83fc06c954d749c6 WatchSource:0}: Error finding container a72eacd2d748e8f5c10f2bee8dc0fe1efd68d0fac32e3f9a83fc06c954d749c6: Status 404 returned error can't find the container with id a72eacd2d748e8f5c10f2bee8dc0fe1efd68d0fac32e3f9a83fc06c954d749c6 Apr 16 
14:06:28.323750 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:28.323705 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8sn5x" event={"ID":"6aa3bd24-ef02-4e1f-ad0a-aaa28507a7e1","Type":"ContainerStarted","Data":"a72eacd2d748e8f5c10f2bee8dc0fe1efd68d0fac32e3f9a83fc06c954d749c6"} Apr 16 14:06:28.324792 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:28.324770 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-9bbf58456-62gjk" event={"ID":"31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648","Type":"ContainerStarted","Data":"b9c2a470fd8f9be95c6ef27891d9016f37225b8c3bb66d09e1531fbebfe1ea9b"} Apr 16 14:06:32.338655 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:32.338612 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8sn5x" event={"ID":"6aa3bd24-ef02-4e1f-ad0a-aaa28507a7e1","Type":"ContainerStarted","Data":"fe9cf2720d7acdb97a93782a47c85f7a062b0b00d32a0d79a4894d0dbf75716f"} Apr 16 14:06:32.339119 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:32.338687 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8sn5x" Apr 16 14:06:32.340073 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:32.340046 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-9bbf58456-62gjk" event={"ID":"31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648","Type":"ContainerStarted","Data":"5d6d59f19687498130eff30d09627e007fccca92e1ce85a0baab28571e2d574f"} Apr 16 14:06:32.340180 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:32.340168 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-9bbf58456-62gjk" Apr 16 14:06:32.360270 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:06:32.360211 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/llmisvc-controller-manager-68cc5db7c4-8sn5x" podStartSLOduration=2.205612062 podStartE2EDuration="5.360192231s" podCreationTimestamp="2026-04-16 14:06:27 +0000 UTC" firstStartedPulling="2026-04-16 14:06:28.261485577 +0000 UTC m=+551.989539280" lastFinishedPulling="2026-04-16 14:06:31.416065746 +0000 UTC m=+555.144119449" observedRunningTime="2026-04-16 14:06:32.358960071 +0000 UTC m=+556.087013797" watchObservedRunningTime="2026-04-16 14:06:32.360192231 +0000 UTC m=+556.088245957" Apr 16 14:07:03.349002 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:03.348961 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8sn5x" Apr 16 14:07:03.351990 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:03.351966 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-9bbf58456-62gjk" Apr 16 14:07:03.367435 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:03.367375 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-9bbf58456-62gjk" podStartSLOduration=33.139658459 podStartE2EDuration="36.367359537s" podCreationTimestamp="2026-04-16 14:06:27 +0000 UTC" firstStartedPulling="2026-04-16 14:06:28.24183211 +0000 UTC m=+551.969885816" lastFinishedPulling="2026-04-16 14:06:31.46953317 +0000 UTC m=+555.197586894" observedRunningTime="2026-04-16 14:06:32.377120526 +0000 UTC m=+556.105174250" watchObservedRunningTime="2026-04-16 14:07:03.367359537 +0000 UTC m=+587.095413262" Apr 16 14:07:04.647133 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:04.647097 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-9bbf58456-62gjk"] Apr 16 14:07:04.647545 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:04.647343 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-9bbf58456-62gjk" 
podUID="31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648" containerName="manager" containerID="cri-o://5d6d59f19687498130eff30d09627e007fccca92e1ce85a0baab28571e2d574f" gracePeriod=10 Apr 16 14:07:04.671462 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:04.671420 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-9bbf58456-b5rj7"] Apr 16 14:07:04.679846 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:04.679823 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-9bbf58456-b5rj7" Apr 16 14:07:04.684058 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:04.684028 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-9bbf58456-b5rj7"] Apr 16 14:07:04.819365 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:04.819337 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8abc3003-95c6-478b-8564-80343be5671a-cert\") pod \"kserve-controller-manager-9bbf58456-b5rj7\" (UID: \"8abc3003-95c6-478b-8564-80343be5671a\") " pod="kserve/kserve-controller-manager-9bbf58456-b5rj7" Apr 16 14:07:04.819504 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:04.819458 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fslj\" (UniqueName: \"kubernetes.io/projected/8abc3003-95c6-478b-8564-80343be5671a-kube-api-access-5fslj\") pod \"kserve-controller-manager-9bbf58456-b5rj7\" (UID: \"8abc3003-95c6-478b-8564-80343be5671a\") " pod="kserve/kserve-controller-manager-9bbf58456-b5rj7" Apr 16 14:07:04.894013 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:04.893987 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-9bbf58456-62gjk" Apr 16 14:07:04.920000 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:04.919876 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8abc3003-95c6-478b-8564-80343be5671a-cert\") pod \"kserve-controller-manager-9bbf58456-b5rj7\" (UID: \"8abc3003-95c6-478b-8564-80343be5671a\") " pod="kserve/kserve-controller-manager-9bbf58456-b5rj7" Apr 16 14:07:04.920219 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:04.920024 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fslj\" (UniqueName: \"kubernetes.io/projected/8abc3003-95c6-478b-8564-80343be5671a-kube-api-access-5fslj\") pod \"kserve-controller-manager-9bbf58456-b5rj7\" (UID: \"8abc3003-95c6-478b-8564-80343be5671a\") " pod="kserve/kserve-controller-manager-9bbf58456-b5rj7" Apr 16 14:07:04.922934 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:04.922883 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8abc3003-95c6-478b-8564-80343be5671a-cert\") pod \"kserve-controller-manager-9bbf58456-b5rj7\" (UID: \"8abc3003-95c6-478b-8564-80343be5671a\") " pod="kserve/kserve-controller-manager-9bbf58456-b5rj7" Apr 16 14:07:04.927888 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:04.927858 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fslj\" (UniqueName: \"kubernetes.io/projected/8abc3003-95c6-478b-8564-80343be5671a-kube-api-access-5fslj\") pod \"kserve-controller-manager-9bbf58456-b5rj7\" (UID: \"8abc3003-95c6-478b-8564-80343be5671a\") " pod="kserve/kserve-controller-manager-9bbf58456-b5rj7" Apr 16 14:07:05.021168 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:05.021129 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbqjv\" (UniqueName: 
\"kubernetes.io/projected/31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648-kube-api-access-kbqjv\") pod \"31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648\" (UID: \"31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648\") " Apr 16 14:07:05.021333 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:05.021190 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648-cert\") pod \"31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648\" (UID: \"31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648\") " Apr 16 14:07:05.022665 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:05.022625 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-9bbf58456-b5rj7" Apr 16 14:07:05.023548 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:05.023521 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648-cert" (OuterVolumeSpecName: "cert") pod "31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648" (UID: "31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:07:05.023620 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:05.023550 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648-kube-api-access-kbqjv" (OuterVolumeSpecName: "kube-api-access-kbqjv") pod "31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648" (UID: "31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648"). InnerVolumeSpecName "kube-api-access-kbqjv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:07:05.122216 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:05.122178 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kbqjv\" (UniqueName: \"kubernetes.io/projected/31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648-kube-api-access-kbqjv\") on node \"ip-10-0-131-61.ec2.internal\" DevicePath \"\"" Apr 16 14:07:05.122216 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:05.122207 2575 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648-cert\") on node \"ip-10-0-131-61.ec2.internal\" DevicePath \"\"" Apr 16 14:07:05.139932 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:05.139873 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-9bbf58456-b5rj7"] Apr 16 14:07:05.143232 ip-10-0-131-61 kubenswrapper[2575]: W0416 14:07:05.143201 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8abc3003_95c6_478b_8564_80343be5671a.slice/crio-2bf33a256ad25653f5f1d98ae11ac4a49b856d6f5f03cde61a3aa85d7faa6f2a WatchSource:0}: Error finding container 2bf33a256ad25653f5f1d98ae11ac4a49b856d6f5f03cde61a3aa85d7faa6f2a: Status 404 returned error can't find the container with id 2bf33a256ad25653f5f1d98ae11ac4a49b856d6f5f03cde61a3aa85d7faa6f2a Apr 16 14:07:05.430985 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:05.430951 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-9bbf58456-b5rj7" event={"ID":"8abc3003-95c6-478b-8564-80343be5671a","Type":"ContainerStarted","Data":"2bf33a256ad25653f5f1d98ae11ac4a49b856d6f5f03cde61a3aa85d7faa6f2a"} Apr 16 14:07:05.432166 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:05.432137 2575 generic.go:358] "Generic (PLEG): container finished" podID="31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648" 
containerID="5d6d59f19687498130eff30d09627e007fccca92e1ce85a0baab28571e2d574f" exitCode=0 Apr 16 14:07:05.432275 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:05.432210 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-9bbf58456-62gjk" Apr 16 14:07:05.432275 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:05.432222 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-9bbf58456-62gjk" event={"ID":"31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648","Type":"ContainerDied","Data":"5d6d59f19687498130eff30d09627e007fccca92e1ce85a0baab28571e2d574f"} Apr 16 14:07:05.432275 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:05.432256 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-9bbf58456-62gjk" event={"ID":"31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648","Type":"ContainerDied","Data":"b9c2a470fd8f9be95c6ef27891d9016f37225b8c3bb66d09e1531fbebfe1ea9b"} Apr 16 14:07:05.432397 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:05.432278 2575 scope.go:117] "RemoveContainer" containerID="5d6d59f19687498130eff30d09627e007fccca92e1ce85a0baab28571e2d574f" Apr 16 14:07:05.440653 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:05.440631 2575 scope.go:117] "RemoveContainer" containerID="5d6d59f19687498130eff30d09627e007fccca92e1ce85a0baab28571e2d574f" Apr 16 14:07:05.440986 ip-10-0-131-61 kubenswrapper[2575]: E0416 14:07:05.440956 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d6d59f19687498130eff30d09627e007fccca92e1ce85a0baab28571e2d574f\": container with ID starting with 5d6d59f19687498130eff30d09627e007fccca92e1ce85a0baab28571e2d574f not found: ID does not exist" containerID="5d6d59f19687498130eff30d09627e007fccca92e1ce85a0baab28571e2d574f" Apr 16 14:07:05.441083 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:05.440998 2575 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"5d6d59f19687498130eff30d09627e007fccca92e1ce85a0baab28571e2d574f"} err="failed to get container status \"5d6d59f19687498130eff30d09627e007fccca92e1ce85a0baab28571e2d574f\": rpc error: code = NotFound desc = could not find container \"5d6d59f19687498130eff30d09627e007fccca92e1ce85a0baab28571e2d574f\": container with ID starting with 5d6d59f19687498130eff30d09627e007fccca92e1ce85a0baab28571e2d574f not found: ID does not exist" Apr 16 14:07:05.453846 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:05.453816 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-9bbf58456-62gjk"] Apr 16 14:07:05.455321 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:05.455297 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-9bbf58456-62gjk"] Apr 16 14:07:06.435998 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:06.435964 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-9bbf58456-b5rj7" event={"ID":"8abc3003-95c6-478b-8564-80343be5671a","Type":"ContainerStarted","Data":"240aa4d0e7213260fe1d82f98b3380d18ecafd1cf7f423f70757c752130cc6a1"} Apr 16 14:07:06.436450 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:06.436071 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-9bbf58456-b5rj7" Apr 16 14:07:06.453053 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:06.452994 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-9bbf58456-b5rj7" podStartSLOduration=1.867625815 podStartE2EDuration="2.452977057s" podCreationTimestamp="2026-04-16 14:07:04 +0000 UTC" firstStartedPulling="2026-04-16 14:07:05.14446364 +0000 UTC m=+588.872517343" lastFinishedPulling="2026-04-16 14:07:05.729814881 +0000 UTC m=+589.457868585" observedRunningTime="2026-04-16 14:07:06.451335547 +0000 UTC m=+590.179389269" 
watchObservedRunningTime="2026-04-16 14:07:06.452977057 +0000 UTC m=+590.181030760" Apr 16 14:07:06.853476 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:06.853444 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648" path="/var/lib/kubelet/pods/31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648/volumes" Apr 16 14:07:16.766297 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:16.766268 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7c7s_ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff/ovn-acl-logging/0.log" Apr 16 14:07:16.766677 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:16.766406 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7c7s_ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff/ovn-acl-logging/0.log" Apr 16 14:07:37.444687 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:07:37.444609 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-9bbf58456-b5rj7" Apr 16 14:12:16.783739 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:12:16.783659 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7c7s_ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff/ovn-acl-logging/0.log" Apr 16 14:12:16.784756 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:12:16.784719 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7c7s_ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff/ovn-acl-logging/0.log" Apr 16 14:17:16.803584 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:17:16.803555 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7c7s_ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff/ovn-acl-logging/0.log" Apr 16 14:17:16.804310 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:17:16.804180 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7c7s_ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff/ovn-acl-logging/0.log" Apr 16 14:22:16.826232 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:16.826204 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7c7s_ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff/ovn-acl-logging/0.log" Apr 16 14:22:16.827915 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:16.827883 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7c7s_ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff/ovn-acl-logging/0.log" Apr 16 14:22:23.122593 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:23.122556 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-4xds7_f5b00d31-4f15-45a9-8cf4-a8506d766be0/global-pull-secret-syncer/0.log" Apr 16 14:22:23.270177 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:23.270142 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-6r2ft_7e94555a-67da-4de5-ace1-024c7384ada8/konnectivity-agent/0.log" Apr 16 14:22:23.347640 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:23.347611 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-61.ec2.internal_aeb1bbb91159af60fd67f2df28938806/haproxy/0.log" Apr 16 14:22:27.486614 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:27.486583 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-6dd6c4dfc8-pxgnk_3adc34d0-b3d2-4162-a395-be904e0f2cbe/metrics-server/0.log" Apr 16 14:22:27.715208 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:27.715170 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ppk6r_1de51b5e-b280-4669-ae37-f4318fdfda79/node-exporter/0.log" Apr 16 14:22:27.740970 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:27.740864 2575 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-monitoring_node-exporter-ppk6r_1de51b5e-b280-4669-ae37-f4318fdfda79/kube-rbac-proxy/0.log" Apr 16 14:22:27.767966 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:27.767933 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ppk6r_1de51b5e-b280-4669-ae37-f4318fdfda79/init-textfile/0.log" Apr 16 14:22:30.484930 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:30.484827 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-s6bn5/perf-node-gather-daemonset-mc4zb"] Apr 16 14:22:30.485281 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:30.485127 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648" containerName="manager" Apr 16 14:22:30.485281 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:30.485141 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648" containerName="manager" Apr 16 14:22:30.485281 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:30.485206 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="31a2ee1f-25bf-4ec8-8dd9-ddca7a14c648" containerName="manager" Apr 16 14:22:30.487948 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:30.487920 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-mc4zb" Apr 16 14:22:30.490164 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:30.490140 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-s6bn5\"/\"openshift-service-ca.crt\"" Apr 16 14:22:30.490862 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:30.490847 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-s6bn5\"/\"kube-root-ca.crt\"" Apr 16 14:22:30.490925 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:30.490865 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-s6bn5\"/\"default-dockercfg-nnrnd\"" Apr 16 14:22:30.498088 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:30.498052 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-s6bn5/perf-node-gather-daemonset-mc4zb"] Apr 16 14:22:30.591881 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:30.591841 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e9813146-664f-4afc-a26a-0be610feb9af-lib-modules\") pod \"perf-node-gather-daemonset-mc4zb\" (UID: \"e9813146-664f-4afc-a26a-0be610feb9af\") " pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-mc4zb" Apr 16 14:22:30.591881 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:30.591879 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e9813146-664f-4afc-a26a-0be610feb9af-proc\") pod \"perf-node-gather-daemonset-mc4zb\" (UID: \"e9813146-664f-4afc-a26a-0be610feb9af\") " pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-mc4zb" Apr 16 14:22:30.592109 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:30.591919 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-bfj5m\" (UniqueName: \"kubernetes.io/projected/e9813146-664f-4afc-a26a-0be610feb9af-kube-api-access-bfj5m\") pod \"perf-node-gather-daemonset-mc4zb\" (UID: \"e9813146-664f-4afc-a26a-0be610feb9af\") " pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-mc4zb" Apr 16 14:22:30.592109 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:30.591938 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e9813146-664f-4afc-a26a-0be610feb9af-podres\") pod \"perf-node-gather-daemonset-mc4zb\" (UID: \"e9813146-664f-4afc-a26a-0be610feb9af\") " pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-mc4zb" Apr 16 14:22:30.592109 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:30.591971 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e9813146-664f-4afc-a26a-0be610feb9af-sys\") pod \"perf-node-gather-daemonset-mc4zb\" (UID: \"e9813146-664f-4afc-a26a-0be610feb9af\") " pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-mc4zb" Apr 16 14:22:30.692711 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:30.692668 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e9813146-664f-4afc-a26a-0be610feb9af-lib-modules\") pod \"perf-node-gather-daemonset-mc4zb\" (UID: \"e9813146-664f-4afc-a26a-0be610feb9af\") " pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-mc4zb" Apr 16 14:22:30.692711 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:30.692711 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e9813146-664f-4afc-a26a-0be610feb9af-proc\") pod \"perf-node-gather-daemonset-mc4zb\" (UID: \"e9813146-664f-4afc-a26a-0be610feb9af\") " 
pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-mc4zb" Apr 16 14:22:30.692953 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:30.692729 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bfj5m\" (UniqueName: \"kubernetes.io/projected/e9813146-664f-4afc-a26a-0be610feb9af-kube-api-access-bfj5m\") pod \"perf-node-gather-daemonset-mc4zb\" (UID: \"e9813146-664f-4afc-a26a-0be610feb9af\") " pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-mc4zb" Apr 16 14:22:30.692953 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:30.692747 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e9813146-664f-4afc-a26a-0be610feb9af-podres\") pod \"perf-node-gather-daemonset-mc4zb\" (UID: \"e9813146-664f-4afc-a26a-0be610feb9af\") " pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-mc4zb" Apr 16 14:22:30.692953 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:30.692796 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e9813146-664f-4afc-a26a-0be610feb9af-sys\") pod \"perf-node-gather-daemonset-mc4zb\" (UID: \"e9813146-664f-4afc-a26a-0be610feb9af\") " pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-mc4zb" Apr 16 14:22:30.692953 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:30.692813 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e9813146-664f-4afc-a26a-0be610feb9af-proc\") pod \"perf-node-gather-daemonset-mc4zb\" (UID: \"e9813146-664f-4afc-a26a-0be610feb9af\") " pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-mc4zb" Apr 16 14:22:30.692953 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:30.692845 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/e9813146-664f-4afc-a26a-0be610feb9af-lib-modules\") pod \"perf-node-gather-daemonset-mc4zb\" (UID: \"e9813146-664f-4afc-a26a-0be610feb9af\") " pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-mc4zb" Apr 16 14:22:30.692953 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:30.692880 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e9813146-664f-4afc-a26a-0be610feb9af-sys\") pod \"perf-node-gather-daemonset-mc4zb\" (UID: \"e9813146-664f-4afc-a26a-0be610feb9af\") " pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-mc4zb" Apr 16 14:22:30.693200 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:30.692968 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e9813146-664f-4afc-a26a-0be610feb9af-podres\") pod \"perf-node-gather-daemonset-mc4zb\" (UID: \"e9813146-664f-4afc-a26a-0be610feb9af\") " pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-mc4zb" Apr 16 14:22:30.703547 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:30.703512 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfj5m\" (UniqueName: \"kubernetes.io/projected/e9813146-664f-4afc-a26a-0be610feb9af-kube-api-access-bfj5m\") pod \"perf-node-gather-daemonset-mc4zb\" (UID: \"e9813146-664f-4afc-a26a-0be610feb9af\") " pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-mc4zb" Apr 16 14:22:30.797595 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:30.797567 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-mc4zb" Apr 16 14:22:30.913669 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:30.913551 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-s6bn5/perf-node-gather-daemonset-mc4zb"] Apr 16 14:22:30.916324 ip-10-0-131-61 kubenswrapper[2575]: W0416 14:22:30.916295 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode9813146_664f_4afc_a26a_0be610feb9af.slice/crio-562ab06d08fe67f7cc1715f78363ad87ca44d405264d7c1aca2921c17c821008 WatchSource:0}: Error finding container 562ab06d08fe67f7cc1715f78363ad87ca44d405264d7c1aca2921c17c821008: Status 404 returned error can't find the container with id 562ab06d08fe67f7cc1715f78363ad87ca44d405264d7c1aca2921c17c821008 Apr 16 14:22:30.917816 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:30.917799 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:22:30.942801 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:30.942771 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-mc4zb" event={"ID":"e9813146-664f-4afc-a26a-0be610feb9af","Type":"ContainerStarted","Data":"562ab06d08fe67f7cc1715f78363ad87ca44d405264d7c1aca2921c17c821008"} Apr 16 14:22:31.383592 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:31.383564 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6fksw_69050bd1-ea1f-49d8-8ca8-617d41938670/dns/0.log" Apr 16 14:22:31.407164 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:31.407135 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6fksw_69050bd1-ea1f-49d8-8ca8-617d41938670/kube-rbac-proxy/0.log" Apr 16 14:22:31.539689 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:31.539657 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_node-resolver-h7pm6_af33c15c-6386-4fe7-9155-a5c6b6e05ec4/dns-node-resolver/0.log" Apr 16 14:22:31.947252 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:31.947218 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-mc4zb" event={"ID":"e9813146-664f-4afc-a26a-0be610feb9af","Type":"ContainerStarted","Data":"d3c223f0e435beb3fb708a5cd397e4558c1df30ff615861292a225735e4a5d34"} Apr 16 14:22:31.947436 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:31.947312 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-mc4zb" Apr 16 14:22:31.963447 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:31.963400 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-mc4zb" podStartSLOduration=1.963381909 podStartE2EDuration="1.963381909s" podCreationTimestamp="2026-04-16 14:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:22:31.961560245 +0000 UTC m=+1515.689613969" watchObservedRunningTime="2026-04-16 14:22:31.963381909 +0000 UTC m=+1515.691435691" Apr 16 14:22:32.012554 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:32.012513 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-567fc49968-7hrgm_6cc8cd19-467f-45a6-aedd-568b2ea53b3d/registry/0.log" Apr 16 14:22:32.059442 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:32.059416 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9tz2x_03875a5c-704e-42af-9ea6-ba5a9f181d94/node-ca/0.log" Apr 16 14:22:33.150599 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:33.150571 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-xdwfc_adf72fd1-6177-4cfa-a1c8-0b8bb22dba4d/serve-healthcheck-canary/0.log" Apr 16 14:22:33.637534 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:33.637508 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-slzqv_b80ca789-264f-4eb3-8fea-ea24dbc639cd/kube-rbac-proxy/0.log" Apr 16 14:22:33.661020 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:33.660991 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-slzqv_b80ca789-264f-4eb3-8fea-ea24dbc639cd/exporter/0.log" Apr 16 14:22:33.689328 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:33.689300 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-slzqv_b80ca789-264f-4eb3-8fea-ea24dbc639cd/extractor/0.log" Apr 16 14:22:35.675529 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:35.675498 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-9bbf58456-b5rj7_8abc3003-95c6-478b-8564-80343be5671a/manager/0.log" Apr 16 14:22:35.696780 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:35.696756 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-8sn5x_6aa3bd24-ef02-4e1f-ad0a-aaa28507a7e1/manager/0.log" Apr 16 14:22:37.958739 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:37.958713 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-s6bn5/perf-node-gather-daemonset-mc4zb" Apr 16 14:22:41.231787 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:41.231757 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4wgkv_0675f452-3368-4384-83f9-0c6a166ba947/kube-multus/0.log" Apr 16 14:22:41.289411 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:41.289379 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mj78x_9d11af49-8358-49ac-ac63-a39ae3da3f3c/kube-multus-additional-cni-plugins/0.log" Apr 16 14:22:41.327708 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:41.327677 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mj78x_9d11af49-8358-49ac-ac63-a39ae3da3f3c/egress-router-binary-copy/0.log" Apr 16 14:22:41.351683 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:41.351651 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mj78x_9d11af49-8358-49ac-ac63-a39ae3da3f3c/cni-plugins/0.log" Apr 16 14:22:41.375294 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:41.375267 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mj78x_9d11af49-8358-49ac-ac63-a39ae3da3f3c/bond-cni-plugin/0.log" Apr 16 14:22:41.404398 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:41.404357 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mj78x_9d11af49-8358-49ac-ac63-a39ae3da3f3c/routeoverride-cni/0.log" Apr 16 14:22:41.429056 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:41.429030 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mj78x_9d11af49-8358-49ac-ac63-a39ae3da3f3c/whereabouts-cni-bincopy/0.log" Apr 16 14:22:41.454059 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:41.453991 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mj78x_9d11af49-8358-49ac-ac63-a39ae3da3f3c/whereabouts-cni/0.log" Apr 16 14:22:42.016295 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:42.016267 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-l5zhh_9ab97e35-4539-405d-bb91-f30c906963c2/network-metrics-daemon/0.log" Apr 16 14:22:42.045845 
ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:42.045816 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-l5zhh_9ab97e35-4539-405d-bb91-f30c906963c2/kube-rbac-proxy/0.log" Apr 16 14:22:43.647308 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:43.647276 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7c7s_ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff/ovn-controller/0.log" Apr 16 14:22:43.675632 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:43.675604 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7c7s_ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff/ovn-acl-logging/0.log" Apr 16 14:22:43.682516 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:43.682494 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7c7s_ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff/ovn-acl-logging/1.log" Apr 16 14:22:43.702236 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:43.702205 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7c7s_ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff/kube-rbac-proxy-node/0.log" Apr 16 14:22:43.730061 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:43.730029 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7c7s_ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 14:22:43.750248 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:43.750224 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7c7s_ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff/northd/0.log" Apr 16 14:22:43.780484 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:43.780459 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7c7s_ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff/nbdb/0.log" Apr 16 14:22:43.806029 
ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:43.806007 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7c7s_ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff/sbdb/0.log" Apr 16 14:22:43.905438 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:43.905358 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w7c7s_ed8d29e9-b9f2-47f1-9d73-e5c3d0d2cdff/ovnkube-controller/0.log" Apr 16 14:22:44.965734 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:44.965705 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-5zjkj_6e53c142-6f4e-4358-a390-6d3c43558ef6/network-check-target-container/0.log" Apr 16 14:22:46.070685 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:46.070658 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-jzfpd_5837407c-436b-4756-9677-8c40ff8e9059/iptables-alerter/0.log" Apr 16 14:22:46.875408 ip-10-0-131-61 kubenswrapper[2575]: I0416 14:22:46.875375 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-z8t6x_985ce2bb-b7a7-4b0c-893a-1840235f7653/tuned/0.log"