Apr 22 17:51:16.576053 ip-10-0-135-21 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 17:51:16.576062 ip-10-0-135-21 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 17:51:16.576069 ip-10-0-135-21 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 17:51:16.576345 ip-10-0-135-21 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 17:51:26.604601 ip-10-0-135-21 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 17:51:26.604616 ip-10-0-135-21 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot bd4674f1259c47c687eb61e2b0d26f80 --
Apr 22 17:53:51.898838 ip-10-0-135-21 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 17:53:52.413172 ip-10-0-135-21 kubenswrapper[2527]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:53:52.413172 ip-10-0-135-21 kubenswrapper[2527]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 17:53:52.413172 ip-10-0-135-21 kubenswrapper[2527]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:53:52.414232 ip-10-0-135-21 kubenswrapper[2527]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 17:53:52.414232 ip-10-0-135-21 kubenswrapper[2527]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:53:52.414734 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.414553 2527 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 17:53:52.417820 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417804 2527 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:53:52.417820 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417820 2527 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:53:52.417883 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417823 2527 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:53:52.417883 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417828 2527 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:53:52.417883 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417831 2527 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:53:52.417883 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417835 2527 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:53:52.417883 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417838 2527 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:53:52.417883 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417841 2527 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:53:52.417883 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417844 2527 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:53:52.417883 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417847 2527 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:53:52.417883 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417850 2527 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:53:52.417883 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417853 2527 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:53:52.417883 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417855 2527 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:53:52.417883 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417858 2527 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:53:52.417883 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417861 2527 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:53:52.417883 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417863 2527 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:53:52.417883 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417866 2527 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:53:52.417883 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417868 2527 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:53:52.417883 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417876 2527 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:53:52.417883 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417880 2527 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:53:52.417883 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417882 2527 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:53:52.418330 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417885 2527 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:53:52.418330 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417888 2527 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:53:52.418330 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417891 2527 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:53:52.418330 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417894 2527 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:53:52.418330 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417897 2527 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:53:52.418330 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417899 2527 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:53:52.418330 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417902 2527 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:53:52.418330 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417906 2527 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:53:52.418330 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417910 2527 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:53:52.418330 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417913 2527 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:53:52.418330 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417915 2527 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:53:52.418330 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417918 2527 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:53:52.418330 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417920 2527 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:53:52.418330 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417923 2527 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:53:52.418330 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417925 2527 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:53:52.418330 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417928 2527 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:53:52.418330 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417931 2527 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:53:52.418330 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417933 2527 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:53:52.418330 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417936 2527 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:53:52.418330 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417939 2527 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:53:52.418867 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417942 2527 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:53:52.418867 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417945 2527 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:53:52.418867 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417949 2527 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:53:52.418867 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417951 2527 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:53:52.418867 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417954 2527 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:53:52.418867 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417956 2527 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:53:52.418867 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417960 2527 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:53:52.418867 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417963 2527 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:53:52.418867 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417965 2527 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:53:52.418867 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417968 2527 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:53:52.418867 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417972 2527 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:53:52.418867 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417976 2527 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:53:52.418867 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417979 2527 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:53:52.418867 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417981 2527 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:53:52.418867 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417984 2527 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:53:52.418867 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417987 2527 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:53:52.418867 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417989 2527 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:53:52.418867 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417992 2527 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:53:52.418867 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417995 2527 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:53:52.419327 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.417997 2527 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:53:52.419327 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.418000 2527 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:53:52.419327 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.418003 2527 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:53:52.419327 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.418005 2527 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:53:52.419327 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.418008 2527 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:53:52.419327 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.418011 2527 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:53:52.419327 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.418015 2527 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:53:52.419327 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.418018 2527 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:53:52.419327 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.418021 2527 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:53:52.419327 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.418023 2527 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:53:52.419327 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.418026 2527 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:53:52.419327 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.418029 2527 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:53:52.419327 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.418031 2527 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:53:52.419327 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.418034 2527 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:53:52.419327 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.418037 2527 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:53:52.419327 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.418039 2527 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:53:52.419327 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.418042 2527 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:53:52.419327 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.418045 2527 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:53:52.419327 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.418047 2527 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:53:52.419327 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.418050 2527 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:53:52.419835 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.418052 2527 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:53:52.419835 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.418055 2527 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:53:52.419835 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.418057 2527 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:53:52.419835 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.418060 2527 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:53:52.419835 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.418063 2527 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:53:52.419835 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.418066 2527 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:53:52.419835 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419314 2527 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:53:52.419835 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419321 2527 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:53:52.419835 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419324 2527 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:53:52.419835 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419327 2527 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:53:52.419835 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419331 2527 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:53:52.419835 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419334 2527 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:53:52.419835 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419337 2527 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:53:52.419835 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419340 2527 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:53:52.419835 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419343 2527 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:53:52.419835 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419346 2527 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:53:52.419835 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419348 2527 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:53:52.419835 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419351 2527 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:53:52.419835 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419355 2527 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:53:52.419835 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419357 2527 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:53:52.420310 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419360 2527 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:53:52.420310 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419363 2527 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:53:52.420310 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419366 2527 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:53:52.420310 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419368 2527 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:53:52.420310 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419371 2527 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:53:52.420310 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419373 2527 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:53:52.420310 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419376 2527 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:53:52.420310 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419379 2527 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:53:52.420310 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419381 2527 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:53:52.420310 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419383 2527 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:53:52.420310 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419386 2527 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:53:52.420310 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419388 2527 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:53:52.420310 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419391 2527 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:53:52.420310 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419394 2527 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:53:52.420310 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419396 2527 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:53:52.420310 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419399 2527 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:53:52.420310 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419401 2527 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:53:52.420310 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419404 2527 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:53:52.420310 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419407 2527 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:53:52.420825 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419410 2527 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:53:52.420825 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419413 2527 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:53:52.420825 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419415 2527 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:53:52.420825 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419418 2527 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:53:52.420825 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419421 2527 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:53:52.420825 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419423 2527 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:53:52.420825 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419425 2527 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:53:52.420825 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419428 2527 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:53:52.420825 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419430 2527 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:53:52.420825 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419433 2527 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:53:52.420825 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419435 2527 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:53:52.420825 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419438 2527 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:53:52.420825 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419441 2527 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:53:52.420825 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419444 2527 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:53:52.420825 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419446 2527 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:53:52.420825 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419448 2527 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:53:52.420825 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419451 2527 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:53:52.420825 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419453 2527 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:53:52.420825 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419457 2527 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:53:52.420825 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419459 2527 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:53:52.421308 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419462 2527 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:53:52.421308 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419464 2527 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:53:52.421308 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419467 2527 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:53:52.421308 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419470 2527 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:53:52.421308 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419474 2527 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:53:52.421308 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419476 2527 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:53:52.421308 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419479 2527 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:53:52.421308 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419482 2527 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:53:52.421308 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419484 2527 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:53:52.421308 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419487 2527 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:53:52.421308 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419491 2527 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:53:52.421308 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419493 2527 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:53:52.421308 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419496 2527 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:53:52.421308 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419498 2527 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:53:52.421308 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419501 2527 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:53:52.421308 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419503 2527 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:53:52.421308 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419506 2527 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:53:52.421308 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419508 2527 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:53:52.421308 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419511 2527 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:53:52.421308 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419513 2527 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:53:52.421821 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419516 2527 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:53:52.421821 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419519 2527 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:53:52.421821 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419522 2527 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:53:52.421821 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419527 2527 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:53:52.421821 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419530 2527 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:53:52.421821 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419533 2527 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:53:52.421821 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419536 2527 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:53:52.421821 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419539 2527 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:53:52.421821 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419541 2527 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:53:52.421821 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419544 2527 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:53:52.421821 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419546 2527 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:53:52.421821 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419549 2527 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:53:52.421821 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.419552 2527 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:53:52.421821 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419634 2527 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 17:53:52.421821 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419642 2527 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 17:53:52.421821 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419649 2527 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 17:53:52.421821 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419654 2527 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 17:53:52.421821 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419658 2527 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 17:53:52.421821 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419662 2527 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 17:53:52.421821 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419667 2527 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 17:53:52.421821 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419671 2527 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 17:53:52.422325 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419675 2527 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 17:53:52.422325 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419678 2527 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 17:53:52.422325 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419682 2527 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 17:53:52.422325 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419685 2527 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 17:53:52.422325 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419688 2527 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 17:53:52.422325 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419692 2527 flags.go:64] FLAG: --cgroup-root=""
Apr 22 17:53:52.422325 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419695 2527 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 17:53:52.422325 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419698 2527 flags.go:64] FLAG: --client-ca-file=""
Apr 22 17:53:52.422325 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419700 2527 flags.go:64] FLAG: --cloud-config=""
Apr 22 17:53:52.422325 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419703 2527 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 17:53:52.422325 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419706 2527 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 17:53:52.422325 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419720 2527 flags.go:64] FLAG: --cluster-domain=""
Apr 22 17:53:52.422325 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419723 2527 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 17:53:52.422325 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419726 2527 flags.go:64] FLAG: --config-dir=""
Apr 22 17:53:52.422325 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419729 2527 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 17:53:52.422325 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419733 2527 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 17:53:52.422325 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419737 2527 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 17:53:52.422325 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419746 2527 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 17:53:52.422325 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419750 2527 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 17:53:52.422325 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419753 2527 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 17:53:52.422325 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419756 2527 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 17:53:52.422325 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419759 2527 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 17:53:52.422325 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419762 2527 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 17:53:52.422325 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419765 2527 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 17:53:52.422325 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419768 2527 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 17:53:52.422951 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419772 2527 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 17:53:52.422951 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419775 2527 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 17:53:52.422951 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419779 2527 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 17:53:52.422951 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419782 2527 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 17:53:52.422951 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419785 2527 flags.go:64] FLAG: --enable-server="true"
Apr 22 17:53:52.422951 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419788 2527 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 17:53:52.422951 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419794 2527 flags.go:64] FLAG: --event-burst="100"
Apr 22 17:53:52.422951 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419798 2527 flags.go:64] FLAG: --event-qps="50"
Apr 22 17:53:52.422951 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419801 2527 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 17:53:52.422951 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419804 2527 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 17:53:52.422951 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419807 2527 flags.go:64] FLAG: --eviction-hard=""
Apr 22 17:53:52.422951 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419811 2527 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 17:53:52.422951 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419814 2527 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 17:53:52.422951 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419817 2527 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 17:53:52.422951 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419820 2527 flags.go:64] FLAG: --eviction-soft=""
Apr 22 17:53:52.422951 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419823 2527 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 17:53:52.422951 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419826 2527 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 17:53:52.422951 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419829 2527 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 17:53:52.422951 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419832 2527 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 17:53:52.422951 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419835 2527 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 17:53:52.422951 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419838 2527 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 17:53:52.422951 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419841 2527 flags.go:64] FLAG: --feature-gates=""
Apr 22 17:53:52.422951 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419845 2527 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 17:53:52.422951 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419848 2527 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 17:53:52.422951 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419852 2527 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 17:53:52.423549 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419860 2527 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 17:53:52.423549 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419864 2527 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 17:53:52.423549 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419867 2527 flags.go:64] FLAG: --help="false"
Apr 22 17:53:52.423549
ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419870 2527 flags.go:64] FLAG: --hostname-override="ip-10-0-135-21.ec2.internal" Apr 22 17:53:52.423549 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419873 2527 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 17:53:52.423549 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419876 2527 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 17:53:52.423549 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419879 2527 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 17:53:52.423549 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419883 2527 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 17:53:52.423549 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419886 2527 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 17:53:52.423549 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419889 2527 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 17:53:52.423549 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419892 2527 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 17:53:52.423549 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419895 2527 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 17:53:52.423549 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419898 2527 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 17:53:52.423549 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419901 2527 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 17:53:52.423549 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419904 2527 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 17:53:52.423549 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419907 2527 flags.go:64] FLAG: --kube-reserved="" Apr 22 17:53:52.423549 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419910 2527 flags.go:64] 
FLAG: --kube-reserved-cgroup="" Apr 22 17:53:52.423549 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419913 2527 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 17:53:52.423549 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419916 2527 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 17:53:52.423549 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419919 2527 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 17:53:52.423549 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419922 2527 flags.go:64] FLAG: --lock-file="" Apr 22 17:53:52.423549 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419925 2527 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 17:53:52.423549 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419928 2527 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 17:53:52.423549 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419931 2527 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 17:53:52.424133 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419936 2527 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 17:53:52.424133 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419940 2527 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 17:53:52.424133 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419943 2527 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 17:53:52.424133 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419945 2527 flags.go:64] FLAG: --logging-format="text" Apr 22 17:53:52.424133 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419948 2527 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 17:53:52.424133 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419952 2527 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 17:53:52.424133 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419955 2527 flags.go:64] FLAG: --manifest-url="" Apr 22 17:53:52.424133 ip-10-0-135-21 
kubenswrapper[2527]: I0422 17:53:52.419958 2527 flags.go:64] FLAG: --manifest-url-header="" Apr 22 17:53:52.424133 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419962 2527 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 17:53:52.424133 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419968 2527 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 17:53:52.424133 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419973 2527 flags.go:64] FLAG: --max-pods="110" Apr 22 17:53:52.424133 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419975 2527 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 17:53:52.424133 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419978 2527 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 17:53:52.424133 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419981 2527 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 17:53:52.424133 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419984 2527 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 17:53:52.424133 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419987 2527 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 17:53:52.424133 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419990 2527 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 17:53:52.424133 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.419993 2527 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 17:53:52.424133 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420001 2527 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 17:53:52.424133 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420004 2527 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 17:53:52.424133 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420007 2527 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 17:53:52.424133 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420010 2527 
flags.go:64] FLAG: --pod-cidr="" Apr 22 17:53:52.424133 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420013 2527 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 17:53:52.424710 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420018 2527 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 17:53:52.424710 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420022 2527 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 17:53:52.424710 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420025 2527 flags.go:64] FLAG: --pods-per-core="0" Apr 22 17:53:52.424710 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420028 2527 flags.go:64] FLAG: --port="10250" Apr 22 17:53:52.424710 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420031 2527 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 17:53:52.424710 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420034 2527 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-078783d5e156f292a" Apr 22 17:53:52.424710 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420037 2527 flags.go:64] FLAG: --qos-reserved="" Apr 22 17:53:52.424710 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420040 2527 flags.go:64] FLAG: --read-only-port="10255" Apr 22 17:53:52.424710 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420043 2527 flags.go:64] FLAG: --register-node="true" Apr 22 17:53:52.424710 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420045 2527 flags.go:64] FLAG: --register-schedulable="true" Apr 22 17:53:52.424710 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420051 2527 flags.go:64] FLAG: --register-with-taints="" Apr 22 17:53:52.424710 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420054 2527 flags.go:64] FLAG: --registry-burst="10" Apr 22 17:53:52.424710 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420057 2527 flags.go:64] FLAG: --registry-qps="5" Apr 22 
17:53:52.424710 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420060 2527 flags.go:64] FLAG: --reserved-cpus="" Apr 22 17:53:52.424710 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420063 2527 flags.go:64] FLAG: --reserved-memory="" Apr 22 17:53:52.424710 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420066 2527 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 17:53:52.424710 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420069 2527 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 17:53:52.424710 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420072 2527 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 17:53:52.424710 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420075 2527 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 17:53:52.424710 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420085 2527 flags.go:64] FLAG: --runonce="false" Apr 22 17:53:52.424710 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420088 2527 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 17:53:52.424710 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420092 2527 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 17:53:52.424710 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420094 2527 flags.go:64] FLAG: --seccomp-default="false" Apr 22 17:53:52.424710 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420097 2527 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 17:53:52.424710 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420100 2527 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 17:53:52.424710 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420103 2527 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 17:53:52.425338 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420106 2527 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 17:53:52.425338 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420109 2527 flags.go:64] 
FLAG: --storage-driver-password="root" Apr 22 17:53:52.425338 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420112 2527 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 17:53:52.425338 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420115 2527 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 17:53:52.425338 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420118 2527 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 17:53:52.425338 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420120 2527 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 17:53:52.425338 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420123 2527 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 17:53:52.425338 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420126 2527 flags.go:64] FLAG: --system-cgroups="" Apr 22 17:53:52.425338 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420130 2527 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 17:53:52.425338 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420135 2527 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 17:53:52.425338 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420138 2527 flags.go:64] FLAG: --tls-cert-file="" Apr 22 17:53:52.425338 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420141 2527 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 17:53:52.425338 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420145 2527 flags.go:64] FLAG: --tls-min-version="" Apr 22 17:53:52.425338 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420148 2527 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 17:53:52.425338 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420151 2527 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 17:53:52.425338 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420153 2527 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 17:53:52.425338 ip-10-0-135-21 kubenswrapper[2527]: 
I0422 17:53:52.420161 2527 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 17:53:52.425338 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420164 2527 flags.go:64] FLAG: --v="2" Apr 22 17:53:52.425338 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420173 2527 flags.go:64] FLAG: --version="false" Apr 22 17:53:52.425338 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420177 2527 flags.go:64] FLAG: --vmodule="" Apr 22 17:53:52.425338 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420181 2527 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 17:53:52.425338 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.420184 2527 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 17:53:52.425338 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420277 2527 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 17:53:52.425338 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420281 2527 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 17:53:52.425967 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420284 2527 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 17:53:52.425967 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420287 2527 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 17:53:52.425967 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420290 2527 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 17:53:52.425967 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420292 2527 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 17:53:52.425967 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420295 2527 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 17:53:52.425967 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420298 2527 feature_gate.go:328] unrecognized feature 
gate: SigstoreImageVerification Apr 22 17:53:52.425967 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420301 2527 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 17:53:52.425967 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420303 2527 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 17:53:52.425967 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420306 2527 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 17:53:52.425967 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420308 2527 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:53:52.425967 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420311 2527 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 17:53:52.425967 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420314 2527 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:53:52.425967 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420317 2527 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 17:53:52.425967 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420320 2527 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 17:53:52.425967 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420322 2527 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 17:53:52.425967 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420325 2527 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 17:53:52.425967 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420328 2527 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 17:53:52.425967 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420330 2527 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 17:53:52.425967 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420332 2527 feature_gate.go:328] unrecognized 
feature gate: SigstoreImageVerificationPKI Apr 22 17:53:52.425967 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420335 2527 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 17:53:52.426788 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420338 2527 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:53:52.426788 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420340 2527 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 17:53:52.426788 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420343 2527 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 17:53:52.426788 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420345 2527 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 17:53:52.426788 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420350 2527 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 17:53:52.426788 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420352 2527 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 17:53:52.426788 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420355 2527 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 17:53:52.426788 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420357 2527 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 17:53:52.426788 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420360 2527 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 17:53:52.426788 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420362 2527 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 17:53:52.426788 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420364 2527 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:53:52.426788 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420367 2527 
feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 17:53:52.426788 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420370 2527 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:53:52.426788 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420372 2527 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 17:53:52.426788 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420375 2527 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 17:53:52.426788 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420378 2527 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:53:52.426788 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420382 2527 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 17:53:52.426788 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420386 2527 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 17:53:52.426788 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420388 2527 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 17:53:52.427716 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420391 2527 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 17:53:52.427716 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420394 2527 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 17:53:52.427716 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420397 2527 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:53:52.427716 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420400 2527 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:53:52.427716 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420402 2527 
feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 17:53:52.427716 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420405 2527 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 17:53:52.427716 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420407 2527 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 17:53:52.427716 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420410 2527 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 17:53:52.427716 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420413 2527 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 17:53:52.427716 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420415 2527 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 17:53:52.427716 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420418 2527 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 17:53:52.427716 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420420 2527 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 17:53:52.427716 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420423 2527 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 17:53:52.427716 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420425 2527 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 17:53:52.427716 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420428 2527 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 17:53:52.427716 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420430 2527 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 17:53:52.427716 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420433 2527 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 17:53:52.427716 
ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420436 2527 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 17:53:52.427716 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420439 2527 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 17:53:52.428473 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420442 2527 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 17:53:52.428473 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420445 2527 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 17:53:52.428473 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420448 2527 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:53:52.428473 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420450 2527 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 17:53:52.428473 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420453 2527 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 17:53:52.428473 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420455 2527 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 17:53:52.428473 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420458 2527 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 17:53:52.428473 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420460 2527 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 17:53:52.428473 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420463 2527 feature_gate.go:328] unrecognized feature gate: Example Apr 22 17:53:52.428473 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420466 2527 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 17:53:52.428473 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420468 2527 feature_gate.go:328] 
unrecognized feature gate: ImageModeStatusReporting Apr 22 17:53:52.428473 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420471 2527 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 17:53:52.428473 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420473 2527 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 17:53:52.428473 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420476 2527 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 17:53:52.428473 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420479 2527 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 17:53:52.428473 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420481 2527 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 17:53:52.428473 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420487 2527 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 17:53:52.428473 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420490 2527 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 17:53:52.428473 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420492 2527 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 17:53:52.428473 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420495 2527 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 17:53:52.428981 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420498 2527 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 17:53:52.428981 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420500 2527 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:53:52.428981 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420503 2527 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 17:53:52.428981 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420505 2527 
feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 17:53:52.428981 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420508 2527 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 17:53:52.428981 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.420510 2527 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 17:53:52.428981 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.421175 2527 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 17:53:52.433090 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.433069 2527 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 17:53:52.433161 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.433093 2527 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 17:53:52.433210 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433178 2527 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 17:53:52.433210 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433187 2527 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 17:53:52.433210 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433192 2527 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 17:53:52.433210 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433197 2527 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 17:53:52.433210 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433203 2527 
feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 17:53:52.433210 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433208 2527 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 17:53:52.433210 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433213 2527 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 17:53:52.433528 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433218 2527 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 17:53:52.433528 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433222 2527 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 17:53:52.433528 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433227 2527 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 17:53:52.433528 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433232 2527 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 17:53:52.433528 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433236 2527 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 17:53:52.433528 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433240 2527 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 17:53:52.433528 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433244 2527 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 17:53:52.433528 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433249 2527 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 17:53:52.433528 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433253 2527 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 17:53:52.433528 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433257 2527 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 17:53:52.433528 ip-10-0-135-21 
kubenswrapper[2527]: W0422 17:53:52.433261 2527 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:53:52.433528 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433266 2527 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:53:52.433528 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433271 2527 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 17:53:52.433528 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433275 2527 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 17:53:52.433528 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433280 2527 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 17:53:52.433528 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433284 2527 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 17:53:52.433528 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433288 2527 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 17:53:52.433528 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433292 2527 feature_gate.go:328] unrecognized feature gate: Example Apr 22 17:53:52.433528 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433297 2527 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 17:53:52.433528 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433301 2527 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 17:53:52.434499 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433305 2527 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:53:52.434499 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433309 2527 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 17:53:52.434499 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433313 2527 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 
17:53:52.434499 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433317 2527 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 17:53:52.434499 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433322 2527 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 17:53:52.434499 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433327 2527 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:53:52.434499 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433331 2527 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:53:52.434499 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433335 2527 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 17:53:52.434499 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433339 2527 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 17:53:52.434499 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433343 2527 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 17:53:52.434499 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433348 2527 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 17:53:52.434499 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433352 2527 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 17:53:52.434499 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433356 2527 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 17:53:52.434499 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433362 2527 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 17:53:52.434499 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433367 2527 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 17:53:52.434499 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433371 2527 feature_gate.go:328] unrecognized feature 
gate: MixedCPUsAllocation Apr 22 17:53:52.434499 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433376 2527 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 17:53:52.434499 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433380 2527 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 17:53:52.434499 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433385 2527 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:53:52.434499 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433389 2527 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 17:53:52.435091 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433393 2527 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 17:53:52.435091 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433397 2527 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 17:53:52.435091 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433402 2527 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 17:53:52.435091 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433406 2527 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 17:53:52.435091 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433410 2527 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 17:53:52.435091 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433415 2527 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 17:53:52.435091 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433419 2527 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 17:53:52.435091 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433423 2527 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 17:53:52.435091 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433427 2527 
feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 17:53:52.435091 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433432 2527 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 17:53:52.435091 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433438 2527 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 17:53:52.435091 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433445 2527 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 17:53:52.435091 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433450 2527 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 17:53:52.435091 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433454 2527 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 17:53:52.435091 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433459 2527 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 17:53:52.435091 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433463 2527 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 17:53:52.435091 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433468 2527 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 17:53:52.435091 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433473 2527 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:53:52.435091 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433478 2527 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 17:53:52.435091 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433482 2527 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 17:53:52.435971 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433489 2527 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. 
It will be removed in a future release. Apr 22 17:53:52.435971 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433512 2527 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 17:53:52.435971 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433518 2527 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 17:53:52.435971 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433522 2527 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 17:53:52.435971 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433527 2527 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:53:52.435971 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433532 2527 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 17:53:52.435971 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433537 2527 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 17:53:52.435971 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433542 2527 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 17:53:52.435971 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433547 2527 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:53:52.435971 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433551 2527 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 17:53:52.435971 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433556 2527 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 17:53:52.435971 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433560 2527 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 17:53:52.435971 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433581 2527 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:53:52.435971 ip-10-0-135-21 kubenswrapper[2527]: W0422 
17:53:52.433585 2527 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 17:53:52.435971 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433588 2527 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 17:53:52.435971 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433592 2527 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 17:53:52.435971 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433596 2527 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 17:53:52.435971 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433600 2527 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 17:53:52.435971 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433603 2527 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 17:53:52.436689 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.433610 2527 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 17:53:52.436689 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433789 2527 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 17:53:52.436689 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433800 2527 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 17:53:52.436689 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433805 2527 feature_gate.go:328] unrecognized feature gate: 
VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:53:52.436689 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433810 2527 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 17:53:52.436689 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433814 2527 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:53:52.436689 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433819 2527 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:53:52.436689 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433822 2527 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 17:53:52.436689 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433827 2527 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 17:53:52.436689 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433832 2527 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 17:53:52.436689 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433836 2527 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:53:52.436689 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433841 2527 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 17:53:52.436689 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433846 2527 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 17:53:52.436689 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433850 2527 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:53:52.436689 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433854 2527 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 17:53:52.436689 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433858 2527 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 17:53:52.437079 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433863 2527 
feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 17:53:52.437079 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433867 2527 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 17:53:52.437079 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433871 2527 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 17:53:52.437079 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433875 2527 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 17:53:52.437079 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433880 2527 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 17:53:52.437079 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433884 2527 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 17:53:52.437079 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433888 2527 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 17:53:52.437079 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433892 2527 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 17:53:52.437079 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433896 2527 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 17:53:52.437079 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433900 2527 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 17:53:52.437079 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433905 2527 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 17:53:52.437079 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433909 2527 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 17:53:52.437079 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433913 2527 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 
17:53:52.437079 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433917 2527 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:53:52.437079 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433921 2527 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 17:53:52.437079 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433925 2527 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 17:53:52.437079 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433929 2527 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 17:53:52.437079 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433933 2527 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 17:53:52.437079 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433937 2527 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 17:53:52.437079 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433941 2527 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 17:53:52.437683 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433946 2527 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 17:53:52.437683 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433950 2527 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 17:53:52.437683 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433955 2527 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 17:53:52.437683 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433959 2527 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 17:53:52.437683 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433963 2527 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 17:53:52.437683 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433967 2527 feature_gate.go:328] unrecognized feature gate: 
DualReplica Apr 22 17:53:52.437683 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433972 2527 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 17:53:52.437683 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433976 2527 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 17:53:52.437683 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433980 2527 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 17:53:52.437683 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433985 2527 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 17:53:52.437683 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433990 2527 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 17:53:52.437683 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433994 2527 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 17:53:52.437683 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.433998 2527 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 17:53:52.437683 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434002 2527 feature_gate.go:328] unrecognized feature gate: Example Apr 22 17:53:52.437683 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434006 2527 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 17:53:52.437683 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434010 2527 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 17:53:52.437683 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434014 2527 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 17:53:52.437683 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434018 2527 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 17:53:52.437683 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434022 2527 feature_gate.go:328] 
unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 17:53:52.437683 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434026 2527 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 17:53:52.438244 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434030 2527 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 17:53:52.438244 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434035 2527 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 17:53:52.438244 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434040 2527 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 17:53:52.438244 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434044 2527 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 17:53:52.438244 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434048 2527 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 17:53:52.438244 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434052 2527 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:53:52.438244 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434056 2527 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 17:53:52.438244 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434060 2527 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:53:52.438244 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434066 2527 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 17:53:52.438244 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434071 2527 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 17:53:52.438244 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434075 2527 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 17:53:52.438244 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434080 2527 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:53:52.438244 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434084 2527 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 17:53:52.438244 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434088 2527 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 17:53:52.438244 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434092 2527 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 17:53:52.438244 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434097 2527 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 17:53:52.438244 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434100 2527 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 17:53:52.438244 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434104 2527 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 17:53:52.438244 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434109 2527 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 17:53:52.438719 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434113 2527 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 17:53:52.438719 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434117 2527 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 17:53:52.438719 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434122 2527 feature_gate.go:328] unrecognized feature gate: 
AutomatedEtcdBackup Apr 22 17:53:52.438719 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434126 2527 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 17:53:52.438719 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434130 2527 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:53:52.438719 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434135 2527 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 17:53:52.438719 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434139 2527 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 17:53:52.438719 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434143 2527 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 17:53:52.438719 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434149 2527 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 17:53:52.438719 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434155 2527 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 17:53:52.438719 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434159 2527 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 17:53:52.438719 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:52.434164 2527 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 17:53:52.438719 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.434172 2527 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true 
UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 17:53:52.438719 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.434333 2527 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 17:53:52.438719 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.437598 2527 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 17:53:52.439125 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.438753 2527 server.go:1019] "Starting client certificate rotation" Apr 22 17:53:52.439125 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.438862 2527 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 17:53:52.439701 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.439686 2527 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 17:53:52.470806 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.470780 2527 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 17:53:52.476952 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.476781 2527 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 17:53:52.493372 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.493353 2527 log.go:25] "Validated CRI v1 runtime API" Apr 22 17:53:52.498903 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.498885 2527 log.go:25] "Validated CRI v1 image API" Apr 22 17:53:52.500110 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.500096 2527 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 17:53:52.506245 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.506222 2527 fs.go:135] Filesystem UUIDs: map[4ebf025f-89f6-4d54-9f20-92b75960551f:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 
8a69b767-a315-44a6-9de4-2ada7f664374:/dev/nvme0n1p4] Apr 22 17:53:52.506324 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.506244 2527 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 17:53:52.509047 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.509030 2527 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 17:53:52.513928 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.513819 2527 manager.go:217] Machine: {Timestamp:2026-04-22 17:53:52.510800912 +0000 UTC m=+0.474089974 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:2499994 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec213fb37733bf0806a0440d322a17d5 SystemUUID:ec213fb3-7733-bf08-06a0-440d322a17d5 BootID:bd4674f1-259c-47c6-87eb-61e2b0d26f80 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 
Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:1e:26:9d:f1:01 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:1e:26:9d:f1:01 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:3a:46:a2:03:dd:c6 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 17:53:52.513928 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.513924 2527 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 22 17:53:52.514022 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.514011 2527 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 17:53:52.516784 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.516758 2527 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 17:53:52.516938 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.516788 2527 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-21.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 17:53:52.516979 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.516947 2527 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 17:53:52.516979 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.516956 2527 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 17:53:52.516979 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.516969 2527 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 17:53:52.518463 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.518453 2527 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 17:53:52.519293 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.519283 2527 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 17:53:52.519411 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.519402 2527 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 17:53:52.522377 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.522368 2527 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 17:53:52.522410 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.522382 2527 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 17:53:52.522410 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.522396 2527 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 17:53:52.522476 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.522448 2527 kubelet.go:397] "Adding apiserver pod source"
Apr 22 17:53:52.522476 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.522457 2527 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 17:53:52.523524 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.523513 2527 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 17:53:52.523580 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.523531 2527 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 17:53:52.526685 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.526668 2527 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 17:53:52.528164 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.528149 2527 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 22 17:53:52.530106 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.530093 2527 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 22 17:53:52.530166 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.530111 2527 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 22 17:53:52.530166 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.530118 2527 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 22 17:53:52.530166 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.530123 2527 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 22 17:53:52.530166 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.530129 2527 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 22 17:53:52.530166 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.530135 2527 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 22 17:53:52.530166 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.530143 2527 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 22 17:53:52.530166 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.530151 2527 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 22 17:53:52.530166 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.530159 2527 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 22 17:53:52.530166 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.530166 2527 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 22 17:53:52.530404 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.530179 2527 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 22 17:53:52.530404 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.530188 2527 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 22 17:53:52.531160 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.531149 2527 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 22 17:53:52.531204 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.531162 2527 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 22 17:53:52.535959 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.535944 2527 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 22 17:53:52.536049 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.535981 2527 server.go:1295] "Started kubelet"
Apr 22 17:53:52.537174 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.536069 2527 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 22 17:53:52.537430 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.537391 2527 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 22 17:53:52.537482 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.537451 2527 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 22 17:53:52.537721 ip-10-0-135-21 systemd[1]: Started Kubernetes Kubelet.
Apr 22 17:53:52.538709 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.538463 2527 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-21.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 22 17:53:52.538921 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:52.538752 2527 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-21.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 22 17:53:52.538921 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:52.538808 2527 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 22 17:53:52.539026 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.538971 2527 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 22 17:53:52.542019 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.541991 2527 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xn8rr"
Apr 22 17:53:52.542254 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.542242 2527 server.go:317] "Adding debug handlers to kubelet server"
Apr 22 17:53:52.545408 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:52.544246 2527 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-21.ec2.internal.18a8bf56cda5ec3d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-21.ec2.internal,UID:ip-10-0-135-21.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-21.ec2.internal,},FirstTimestamp:2026-04-22 17:53:52.535956541 +0000 UTC m=+0.499245602,LastTimestamp:2026-04-22 17:53:52.535956541 +0000 UTC m=+0.499245602,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-21.ec2.internal,}"
Apr 22 17:53:52.548044 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.548025 2527 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 22 17:53:52.548139 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.548045 2527 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 22 17:53:52.548590 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:52.548554 2527 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 17:53:52.548684 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.548668 2527 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 22 17:53:52.548735 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.548691 2527 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 22 17:53:52.548735 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.548666 2527 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 22 17:53:52.548824 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.548811 2527 reconstruct.go:97] "Volume reconstruction finished"
Apr 22 17:53:52.548824 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.548820 2527 reconciler.go:26] "Reconciler: start to sync state"
Apr 22 17:53:52.548929 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.548828 2527 factory.go:55] Registering systemd factory
Apr 22 17:53:52.548929 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.548843 2527 factory.go:223] Registration of the systemd container factory successfully
Apr 22 17:53:52.548929 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:52.548921 2527 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-21.ec2.internal\" not found"
Apr 22 17:53:52.549070 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.549007 2527 factory.go:153] Registering CRI-O factory
Apr 22 17:53:52.549070 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.549017 2527 factory.go:223] Registration of the crio container factory successfully
Apr 22 17:53:52.549070 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.549062 2527 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 17:53:52.549199 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.549081 2527 factory.go:103] Registering Raw factory
Apr 22 17:53:52.549199 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.549097 2527 manager.go:1196] Started watching for new ooms in manager
Apr 22 17:53:52.549635 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.549617 2527 manager.go:319] Starting recovery of all containers
Apr 22 17:53:52.553127 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.553104 2527 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xn8rr"
Apr 22 17:53:52.558061 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:52.558036 2527 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 22 17:53:52.558298 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:52.558262 2527 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-135-21.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 22 17:53:52.558625 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.558607 2527 manager.go:324] Recovery completed
Apr 22 17:53:52.560296 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:52.560268 2527 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 22 17:53:52.563233 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.563221 2527 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 17:53:52.567077 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.567062 2527 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-21.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 17:53:52.567138 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.567088 2527 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-21.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 17:53:52.567138 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.567099 2527 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-21.ec2.internal" event="NodeHasSufficientPID"
Apr 22 17:53:52.567677 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.567661 2527 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 22 17:53:52.567677 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.567676 2527 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 22 17:53:52.567778 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.567718 2527 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 17:53:52.570218 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.570207 2527 policy_none.go:49] "None policy: Start"
Apr 22 17:53:52.570254 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.570222 2527 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 22 17:53:52.570254 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.570232 2527 state_mem.go:35] "Initializing new in-memory state store"
Apr 22 17:53:52.621317 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.621301 2527 manager.go:341] "Starting Device Plugin manager"
Apr 22 17:53:52.625761 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:52.621358 2527 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 22 17:53:52.625761 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.621372 2527 server.go:85] "Starting device plugin registration server"
Apr 22 17:53:52.625761 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.621698 2527 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 22 17:53:52.625761 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.621712 2527 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 22 17:53:52.625761 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.621809 2527 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 22 17:53:52.625761 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.621898 2527 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 22 17:53:52.625761 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.621908 2527 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 22 17:53:52.625761 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:52.622414 2527 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 22 17:53:52.625761 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:52.622446 2527 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-21.ec2.internal\" not found"
Apr 22 17:53:52.692133 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.692067 2527 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 22 17:53:52.693251 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.693233 2527 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 22 17:53:52.693342 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.693262 2527 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 22 17:53:52.693342 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.693285 2527 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 22 17:53:52.693342 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.693293 2527 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 17:53:52.693448 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:52.693368 2527 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 22 17:53:52.695667 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.695651 2527 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:53:52.722644 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.722596 2527 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 17:53:52.723463 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.723446 2527 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-21.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 17:53:52.723554 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.723483 2527 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-21.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 17:53:52.723554 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.723498 2527 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-21.ec2.internal" event="NodeHasSufficientPID"
Apr 22 17:53:52.723554 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.723527 2527 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-21.ec2.internal"
Apr 22 17:53:52.730642 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.730626 2527 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-21.ec2.internal"
Apr 22 17:53:52.730703 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:52.730648 2527 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-21.ec2.internal\": node \"ip-10-0-135-21.ec2.internal\" not found"
Apr 22 17:53:52.747129 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:52.747108 2527 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-21.ec2.internal\" not found"
Apr 22 17:53:52.793667 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.793632 2527 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-21.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-21.ec2.internal"]
Apr 22 17:53:52.793811 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.793719 2527 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 17:53:52.798736 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.798719 2527 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-21.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 17:53:52.798827 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.798749 2527 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-21.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 17:53:52.798827 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.798759 2527 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-21.ec2.internal" event="NodeHasSufficientPID"
Apr 22 17:53:52.800191 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.800178 2527 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 17:53:52.800310 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.800297 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-21.ec2.internal"
Apr 22 17:53:52.800350 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.800325 2527 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 17:53:52.800965 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.800937 2527 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-21.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 17:53:52.801072 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.800937 2527 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-21.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 17:53:52.801072 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.800995 2527 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-21.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 17:53:52.801072 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.801010 2527 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-21.ec2.internal" event="NodeHasSufficientPID"
Apr 22 17:53:52.801072 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.800975 2527 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-21.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 17:53:52.801072 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.801046 2527 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-21.ec2.internal" event="NodeHasSufficientPID"
Apr 22 17:53:52.802301 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.802286 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-21.ec2.internal"
Apr 22 17:53:52.802350 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.802312 2527 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 17:53:52.803055 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.803041 2527 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-21.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 17:53:52.803128 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.803066 2527 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-21.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 17:53:52.803128 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.803079 2527 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-21.ec2.internal" event="NodeHasSufficientPID"
Apr 22 17:53:52.835364 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:52.835340 2527 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-21.ec2.internal\" not found" node="ip-10-0-135-21.ec2.internal"
Apr 22 17:53:52.839759 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:52.839744 2527 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-21.ec2.internal\" not found" node="ip-10-0-135-21.ec2.internal"
Apr 22 17:53:52.847801 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:52.847784 2527 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-21.ec2.internal\" not found"
Apr 22 17:53:52.850022 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.849996 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/18cf1a3f2e2dc1df47f20b03f0bf7e15-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-21.ec2.internal\" (UID: \"18cf1a3f2e2dc1df47f20b03f0bf7e15\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-21.ec2.internal"
Apr 22 17:53:52.850078 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.850031 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/18cf1a3f2e2dc1df47f20b03f0bf7e15-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-21.ec2.internal\" (UID: \"18cf1a3f2e2dc1df47f20b03f0bf7e15\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-21.ec2.internal"
Apr 22 17:53:52.850078 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.850047 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7c86caefdc1dda70cffe5878d33236b6-config\") pod \"kube-apiserver-proxy-ip-10-0-135-21.ec2.internal\" (UID: \"7c86caefdc1dda70cffe5878d33236b6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-21.ec2.internal"
Apr 22 17:53:52.948420 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:52.948349 2527 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-21.ec2.internal\" not found"
Apr 22 17:53:52.950553 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.950537 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7c86caefdc1dda70cffe5878d33236b6-config\") pod \"kube-apiserver-proxy-ip-10-0-135-21.ec2.internal\" (UID: \"7c86caefdc1dda70cffe5878d33236b6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-21.ec2.internal"
Apr 22 17:53:52.950648 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.950582 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/18cf1a3f2e2dc1df47f20b03f0bf7e15-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-21.ec2.internal\" (UID: \"18cf1a3f2e2dc1df47f20b03f0bf7e15\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-21.ec2.internal"
Apr 22 17:53:52.950648 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.950640 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/18cf1a3f2e2dc1df47f20b03f0bf7e15-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-21.ec2.internal\" (UID: \"18cf1a3f2e2dc1df47f20b03f0bf7e15\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-21.ec2.internal"
Apr 22 17:53:52.950752 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.950673 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/18cf1a3f2e2dc1df47f20b03f0bf7e15-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-21.ec2.internal\" (UID: \"18cf1a3f2e2dc1df47f20b03f0bf7e15\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-21.ec2.internal"
Apr 22 17:53:52.950752 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.950641 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7c86caefdc1dda70cffe5878d33236b6-config\") pod \"kube-apiserver-proxy-ip-10-0-135-21.ec2.internal\" (UID: \"7c86caefdc1dda70cffe5878d33236b6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-21.ec2.internal"
Apr 22 17:53:52.950752 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:52.950713 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/18cf1a3f2e2dc1df47f20b03f0bf7e15-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-21.ec2.internal\" (UID: \"18cf1a3f2e2dc1df47f20b03f0bf7e15\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-21.ec2.internal"
Apr 22 17:53:53.048443 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:53.048410 2527 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-21.ec2.internal\" not found"
Apr 22 17:53:53.137885 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:53.137859 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-21.ec2.internal"
Apr 22 17:53:53.142440 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:53.142417 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-21.ec2.internal"
Apr 22 17:53:53.149156 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:53.149132 2527 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-21.ec2.internal\" not found"
Apr 22 17:53:53.249710 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:53.249624 2527 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-21.ec2.internal\" not found"
Apr 22 17:53:53.350169 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:53.350014 2527 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-21.ec2.internal\" not found"
Apr 22 17:53:53.438388 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:53.438354 2527 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 17:53:53.439038 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:53.438509 2527 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 17:53:53.450700 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:53.450671 2527 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-21.ec2.internal\" not found"
Apr 22 17:53:53.548213 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:53.548184 2527 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 17:53:53.551437 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:53.551422 2527 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-21.ec2.internal\" not found"
Apr 22 17:53:53.557136 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:53.557111 2527 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 17:48:52 +0000 UTC" deadline="2027-12-08 21:24:13.091655537 +0000 UTC"
Apr 22 17:53:53.557136 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:53.557134 2527 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14283h30m19.534524621s"
Apr 22 17:53:53.560802 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:53.560785 2527 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 17:53:53.578279 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:53.578256 2527 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-wfd4f"
Apr 22 17:53:53.587101 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:53.587085 2527 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-wfd4f"
Apr 22 17:53:53.630417 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:53.630385 2527 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:53:53.651891 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:53.651863 2527 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-21.ec2.internal\" not found"
Apr 22 17:53:53.696768 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:53.696742 2527 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:53:53.748714 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:53.748687 2527 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-21.ec2.internal"
Apr 22 17:53:53.760863 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:53.760845 2527 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 17:53:53.762527 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:53.762508 2527 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-21.ec2.internal"
Apr 22 17:53:53.769663 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:53.769649 2527 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 17:53:53.795403 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:53.795373 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c86caefdc1dda70cffe5878d33236b6.slice/crio-5dae5996f14fc22c0331b919250b16284f17b386fd4102deee9ce25766e69030 WatchSource:0}: Error finding container 5dae5996f14fc22c0331b919250b16284f17b386fd4102deee9ce25766e69030: Status 404 returned error can't find the container with id 5dae5996f14fc22c0331b919250b16284f17b386fd4102deee9ce25766e69030
Apr 22 17:53:53.796068 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:53.796048 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18cf1a3f2e2dc1df47f20b03f0bf7e15.slice/crio-05edf595540557e9211f993da82e752dbb3824797a784b8373c10fbbc32a5754 WatchSource:0}: Error finding container 05edf595540557e9211f993da82e752dbb3824797a784b8373c10fbbc32a5754: Status 404 returned error can't find the container with id 05edf595540557e9211f993da82e752dbb3824797a784b8373c10fbbc32a5754
Apr 22 17:53:53.800931 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:53.800917 2527 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 17:53:53.958330 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:53.958306 2527 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:53:54.523181 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.523148 2527 apiserver.go:52] "Watching apiserver"
Apr 22 17:53:54.532551 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.532519 2527 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 17:53:54.532955 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.532921 2527 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-qbfw5","openshift-network-operator/iptables-alerter-9sxr8","openshift-ovn-kubernetes/ovnkube-node-2hklb","kube-system/konnectivity-agent-9z422","kube-system/kube-apiserver-proxy-ip-10-0-135-21.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2d8fs","openshift-image-registry/node-ca-zgpsv","openshift-multus/multus-hv7bb","openshift-cluster-node-tuning-operator/tuned-k5st4","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-21.ec2.internal","openshift-multus/multus-additional-cni-plugins-q7h2l","openshift-multus/network-metrics-daemon-4pcg6"]
Apr 22 17:53:54.536193 ip-10-0-135-21 kubenswrapper[2527]: I0422
17:53:54.535778 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbfw5" Apr 22 17:53:54.536193 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.535859 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-9sxr8" Apr 22 17:53:54.536193 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:54.535857 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbfw5" podUID="758f2892-7f0d-40ae-a65b-7d7ce661548b" Apr 22 17:53:54.537462 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.537427 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" Apr 22 17:53:54.539086 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.538726 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-kkmxm\"" Apr 22 17:53:54.539086 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.538838 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 17:53:54.539086 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.538926 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:53:54.539271 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.539104 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 17:53:54.540041 ip-10-0-135-21 
kubenswrapper[2527]: I0422 17:53:54.539984 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 17:53:54.540136 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.540057 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 17:53:54.540136 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.540126 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 17:53:54.540247 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.540235 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 17:53:54.540862 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.540376 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9z422" Apr 22 17:53:54.540862 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.540513 2527 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2d8fs" Apr 22 17:53:54.541800 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.541775 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 17:53:54.542381 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.542009 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 17:53:54.542381 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.542205 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-twsbx\"" Apr 22 17:53:54.543270 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.543210 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 17:53:54.543397 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.543319 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 17:53:54.543600 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.543541 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 17:53:54.543684 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.543546 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-p7w9b\"" Apr 22 17:53:54.543963 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.543945 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zgpsv" Apr 22 17:53:54.544110 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.544046 2527 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-hv7bb" Apr 22 17:53:54.544514 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.544497 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 17:53:54.544615 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.544515 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-l5nc5\"" Apr 22 17:53:54.544615 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.544583 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 17:53:54.546278 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.546259 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 17:53:54.546358 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.546281 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 17:53:54.546587 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.546555 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 17:53:54.546662 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.546608 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-btfrc\"" Apr 22 17:53:54.546714 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.546664 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 17:53:54.546714 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.546667 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-scn75\"" 
Apr 22 17:53:54.546809 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.546770 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 17:53:54.546887 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.546867 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.547417 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.547399 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 17:53:54.547495 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.547398 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 17:53:54.549764 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.549680 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:53:54.549872 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.549761 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 17:53:54.549872 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.549681 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-nqhxk\""
Apr 22 17:53:54.550108 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.550092 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-q7h2l"
Apr 22 17:53:54.551464 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.551435 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4pcg6"
Apr 22 17:53:54.551542 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:54.551500 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4pcg6" podUID="2e2c7888-11c8-4eb9-af23-d748c79bd567"
Apr 22 17:53:54.552486 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.552466 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wdwr4\""
Apr 22 17:53:54.552637 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.552619 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 17:53:54.552868 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.552852 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 17:53:54.561137 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.560853 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-etc-tuned\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.561251 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.561172 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-tmp\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.561251 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.561222 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-multus-conf-dir\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb"
Apr 22 17:53:54.562062 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.562037 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5e78c417-84fe-4436-91ea-459e917e9154-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-q7h2l\" (UID: \"5e78c417-84fe-4436-91ea-459e917e9154\") " pod="openshift-multus/multus-additional-cni-plugins-q7h2l"
Apr 22 17:53:54.562182 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.562090 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-etc-openvswitch\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.562182 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.562121 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-host-cni-netd\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.562182 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.562149 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.562343 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.562182 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-env-overrides\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.562343 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.562214 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bea09b4b-5a52-4d36-968d-10c68f944bee-host\") pod \"node-ca-zgpsv\" (UID: \"bea09b4b-5a52-4d36-968d-10c68f944bee\") " pod="openshift-image-registry/node-ca-zgpsv"
Apr 22 17:53:54.562343 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.562248 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65197570-5734-4fef-aa1a-0880895dc55b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2d8fs\" (UID: \"65197570-5734-4fef-aa1a-0880895dc55b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2d8fs"
Apr 22 17:53:54.562343 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.562277 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-etc-modprobe-d\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.562343 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.562304 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-cnibin\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb"
Apr 22 17:53:54.562343 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.562329 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/afaf22ec-e8aa-448d-8007-91b393b66c96-cni-binary-copy\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb"
Apr 22 17:53:54.562681 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.562358 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-host-run-netns\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb"
Apr 22 17:53:54.562681 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.562387 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc9dd\" (UniqueName: \"kubernetes.io/projected/afaf22ec-e8aa-448d-8007-91b393b66c96-kube-api-access-pc9dd\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb"
Apr 22 17:53:54.562681 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.562417 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-host-slash\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.562681 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.562445 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-var-lib-openvswitch\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.562681 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.562469 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-run-ovn\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.562681 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.562497 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-ovn-node-metrics-cert\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.562681 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.562537 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/afaf22ec-e8aa-448d-8007-91b393b66c96-multus-daemon-config\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb"
Apr 22 17:53:54.562681 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.562614 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5e78c417-84fe-4436-91ea-459e917e9154-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q7h2l\" (UID: \"5e78c417-84fe-4436-91ea-459e917e9154\") " pod="openshift-multus/multus-additional-cni-plugins-q7h2l"
Apr 22 17:53:54.562681 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.562647 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-host-run-netns\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.562681 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.562675 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bea09b4b-5a52-4d36-968d-10c68f944bee-serviceca\") pod \"node-ca-zgpsv\" (UID: \"bea09b4b-5a52-4d36-968d-10c68f944bee\") " pod="openshift-image-registry/node-ca-zgpsv"
Apr 22 17:53:54.563149 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.562701 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjqrg\" (UniqueName: \"kubernetes.io/projected/78f93c71-12e7-43ba-9997-9357c2fcdd1c-kube-api-access-vjqrg\") pod \"iptables-alerter-9sxr8\" (UID: \"78f93c71-12e7-43ba-9997-9357c2fcdd1c\") " pod="openshift-network-operator/iptables-alerter-9sxr8"
Apr 22 17:53:54.563149 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.562728 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-host-cni-bin\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.563149 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.562755 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-ovnkube-config\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.563149 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.562784 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8872\" (UniqueName: \"kubernetes.io/projected/65197570-5734-4fef-aa1a-0880895dc55b-kube-api-access-z8872\") pod \"aws-ebs-csi-driver-node-2d8fs\" (UID: \"65197570-5734-4fef-aa1a-0880895dc55b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2d8fs"
Apr 22 17:53:54.563149 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.562813 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-etc-systemd\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.563149 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.562836 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6db8q\" (UniqueName: \"kubernetes.io/projected/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-kube-api-access-6db8q\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.563149 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.562887 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-os-release\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb"
Apr 22 17:53:54.563487 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.563431 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-host-var-lib-cni-multus\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb"
Apr 22 17:53:54.563487 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.563482 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/78f93c71-12e7-43ba-9997-9357c2fcdd1c-host-slash\") pod \"iptables-alerter-9sxr8\" (UID: \"78f93c71-12e7-43ba-9997-9357c2fcdd1c\") " pod="openshift-network-operator/iptables-alerter-9sxr8"
Apr 22 17:53:54.563602 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.563514 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-node-log\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.563602 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.563559 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-etc-kubernetes\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb"
Apr 22 17:53:54.563710 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.563634 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5e78c417-84fe-4436-91ea-459e917e9154-cnibin\") pod \"multus-additional-cni-plugins-q7h2l\" (UID: \"5e78c417-84fe-4436-91ea-459e917e9154\") " pod="openshift-multus/multus-additional-cni-plugins-q7h2l"
Apr 22 17:53:54.563710 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.563665 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-host\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.563807 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.563710 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-multus-cni-dir\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb"
Apr 22 17:53:54.563807 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.563802 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-run-openvswitch\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.563902 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.563833 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-log-socket\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.563957 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.563911 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/65197570-5734-4fef-aa1a-0880895dc55b-socket-dir\") pod \"aws-ebs-csi-driver-node-2d8fs\" (UID: \"65197570-5734-4fef-aa1a-0880895dc55b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2d8fs"
Apr 22 17:53:54.564154 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.564120 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/65197570-5734-4fef-aa1a-0880895dc55b-registration-dir\") pod \"aws-ebs-csi-driver-node-2d8fs\" (UID: \"65197570-5734-4fef-aa1a-0880895dc55b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2d8fs"
Apr 22 17:53:54.564255 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.564196 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-etc-kubernetes\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.564255 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.564244 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-lib-modules\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.564351 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.564278 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-var-lib-kubelet\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.564351 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.564332 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-host-kubelet\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.564445 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.564363 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfnqh\" (UniqueName: \"kubernetes.io/projected/bea09b4b-5a52-4d36-968d-10c68f944bee-kube-api-access-dfnqh\") pod \"node-ca-zgpsv\" (UID: \"bea09b4b-5a52-4d36-968d-10c68f944bee\") " pod="openshift-image-registry/node-ca-zgpsv"
Apr 22 17:53:54.564445 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.564398 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-etc-sysconfig\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.564445 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.564429 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-etc-sysctl-conf\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.564594 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.564462 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5e78c417-84fe-4436-91ea-459e917e9154-os-release\") pod \"multus-additional-cni-plugins-q7h2l\" (UID: \"5e78c417-84fe-4436-91ea-459e917e9154\") " pod="openshift-multus/multus-additional-cni-plugins-q7h2l"
Apr 22 17:53:54.564654 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.564591 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hxm5\" (UniqueName: \"kubernetes.io/projected/5e78c417-84fe-4436-91ea-459e917e9154-kube-api-access-2hxm5\") pod \"multus-additional-cni-plugins-q7h2l\" (UID: \"5e78c417-84fe-4436-91ea-459e917e9154\") " pod="openshift-multus/multus-additional-cni-plugins-q7h2l"
Apr 22 17:53:54.564654 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.564630 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-etc-sysctl-d\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.564741 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.564657 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5e78c417-84fe-4436-91ea-459e917e9154-system-cni-dir\") pod \"multus-additional-cni-plugins-q7h2l\" (UID: \"5e78c417-84fe-4436-91ea-459e917e9154\") " pod="openshift-multus/multus-additional-cni-plugins-q7h2l"
Apr 22 17:53:54.564741 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.564691 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5e78c417-84fe-4436-91ea-459e917e9154-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q7h2l\" (UID: \"5e78c417-84fe-4436-91ea-459e917e9154\") " pod="openshift-multus/multus-additional-cni-plugins-q7h2l"
Apr 22 17:53:54.564741 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.564727 2527 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/78f93c71-12e7-43ba-9997-9357c2fcdd1c-iptables-alerter-script\") pod \"iptables-alerter-9sxr8\" (UID: \"78f93c71-12e7-43ba-9997-9357c2fcdd1c\") " pod="openshift-network-operator/iptables-alerter-9sxr8" Apr 22 17:53:54.564864 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.564760 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-ovnkube-script-lib\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" Apr 22 17:53:54.564864 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.564793 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/22e8e337-8adf-4261-832e-41ba662e7747-konnectivity-ca\") pod \"konnectivity-agent-9z422\" (UID: \"22e8e337-8adf-4261-832e-41ba662e7747\") " pod="kube-system/konnectivity-agent-9z422" Apr 22 17:53:54.564864 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.564821 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/65197570-5734-4fef-aa1a-0880895dc55b-sys-fs\") pod \"aws-ebs-csi-driver-node-2d8fs\" (UID: \"65197570-5734-4fef-aa1a-0880895dc55b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2d8fs" Apr 22 17:53:54.564864 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.564853 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-multus-socket-dir-parent\") pod \"multus-hv7bb\" (UID: 
\"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb" Apr 22 17:53:54.565033 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.564901 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-hostroot\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb" Apr 22 17:53:54.565082 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.565052 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-systemd-units\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" Apr 22 17:53:54.565128 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.565091 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/65197570-5734-4fef-aa1a-0880895dc55b-device-dir\") pod \"aws-ebs-csi-driver-node-2d8fs\" (UID: \"65197570-5734-4fef-aa1a-0880895dc55b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2d8fs" Apr 22 17:53:54.565175 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.565145 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-sys\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4" Apr 22 17:53:54.565222 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.565199 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75rwz\" (UniqueName: 
\"kubernetes.io/projected/758f2892-7f0d-40ae-a65b-7d7ce661548b-kube-api-access-75rwz\") pod \"network-check-target-qbfw5\" (UID: \"758f2892-7f0d-40ae-a65b-7d7ce661548b\") " pod="openshift-network-diagnostics/network-check-target-qbfw5" Apr 22 17:53:54.565269 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.565253 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-host-var-lib-kubelet\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb" Apr 22 17:53:54.565313 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.565285 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-run-systemd\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" Apr 22 17:53:54.565358 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.565320 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-host-run-ovn-kubernetes\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" Apr 22 17:53:54.565358 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.565350 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ksqf\" (UniqueName: \"kubernetes.io/projected/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-kube-api-access-7ksqf\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" Apr 22 17:53:54.565444 ip-10-0-135-21 kubenswrapper[2527]: 
I0422 17:53:54.565378 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-run\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4" Apr 22 17:53:54.565444 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.565405 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/22e8e337-8adf-4261-832e-41ba662e7747-agent-certs\") pod \"konnectivity-agent-9z422\" (UID: \"22e8e337-8adf-4261-832e-41ba662e7747\") " pod="kube-system/konnectivity-agent-9z422" Apr 22 17:53:54.565528 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.565458 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-host-run-k8s-cni-cncf-io\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb" Apr 22 17:53:54.565528 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.565486 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-host-var-lib-cni-bin\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb" Apr 22 17:53:54.565528 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.565515 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-host-run-multus-certs\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb" 
Apr 22 17:53:54.565678 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.565543 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5e78c417-84fe-4436-91ea-459e917e9154-cni-binary-copy\") pod \"multus-additional-cni-plugins-q7h2l\" (UID: \"5e78c417-84fe-4436-91ea-459e917e9154\") " pod="openshift-multus/multus-additional-cni-plugins-q7h2l" Apr 22 17:53:54.565678 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.565589 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/65197570-5734-4fef-aa1a-0880895dc55b-etc-selinux\") pod \"aws-ebs-csi-driver-node-2d8fs\" (UID: \"65197570-5734-4fef-aa1a-0880895dc55b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2d8fs" Apr 22 17:53:54.565678 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.565637 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-system-cni-dir\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb" Apr 22 17:53:54.588947 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.588915 2527 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:48:53 +0000 UTC" deadline="2027-11-20 09:53:28.225571363 +0000 UTC" Apr 22 17:53:54.588947 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.588943 2527 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13839h59m33.636630468s" Apr 22 17:53:54.649614 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.649547 2527 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 
17:53:54.666201 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.666162 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-host-var-lib-cni-bin\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb" Apr 22 17:53:54.666201 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.666202 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-host-run-multus-certs\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb" Apr 22 17:53:54.666450 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.666229 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5e78c417-84fe-4436-91ea-459e917e9154-cni-binary-copy\") pod \"multus-additional-cni-plugins-q7h2l\" (UID: \"5e78c417-84fe-4436-91ea-459e917e9154\") " pod="openshift-multus/multus-additional-cni-plugins-q7h2l" Apr 22 17:53:54.666450 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.666255 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/65197570-5734-4fef-aa1a-0880895dc55b-etc-selinux\") pod \"aws-ebs-csi-driver-node-2d8fs\" (UID: \"65197570-5734-4fef-aa1a-0880895dc55b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2d8fs" Apr 22 17:53:54.666450 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.666288 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-system-cni-dir\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " 
pod="openshift-multus/multus-hv7bb" Apr 22 17:53:54.666450 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.666311 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-etc-tuned\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4" Apr 22 17:53:54.666450 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.666337 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-tmp\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4" Apr 22 17:53:54.666450 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.666356 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-multus-conf-dir\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb" Apr 22 17:53:54.666450 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.666378 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5e78c417-84fe-4436-91ea-459e917e9154-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-q7h2l\" (UID: \"5e78c417-84fe-4436-91ea-459e917e9154\") " pod="openshift-multus/multus-additional-cni-plugins-q7h2l" Apr 22 17:53:54.666450 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.666400 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-etc-openvswitch\") pod \"ovnkube-node-2hklb\" (UID: 
\"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" Apr 22 17:53:54.666450 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.666414 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-host-cni-netd\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" Apr 22 17:53:54.666450 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.666438 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" Apr 22 17:53:54.666968 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.666474 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-env-overrides\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" Apr 22 17:53:54.666968 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.666489 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bea09b4b-5a52-4d36-968d-10c68f944bee-host\") pod \"node-ca-zgpsv\" (UID: \"bea09b4b-5a52-4d36-968d-10c68f944bee\") " pod="openshift-image-registry/node-ca-zgpsv" Apr 22 17:53:54.666968 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.666538 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-multus-conf-dir\") pod 
\"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb" Apr 22 17:53:54.667107 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.667056 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5e78c417-84fe-4436-91ea-459e917e9154-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-q7h2l\" (UID: \"5e78c417-84fe-4436-91ea-459e917e9154\") " pod="openshift-multus/multus-additional-cni-plugins-q7h2l" Apr 22 17:53:54.667157 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.667131 2527 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 17:53:54.667210 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.667142 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65197570-5734-4fef-aa1a-0880895dc55b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2d8fs\" (UID: \"65197570-5734-4fef-aa1a-0880895dc55b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2d8fs" Apr 22 17:53:54.667210 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.667192 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-etc-modprobe-d\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4" Apr 22 17:53:54.667307 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.667221 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-cnibin\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " 
pod="openshift-multus/multus-hv7bb" Apr 22 17:53:54.667307 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.667249 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/afaf22ec-e8aa-448d-8007-91b393b66c96-cni-binary-copy\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb" Apr 22 17:53:54.667307 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.667274 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-host-run-netns\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb" Apr 22 17:53:54.667307 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.667300 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pc9dd\" (UniqueName: \"kubernetes.io/projected/afaf22ec-e8aa-448d-8007-91b393b66c96-kube-api-access-pc9dd\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb" Apr 22 17:53:54.667591 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.667326 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-host-slash\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" Apr 22 17:53:54.667591 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.667352 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-var-lib-openvswitch\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" Apr 22 17:53:54.667591 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.667392 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-run-ovn\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" Apr 22 17:53:54.667591 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.667407 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-ovn-node-metrics-cert\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" Apr 22 17:53:54.667591 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.667423 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/afaf22ec-e8aa-448d-8007-91b393b66c96-multus-daemon-config\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb" Apr 22 17:53:54.667591 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.667442 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5e78c417-84fe-4436-91ea-459e917e9154-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q7h2l\" (UID: \"5e78c417-84fe-4436-91ea-459e917e9154\") " pod="openshift-multus/multus-additional-cni-plugins-q7h2l" Apr 22 17:53:54.667591 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.667460 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-env-overrides\") pod \"ovnkube-node-2hklb\" (UID: 
\"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" Apr 22 17:53:54.667591 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.667478 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-host-run-netns\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" Apr 22 17:53:54.667591 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.667517 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bea09b4b-5a52-4d36-968d-10c68f944bee-serviceca\") pod \"node-ca-zgpsv\" (UID: \"bea09b4b-5a52-4d36-968d-10c68f944bee\") " pod="openshift-image-registry/node-ca-zgpsv" Apr 22 17:53:54.668071 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.667601 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-var-lib-openvswitch\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" Apr 22 17:53:54.668071 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.667611 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-cnibin\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb" Apr 22 17:53:54.668071 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.667668 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-host-cni-netd\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" Apr 22 17:53:54.668071 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.667670 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-host-run-netns\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb" Apr 22 17:53:54.668071 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.667765 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-system-cni-dir\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb" Apr 22 17:53:54.668071 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.667831 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/65197570-5734-4fef-aa1a-0880895dc55b-etc-selinux\") pod \"aws-ebs-csi-driver-node-2d8fs\" (UID: \"65197570-5734-4fef-aa1a-0880895dc55b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2d8fs" Apr 22 17:53:54.668071 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.667954 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-host-run-multus-certs\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb" Apr 22 17:53:54.668071 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.667966 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bea09b4b-5a52-4d36-968d-10c68f944bee-serviceca\") pod \"node-ca-zgpsv\" (UID: \"bea09b4b-5a52-4d36-968d-10c68f944bee\") " pod="openshift-image-registry/node-ca-zgpsv" Apr 22 17:53:54.668071 
ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668012 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-etc-openvswitch\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.668071 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668050 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-run-ovn\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.668071 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668058 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-etc-modprobe-d\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.668629 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668144 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/afaf22ec-e8aa-448d-8007-91b393b66c96-cni-binary-copy\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb"
Apr 22 17:53:54.668629 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668205 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-host-var-lib-cni-bin\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb"
Apr 22 17:53:54.668629 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668200 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65197570-5734-4fef-aa1a-0880895dc55b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2d8fs\" (UID: \"65197570-5734-4fef-aa1a-0880895dc55b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2d8fs"
Apr 22 17:53:54.668629 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668239 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bea09b4b-5a52-4d36-968d-10c68f944bee-host\") pod \"node-ca-zgpsv\" (UID: \"bea09b4b-5a52-4d36-968d-10c68f944bee\") " pod="openshift-image-registry/node-ca-zgpsv"
Apr 22 17:53:54.668629 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668281 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjqrg\" (UniqueName: \"kubernetes.io/projected/78f93c71-12e7-43ba-9997-9357c2fcdd1c-kube-api-access-vjqrg\") pod \"iptables-alerter-9sxr8\" (UID: \"78f93c71-12e7-43ba-9997-9357c2fcdd1c\") " pod="openshift-network-operator/iptables-alerter-9sxr8"
Apr 22 17:53:54.668629 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668298 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5e78c417-84fe-4436-91ea-459e917e9154-cni-binary-copy\") pod \"multus-additional-cni-plugins-q7h2l\" (UID: \"5e78c417-84fe-4436-91ea-459e917e9154\") " pod="openshift-multus/multus-additional-cni-plugins-q7h2l"
Apr 22 17:53:54.668629 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668309 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-host-cni-bin\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.668629 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668359 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-ovnkube-config\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.668629 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668363 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5e78c417-84fe-4436-91ea-459e917e9154-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q7h2l\" (UID: \"5e78c417-84fe-4436-91ea-459e917e9154\") " pod="openshift-multus/multus-additional-cni-plugins-q7h2l"
Apr 22 17:53:54.668629 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668407 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8872\" (UniqueName: \"kubernetes.io/projected/65197570-5734-4fef-aa1a-0880895dc55b-kube-api-access-z8872\") pod \"aws-ebs-csi-driver-node-2d8fs\" (UID: \"65197570-5734-4fef-aa1a-0880895dc55b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2d8fs"
Apr 22 17:53:54.668629 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668443 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-etc-systemd\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.668629 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668473 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6db8q\" (UniqueName: \"kubernetes.io/projected/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-kube-api-access-6db8q\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.668629 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668505 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-host-cni-bin\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.668629 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668506 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-os-release\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb"
Apr 22 17:53:54.668629 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668549 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-host-var-lib-cni-multus\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb"
Apr 22 17:53:54.668629 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668586 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-etc-systemd\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.668629 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668595 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/78f93c71-12e7-43ba-9997-9357c2fcdd1c-host-slash\") pod \"iptables-alerter-9sxr8\" (UID: \"78f93c71-12e7-43ba-9997-9357c2fcdd1c\") " pod="openshift-network-operator/iptables-alerter-9sxr8"
Apr 22 17:53:54.669406 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668622 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-node-log\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.669406 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668646 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-etc-kubernetes\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb"
Apr 22 17:53:54.669406 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668670 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5e78c417-84fe-4436-91ea-459e917e9154-cnibin\") pod \"multus-additional-cni-plugins-q7h2l\" (UID: \"5e78c417-84fe-4436-91ea-459e917e9154\") " pod="openshift-multus/multus-additional-cni-plugins-q7h2l"
Apr 22 17:53:54.669406 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668679 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.669406 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668700 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnp9r\" (UniqueName: \"kubernetes.io/projected/2e2c7888-11c8-4eb9-af23-d748c79bd567-kube-api-access-dnp9r\") pod \"network-metrics-daemon-4pcg6\" (UID: \"2e2c7888-11c8-4eb9-af23-d748c79bd567\") " pod="openshift-multus/network-metrics-daemon-4pcg6"
Apr 22 17:53:54.669406 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668727 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-host\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.669406 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668729 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-host-var-lib-cni-multus\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb"
Apr 22 17:53:54.669406 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668443 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-host-run-netns\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.669406 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668751 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-multus-cni-dir\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb"
Apr 22 17:53:54.669406 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668805 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-etc-kubernetes\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb"
Apr 22 17:53:54.669406 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668809 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-node-log\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.669406 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668808 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/afaf22ec-e8aa-448d-8007-91b393b66c96-multus-daemon-config\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb"
Apr 22 17:53:54.669406 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668858 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-multus-cni-dir\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb"
Apr 22 17:53:54.669406 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668878 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5e78c417-84fe-4436-91ea-459e917e9154-cnibin\") pod \"multus-additional-cni-plugins-q7h2l\" (UID: \"5e78c417-84fe-4436-91ea-459e917e9154\") " pod="openshift-multus/multus-additional-cni-plugins-q7h2l"
Apr 22 17:53:54.669406 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668917 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-host-slash\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.669406 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668925 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-host\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.669406 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668927 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-run-openvswitch\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.669406 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668964 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-run-openvswitch\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.670099 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.668933 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/78f93c71-12e7-43ba-9997-9357c2fcdd1c-host-slash\") pod \"iptables-alerter-9sxr8\" (UID: \"78f93c71-12e7-43ba-9997-9357c2fcdd1c\") " pod="openshift-network-operator/iptables-alerter-9sxr8"
Apr 22 17:53:54.670099 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.669010 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-log-socket\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.670099 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.669036 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-os-release\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb"
Apr 22 17:53:54.670099 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.669073 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/65197570-5734-4fef-aa1a-0880895dc55b-socket-dir\") pod \"aws-ebs-csi-driver-node-2d8fs\" (UID: \"65197570-5734-4fef-aa1a-0880895dc55b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2d8fs"
Apr 22 17:53:54.670099 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.669101 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/65197570-5734-4fef-aa1a-0880895dc55b-registration-dir\") pod \"aws-ebs-csi-driver-node-2d8fs\" (UID: \"65197570-5734-4fef-aa1a-0880895dc55b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2d8fs"
Apr 22 17:53:54.670099 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.669135 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-etc-kubernetes\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.670099 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.669154 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-log-socket\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.670099 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.669194 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/65197570-5734-4fef-aa1a-0880895dc55b-registration-dir\") pod \"aws-ebs-csi-driver-node-2d8fs\" (UID: \"65197570-5734-4fef-aa1a-0880895dc55b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2d8fs"
Apr 22 17:53:54.670099 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.669207 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-lib-modules\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.670099 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.669239 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-etc-kubernetes\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.670099 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.669250 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/65197570-5734-4fef-aa1a-0880895dc55b-socket-dir\") pod \"aws-ebs-csi-driver-node-2d8fs\" (UID: \"65197570-5734-4fef-aa1a-0880895dc55b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2d8fs"
Apr 22 17:53:54.670099 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.669298 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-var-lib-kubelet\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.670099 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.669356 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-lib-modules\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.670099 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.669411 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-host-kubelet\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.670099 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.669447 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dfnqh\" (UniqueName: \"kubernetes.io/projected/bea09b4b-5a52-4d36-968d-10c68f944bee-kube-api-access-dfnqh\") pod \"node-ca-zgpsv\" (UID: \"bea09b4b-5a52-4d36-968d-10c68f944bee\") " pod="openshift-image-registry/node-ca-zgpsv"
Apr 22 17:53:54.670099 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.669489 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-etc-sysconfig\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.670099 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.669514 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-etc-sysctl-conf\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.670784 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.669546 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5e78c417-84fe-4436-91ea-459e917e9154-os-release\") pod \"multus-additional-cni-plugins-q7h2l\" (UID: \"5e78c417-84fe-4436-91ea-459e917e9154\") " pod="openshift-multus/multus-additional-cni-plugins-q7h2l"
Apr 22 17:53:54.670784 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.669632 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-ovnkube-config\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.670784 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.669667 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hxm5\" (UniqueName: \"kubernetes.io/projected/5e78c417-84fe-4436-91ea-459e917e9154-kube-api-access-2hxm5\") pod \"multus-additional-cni-plugins-q7h2l\" (UID: \"5e78c417-84fe-4436-91ea-459e917e9154\") " pod="openshift-multus/multus-additional-cni-plugins-q7h2l"
Apr 22 17:53:54.670784 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.669691 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-host-kubelet\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.670784 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.669722 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-etc-sysctl-d\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.670784 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.669750 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5e78c417-84fe-4436-91ea-459e917e9154-system-cni-dir\") pod \"multus-additional-cni-plugins-q7h2l\" (UID: \"5e78c417-84fe-4436-91ea-459e917e9154\") " pod="openshift-multus/multus-additional-cni-plugins-q7h2l"
Apr 22 17:53:54.670784 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.669781 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-var-lib-kubelet\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.670784 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.669814 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5e78c417-84fe-4436-91ea-459e917e9154-os-release\") pod \"multus-additional-cni-plugins-q7h2l\" (UID: \"5e78c417-84fe-4436-91ea-459e917e9154\") " pod="openshift-multus/multus-additional-cni-plugins-q7h2l"
Apr 22 17:53:54.670784 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.669848 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5e78c417-84fe-4436-91ea-459e917e9154-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q7h2l\" (UID: \"5e78c417-84fe-4436-91ea-459e917e9154\") " pod="openshift-multus/multus-additional-cni-plugins-q7h2l"
Apr 22 17:53:54.670784 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.669907 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs\") pod \"network-metrics-daemon-4pcg6\" (UID: \"2e2c7888-11c8-4eb9-af23-d748c79bd567\") " pod="openshift-multus/network-metrics-daemon-4pcg6"
Apr 22 17:53:54.670784 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.669922 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5e78c417-84fe-4436-91ea-459e917e9154-system-cni-dir\") pod \"multus-additional-cni-plugins-q7h2l\" (UID: \"5e78c417-84fe-4436-91ea-459e917e9154\") " pod="openshift-multus/multus-additional-cni-plugins-q7h2l"
Apr 22 17:53:54.670784 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.669957 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-etc-sysctl-d\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.670784 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.669961 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/78f93c71-12e7-43ba-9997-9357c2fcdd1c-iptables-alerter-script\") pod \"iptables-alerter-9sxr8\" (UID: \"78f93c71-12e7-43ba-9997-9357c2fcdd1c\") " pod="openshift-network-operator/iptables-alerter-9sxr8"
Apr 22 17:53:54.670784 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.670004 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-ovnkube-script-lib\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.670784 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.670008 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-etc-sysconfig\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.670784 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.670078 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/22e8e337-8adf-4261-832e-41ba662e7747-konnectivity-ca\") pod \"konnectivity-agent-9z422\" (UID: \"22e8e337-8adf-4261-832e-41ba662e7747\") " pod="kube-system/konnectivity-agent-9z422"
Apr 22 17:53:54.670784 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.670134 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-etc-sysctl-conf\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.671560 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.670159 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/65197570-5734-4fef-aa1a-0880895dc55b-sys-fs\") pod \"aws-ebs-csi-driver-node-2d8fs\" (UID: \"65197570-5734-4fef-aa1a-0880895dc55b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2d8fs"
Apr 22 17:53:54.671560 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.670323 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-multus-socket-dir-parent\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb"
Apr 22 17:53:54.671560 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.670391 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-hostroot\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb"
Apr 22 17:53:54.671560 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.670411 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5e78c417-84fe-4436-91ea-459e917e9154-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q7h2l\" (UID: \"5e78c417-84fe-4436-91ea-459e917e9154\") " pod="openshift-multus/multus-additional-cni-plugins-q7h2l"
Apr 22 17:53:54.671560 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.670455 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-hostroot\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb"
Apr 22 17:53:54.671560 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.670454 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-systemd-units\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.671560 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.670496 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-systemd-units\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.671560 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.670514 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/65197570-5734-4fef-aa1a-0880895dc55b-device-dir\") pod \"aws-ebs-csi-driver-node-2d8fs\" (UID: \"65197570-5734-4fef-aa1a-0880895dc55b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2d8fs"
Apr 22 17:53:54.671560 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.670578 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-sys\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.671560 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.670609 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/65197570-5734-4fef-aa1a-0880895dc55b-sys-fs\") pod \"aws-ebs-csi-driver-node-2d8fs\" (UID: \"65197570-5734-4fef-aa1a-0880895dc55b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2d8fs"
Apr 22 17:53:54.671560 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.670614 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75rwz\" (UniqueName: \"kubernetes.io/projected/758f2892-7f0d-40ae-a65b-7d7ce661548b-kube-api-access-75rwz\") pod \"network-check-target-qbfw5\" (UID: \"758f2892-7f0d-40ae-a65b-7d7ce661548b\") " pod="openshift-network-diagnostics/network-check-target-qbfw5"
Apr 22 17:53:54.671560 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.670645 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-ovnkube-script-lib\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.671560 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.670661 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/65197570-5734-4fef-aa1a-0880895dc55b-device-dir\") pod \"aws-ebs-csi-driver-node-2d8fs\" (UID: \"65197570-5734-4fef-aa1a-0880895dc55b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2d8fs"
Apr 22 17:53:54.671560 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.670640 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/22e8e337-8adf-4261-832e-41ba662e7747-konnectivity-ca\") pod \"konnectivity-agent-9z422\" (UID: \"22e8e337-8adf-4261-832e-41ba662e7747\") " pod="kube-system/konnectivity-agent-9z422"
Apr 22 17:53:54.671560 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.670717 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-sys\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.671560 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.670721 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-host-var-lib-kubelet\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb"
Apr 22 17:53:54.671560 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.670731 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-multus-socket-dir-parent\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb"
Apr 22 17:53:54.672348 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.670855 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-run-systemd\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.672348 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.670858 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-host-var-lib-kubelet\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb"
Apr 22 17:53:54.672348 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.670888 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-host-run-ovn-kubernetes\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.672348 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.670933 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-run-systemd\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.672348 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.670935 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-host-run-ovn-kubernetes\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.672348 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.670964 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ksqf\" (UniqueName: \"kubernetes.io/projected/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-kube-api-access-7ksqf\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:53:54.672348 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.671001 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-run\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.672348 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.671034 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/22e8e337-8adf-4261-832e-41ba662e7747-agent-certs\") pod \"konnectivity-agent-9z422\" (UID: \"22e8e337-8adf-4261-832e-41ba662e7747\") " pod="kube-system/konnectivity-agent-9z422"
Apr 22 17:53:54.672348 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.671081 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-run\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.672348 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.671182 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-host-run-k8s-cni-cncf-io\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb"
Apr 22 17:53:54.672348 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.671275 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/afaf22ec-e8aa-448d-8007-91b393b66c96-host-run-k8s-cni-cncf-io\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb"
Apr 22 17:53:54.672348 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.671408 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/78f93c71-12e7-43ba-9997-9357c2fcdd1c-iptables-alerter-script\") pod \"iptables-alerter-9sxr8\" (UID: \"78f93c71-12e7-43ba-9997-9357c2fcdd1c\") " pod="openshift-network-operator/iptables-alerter-9sxr8"
Apr 22 17:53:54.672986 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.672965 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-etc-tuned\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.673213 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.673191 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-tmp\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4"
Apr 22 17:53:54.673412 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.673392 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-ovn-node-metrics-cert\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" Apr 22 17:53:54.674617 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.674595 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/22e8e337-8adf-4261-832e-41ba662e7747-agent-certs\") pod \"konnectivity-agent-9z422\" (UID: \"22e8e337-8adf-4261-832e-41ba662e7747\") " pod="kube-system/konnectivity-agent-9z422" Apr 22 17:53:54.678552 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:54.677796 2527 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:53:54.678552 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:54.677820 2527 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:53:54.678552 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:54.677833 2527 projected.go:194] Error preparing data for projected volume kube-api-access-75rwz for pod openshift-network-diagnostics/network-check-target-qbfw5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:54.678552 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:54.677980 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/758f2892-7f0d-40ae-a65b-7d7ce661548b-kube-api-access-75rwz podName:758f2892-7f0d-40ae-a65b-7d7ce661548b nodeName:}" failed. No retries permitted until 2026-04-22 17:53:55.17790831 +0000 UTC m=+3.141197358 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-75rwz" (UniqueName: "kubernetes.io/projected/758f2892-7f0d-40ae-a65b-7d7ce661548b-kube-api-access-75rwz") pod "network-check-target-qbfw5" (UID: "758f2892-7f0d-40ae-a65b-7d7ce661548b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:54.678552 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.678362 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc9dd\" (UniqueName: \"kubernetes.io/projected/afaf22ec-e8aa-448d-8007-91b393b66c96-kube-api-access-pc9dd\") pod \"multus-hv7bb\" (UID: \"afaf22ec-e8aa-448d-8007-91b393b66c96\") " pod="openshift-multus/multus-hv7bb" Apr 22 17:53:54.678868 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.678610 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjqrg\" (UniqueName: \"kubernetes.io/projected/78f93c71-12e7-43ba-9997-9357c2fcdd1c-kube-api-access-vjqrg\") pod \"iptables-alerter-9sxr8\" (UID: \"78f93c71-12e7-43ba-9997-9357c2fcdd1c\") " pod="openshift-network-operator/iptables-alerter-9sxr8" Apr 22 17:53:54.679049 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.679025 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfnqh\" (UniqueName: \"kubernetes.io/projected/bea09b4b-5a52-4d36-968d-10c68f944bee-kube-api-access-dfnqh\") pod \"node-ca-zgpsv\" (UID: \"bea09b4b-5a52-4d36-968d-10c68f944bee\") " pod="openshift-image-registry/node-ca-zgpsv" Apr 22 17:53:54.679617 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.679557 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8872\" (UniqueName: \"kubernetes.io/projected/65197570-5734-4fef-aa1a-0880895dc55b-kube-api-access-z8872\") pod \"aws-ebs-csi-driver-node-2d8fs\" (UID: \"65197570-5734-4fef-aa1a-0880895dc55b\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2d8fs" Apr 22 17:53:54.680119 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.680084 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6db8q\" (UniqueName: \"kubernetes.io/projected/720f417e-e0c4-40a9-a38a-c8e1907dbe5b-kube-api-access-6db8q\") pod \"tuned-k5st4\" (UID: \"720f417e-e0c4-40a9-a38a-c8e1907dbe5b\") " pod="openshift-cluster-node-tuning-operator/tuned-k5st4" Apr 22 17:53:54.680842 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.680812 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hxm5\" (UniqueName: \"kubernetes.io/projected/5e78c417-84fe-4436-91ea-459e917e9154-kube-api-access-2hxm5\") pod \"multus-additional-cni-plugins-q7h2l\" (UID: \"5e78c417-84fe-4436-91ea-459e917e9154\") " pod="openshift-multus/multus-additional-cni-plugins-q7h2l" Apr 22 17:53:54.681953 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.681935 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ksqf\" (UniqueName: \"kubernetes.io/projected/b7681f6e-b80a-4387-bdf9-03efc8cdbd1c-kube-api-access-7ksqf\") pod \"ovnkube-node-2hklb\" (UID: \"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" Apr 22 17:53:54.689094 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.689074 2527 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:53:54.698185 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.698127 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-21.ec2.internal" event={"ID":"7c86caefdc1dda70cffe5878d33236b6","Type":"ContainerStarted","Data":"5dae5996f14fc22c0331b919250b16284f17b386fd4102deee9ce25766e69030"} Apr 22 17:53:54.699298 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.699254 2527 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-21.ec2.internal" event={"ID":"18cf1a3f2e2dc1df47f20b03f0bf7e15","Type":"ContainerStarted","Data":"05edf595540557e9211f993da82e752dbb3824797a784b8373c10fbbc32a5754"} Apr 22 17:53:54.771535 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.771500 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnp9r\" (UniqueName: \"kubernetes.io/projected/2e2c7888-11c8-4eb9-af23-d748c79bd567-kube-api-access-dnp9r\") pod \"network-metrics-daemon-4pcg6\" (UID: \"2e2c7888-11c8-4eb9-af23-d748c79bd567\") " pod="openshift-multus/network-metrics-daemon-4pcg6" Apr 22 17:53:54.771727 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.771588 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs\") pod \"network-metrics-daemon-4pcg6\" (UID: \"2e2c7888-11c8-4eb9-af23-d748c79bd567\") " pod="openshift-multus/network-metrics-daemon-4pcg6" Apr 22 17:53:54.771727 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:54.771708 2527 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:54.771830 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:54.771767 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs podName:2e2c7888-11c8-4eb9-af23-d748c79bd567 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:55.271750583 +0000 UTC m=+3.235039644 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs") pod "network-metrics-daemon-4pcg6" (UID: "2e2c7888-11c8-4eb9-af23-d748c79bd567") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:54.782807 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.782733 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnp9r\" (UniqueName: \"kubernetes.io/projected/2e2c7888-11c8-4eb9-af23-d748c79bd567-kube-api-access-dnp9r\") pod \"network-metrics-daemon-4pcg6\" (UID: \"2e2c7888-11c8-4eb9-af23-d748c79bd567\") " pod="openshift-multus/network-metrics-daemon-4pcg6" Apr 22 17:53:54.850687 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.850656 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-9sxr8" Apr 22 17:53:54.858546 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.858517 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" Apr 22 17:53:54.868307 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.868287 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9z422" Apr 22 17:53:54.876677 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.876653 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2d8fs" Apr 22 17:53:54.884207 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.884188 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zgpsv" Apr 22 17:53:54.892818 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.892799 2527 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-hv7bb" Apr 22 17:53:54.898358 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.898339 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-k5st4" Apr 22 17:53:54.903852 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:54.903833 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-q7h2l" Apr 22 17:53:55.275271 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:55.275229 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs\") pod \"network-metrics-daemon-4pcg6\" (UID: \"2e2c7888-11c8-4eb9-af23-d748c79bd567\") " pod="openshift-multus/network-metrics-daemon-4pcg6" Apr 22 17:53:55.275459 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:55.275285 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75rwz\" (UniqueName: \"kubernetes.io/projected/758f2892-7f0d-40ae-a65b-7d7ce661548b-kube-api-access-75rwz\") pod \"network-check-target-qbfw5\" (UID: \"758f2892-7f0d-40ae-a65b-7d7ce661548b\") " pod="openshift-network-diagnostics/network-check-target-qbfw5" Apr 22 17:53:55.275459 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:55.275417 2527 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:55.275459 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:55.275430 2527 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:53:55.275459 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:55.275445 2527 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:53:55.275459 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:55.275455 2527 projected.go:194] Error preparing data for projected volume kube-api-access-75rwz for pod openshift-network-diagnostics/network-check-target-qbfw5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:55.275690 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:55.275505 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs podName:2e2c7888-11c8-4eb9-af23-d748c79bd567 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:56.275481282 +0000 UTC m=+4.238770348 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs") pod "network-metrics-daemon-4pcg6" (UID: "2e2c7888-11c8-4eb9-af23-d748c79bd567") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:55.275690 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:55.275526 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/758f2892-7f0d-40ae-a65b-7d7ce661548b-kube-api-access-75rwz podName:758f2892-7f0d-40ae-a65b-7d7ce661548b nodeName:}" failed. No retries permitted until 2026-04-22 17:53:56.275516022 +0000 UTC m=+4.238805076 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-75rwz" (UniqueName: "kubernetes.io/projected/758f2892-7f0d-40ae-a65b-7d7ce661548b-kube-api-access-75rwz") pod "network-check-target-qbfw5" (UID: "758f2892-7f0d-40ae-a65b-7d7ce661548b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:55.412015 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:55.411901 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7681f6e_b80a_4387_bdf9_03efc8cdbd1c.slice/crio-03a848c2348b231bb0bf5afeef6aea719692495372f8065931214cde1d1c011b WatchSource:0}: Error finding container 03a848c2348b231bb0bf5afeef6aea719692495372f8065931214cde1d1c011b: Status 404 returned error can't find the container with id 03a848c2348b231bb0bf5afeef6aea719692495372f8065931214cde1d1c011b Apr 22 17:53:55.412812 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:55.412788 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbea09b4b_5a52_4d36_968d_10c68f944bee.slice/crio-cb96c57cff1b964c2ebb5dd2d1774c506d4df18f218db4559c55230f54a268ec WatchSource:0}: Error finding container cb96c57cff1b964c2ebb5dd2d1774c506d4df18f218db4559c55230f54a268ec: Status 404 returned error can't find the container with id cb96c57cff1b964c2ebb5dd2d1774c506d4df18f218db4559c55230f54a268ec Apr 22 17:53:55.413607 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:55.413579 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65197570_5734_4fef_aa1a_0880895dc55b.slice/crio-83f959b8954dffb85c0c6078e9848677498a1fc5a3672c32b872d874d3b4bd58 WatchSource:0}: Error finding container 83f959b8954dffb85c0c6078e9848677498a1fc5a3672c32b872d874d3b4bd58: Status 404 returned error can't find the 
container with id 83f959b8954dffb85c0c6078e9848677498a1fc5a3672c32b872d874d3b4bd58 Apr 22 17:53:55.419754 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:55.419724 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78f93c71_12e7_43ba_9997_9357c2fcdd1c.slice/crio-3610ab0fec224269a675f6b7353532e145e440dd969ff868925ba61ed68b2ddc WatchSource:0}: Error finding container 3610ab0fec224269a675f6b7353532e145e440dd969ff868925ba61ed68b2ddc: Status 404 returned error can't find the container with id 3610ab0fec224269a675f6b7353532e145e440dd969ff868925ba61ed68b2ddc Apr 22 17:53:55.421762 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:55.421730 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e78c417_84fe_4436_91ea_459e917e9154.slice/crio-eafd0e7593fc9432cd18fa074f28b91f519a5c6de32f47a4767def516f52e56e WatchSource:0}: Error finding container eafd0e7593fc9432cd18fa074f28b91f519a5c6de32f47a4767def516f52e56e: Status 404 returned error can't find the container with id eafd0e7593fc9432cd18fa074f28b91f519a5c6de32f47a4767def516f52e56e Apr 22 17:53:55.422220 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:55.422175 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafaf22ec_e8aa_448d_8007_91b393b66c96.slice/crio-5f52d1d7bfa88748b650311476e7cea790667d187f8f4bcb2dec009b8105d217 WatchSource:0}: Error finding container 5f52d1d7bfa88748b650311476e7cea790667d187f8f4bcb2dec009b8105d217: Status 404 returned error can't find the container with id 5f52d1d7bfa88748b650311476e7cea790667d187f8f4bcb2dec009b8105d217 Apr 22 17:53:55.465525 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:55.465496 2527 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-qqd9n"] Apr 22 17:53:55.467687 ip-10-0-135-21 kubenswrapper[2527]: I0422 
17:53:55.467665 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qqd9n" Apr 22 17:53:55.470151 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:55.470130 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 17:53:55.470282 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:55.470176 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 17:53:55.470348 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:55.470299 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-2cspk\"" Apr 22 17:53:55.577925 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:55.577831 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/71ff672a-e8ff-4269-8be1-4e451a7046df-tmp-dir\") pod \"node-resolver-qqd9n\" (UID: \"71ff672a-e8ff-4269-8be1-4e451a7046df\") " pod="openshift-dns/node-resolver-qqd9n" Apr 22 17:53:55.577925 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:55.577918 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/71ff672a-e8ff-4269-8be1-4e451a7046df-hosts-file\") pod \"node-resolver-qqd9n\" (UID: \"71ff672a-e8ff-4269-8be1-4e451a7046df\") " pod="openshift-dns/node-resolver-qqd9n" Apr 22 17:53:55.578308 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:55.577953 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tv8j\" (UniqueName: \"kubernetes.io/projected/71ff672a-e8ff-4269-8be1-4e451a7046df-kube-api-access-6tv8j\") pod \"node-resolver-qqd9n\" (UID: \"71ff672a-e8ff-4269-8be1-4e451a7046df\") " pod="openshift-dns/node-resolver-qqd9n" Apr 22 
17:53:55.589873 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:55.589841 2527 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:48:53 +0000 UTC" deadline="2027-09-19 01:05:28.222256353 +0000 UTC" Apr 22 17:53:55.589873 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:55.589866 2527 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12343h11m32.632392757s" Apr 22 17:53:55.678398 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:55.678368 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/71ff672a-e8ff-4269-8be1-4e451a7046df-hosts-file\") pod \"node-resolver-qqd9n\" (UID: \"71ff672a-e8ff-4269-8be1-4e451a7046df\") " pod="openshift-dns/node-resolver-qqd9n" Apr 22 17:53:55.678544 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:55.678412 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tv8j\" (UniqueName: \"kubernetes.io/projected/71ff672a-e8ff-4269-8be1-4e451a7046df-kube-api-access-6tv8j\") pod \"node-resolver-qqd9n\" (UID: \"71ff672a-e8ff-4269-8be1-4e451a7046df\") " pod="openshift-dns/node-resolver-qqd9n" Apr 22 17:53:55.678544 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:55.678455 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/71ff672a-e8ff-4269-8be1-4e451a7046df-tmp-dir\") pod \"node-resolver-qqd9n\" (UID: \"71ff672a-e8ff-4269-8be1-4e451a7046df\") " pod="openshift-dns/node-resolver-qqd9n" Apr 22 17:53:55.678544 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:55.678500 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/71ff672a-e8ff-4269-8be1-4e451a7046df-hosts-file\") pod \"node-resolver-qqd9n\" (UID: 
\"71ff672a-e8ff-4269-8be1-4e451a7046df\") " pod="openshift-dns/node-resolver-qqd9n" Apr 22 17:53:55.678857 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:55.678837 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/71ff672a-e8ff-4269-8be1-4e451a7046df-tmp-dir\") pod \"node-resolver-qqd9n\" (UID: \"71ff672a-e8ff-4269-8be1-4e451a7046df\") " pod="openshift-dns/node-resolver-qqd9n" Apr 22 17:53:55.689029 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:55.689002 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tv8j\" (UniqueName: \"kubernetes.io/projected/71ff672a-e8ff-4269-8be1-4e451a7046df-kube-api-access-6tv8j\") pod \"node-resolver-qqd9n\" (UID: \"71ff672a-e8ff-4269-8be1-4e451a7046df\") " pod="openshift-dns/node-resolver-qqd9n" Apr 22 17:53:55.702453 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:55.702427 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-21.ec2.internal" event={"ID":"7c86caefdc1dda70cffe5878d33236b6","Type":"ContainerStarted","Data":"9c757230860b3e7dee6650ffefe4b55748562ba08702a4e3ea6b7455ebd0cb0b"} Apr 22 17:53:55.704283 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:55.704255 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hv7bb" event={"ID":"afaf22ec-e8aa-448d-8007-91b393b66c96","Type":"ContainerStarted","Data":"5f52d1d7bfa88748b650311476e7cea790667d187f8f4bcb2dec009b8105d217"} Apr 22 17:53:55.705228 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:55.705210 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q7h2l" event={"ID":"5e78c417-84fe-4436-91ea-459e917e9154","Type":"ContainerStarted","Data":"eafd0e7593fc9432cd18fa074f28b91f519a5c6de32f47a4767def516f52e56e"} Apr 22 17:53:55.706195 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:55.706174 2527 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-cluster-node-tuning-operator/tuned-k5st4" event={"ID":"720f417e-e0c4-40a9-a38a-c8e1907dbe5b","Type":"ContainerStarted","Data":"5336ff1b81048a9232b277832411f81974d36700ca25253ca3f40ec75256657b"} Apr 22 17:53:55.707038 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:55.707022 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2d8fs" event={"ID":"65197570-5734-4fef-aa1a-0880895dc55b","Type":"ContainerStarted","Data":"83f959b8954dffb85c0c6078e9848677498a1fc5a3672c32b872d874d3b4bd58"} Apr 22 17:53:55.707985 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:55.707964 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zgpsv" event={"ID":"bea09b4b-5a52-4d36-968d-10c68f944bee","Type":"ContainerStarted","Data":"cb96c57cff1b964c2ebb5dd2d1774c506d4df18f218db4559c55230f54a268ec"} Apr 22 17:53:55.711328 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:55.711309 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" event={"ID":"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c","Type":"ContainerStarted","Data":"03a848c2348b231bb0bf5afeef6aea719692495372f8065931214cde1d1c011b"} Apr 22 17:53:55.713827 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:55.713800 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9z422" event={"ID":"22e8e337-8adf-4261-832e-41ba662e7747","Type":"ContainerStarted","Data":"8d35222c21d5f0320067ad46fe98cd861e5b3dee2e85e5f3bd243fba3dfd6069"} Apr 22 17:53:55.717207 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:55.717182 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-9sxr8" event={"ID":"78f93c71-12e7-43ba-9997-9357c2fcdd1c","Type":"ContainerStarted","Data":"3610ab0fec224269a675f6b7353532e145e440dd969ff868925ba61ed68b2ddc"} Apr 22 17:53:55.717283 ip-10-0-135-21 kubenswrapper[2527]: I0422 
17:53:55.717214 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-21.ec2.internal" podStartSLOduration=2.717203084 podStartE2EDuration="2.717203084s" podCreationTimestamp="2026-04-22 17:53:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:53:55.716807101 +0000 UTC m=+3.680096175" watchObservedRunningTime="2026-04-22 17:53:55.717203084 +0000 UTC m=+3.680492154" Apr 22 17:53:55.790437 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:55.790196 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qqd9n" Apr 22 17:53:55.799066 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:53:55.799030 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71ff672a_e8ff_4269_8be1_4e451a7046df.slice/crio-1170b732c836a6f8eea8480f85694ca21badf75cde270b7c29bb47c02c4274e6 WatchSource:0}: Error finding container 1170b732c836a6f8eea8480f85694ca21badf75cde270b7c29bb47c02c4274e6: Status 404 returned error can't find the container with id 1170b732c836a6f8eea8480f85694ca21badf75cde270b7c29bb47c02c4274e6 Apr 22 17:53:56.283929 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:56.283897 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs\") pod \"network-metrics-daemon-4pcg6\" (UID: \"2e2c7888-11c8-4eb9-af23-d748c79bd567\") " pod="openshift-multus/network-metrics-daemon-4pcg6" Apr 22 17:53:56.284088 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:56.283952 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75rwz\" (UniqueName: \"kubernetes.io/projected/758f2892-7f0d-40ae-a65b-7d7ce661548b-kube-api-access-75rwz\") pod 
\"network-check-target-qbfw5\" (UID: \"758f2892-7f0d-40ae-a65b-7d7ce661548b\") " pod="openshift-network-diagnostics/network-check-target-qbfw5" Apr 22 17:53:56.284148 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:56.284093 2527 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:53:56.284148 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:56.284113 2527 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:53:56.284148 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:56.284126 2527 projected.go:194] Error preparing data for projected volume kube-api-access-75rwz for pod openshift-network-diagnostics/network-check-target-qbfw5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:56.284293 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:56.284181 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/758f2892-7f0d-40ae-a65b-7d7ce661548b-kube-api-access-75rwz podName:758f2892-7f0d-40ae-a65b-7d7ce661548b nodeName:}" failed. No retries permitted until 2026-04-22 17:53:58.284161205 +0000 UTC m=+6.247450260 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-75rwz" (UniqueName: "kubernetes.io/projected/758f2892-7f0d-40ae-a65b-7d7ce661548b-kube-api-access-75rwz") pod "network-check-target-qbfw5" (UID: "758f2892-7f0d-40ae-a65b-7d7ce661548b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:56.284623 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:56.284603 2527 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:56.284701 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:56.284666 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs podName:2e2c7888-11c8-4eb9-af23-d748c79bd567 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:58.284648574 +0000 UTC m=+6.247937625 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs") pod "network-metrics-daemon-4pcg6" (UID: "2e2c7888-11c8-4eb9-af23-d748c79bd567") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:56.694755 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:56.694724 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbfw5" Apr 22 17:53:56.695208 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:56.694726 2527 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4pcg6" Apr 22 17:53:56.695208 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:56.694851 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbfw5" podUID="758f2892-7f0d-40ae-a65b-7d7ce661548b" Apr 22 17:53:56.695208 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:56.694941 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4pcg6" podUID="2e2c7888-11c8-4eb9-af23-d748c79bd567" Apr 22 17:53:56.733609 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:56.733550 2527 generic.go:358] "Generic (PLEG): container finished" podID="18cf1a3f2e2dc1df47f20b03f0bf7e15" containerID="db834e7c8acaaa7f164f47e1ada677da6fc102b344b61394868a96ebfc4345a9" exitCode=0 Apr 22 17:53:56.733766 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:56.733661 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-21.ec2.internal" event={"ID":"18cf1a3f2e2dc1df47f20b03f0bf7e15","Type":"ContainerDied","Data":"db834e7c8acaaa7f164f47e1ada677da6fc102b344b61394868a96ebfc4345a9"} Apr 22 17:53:56.747053 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:56.747024 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qqd9n" 
event={"ID":"71ff672a-e8ff-4269-8be1-4e451a7046df","Type":"ContainerStarted","Data":"1170b732c836a6f8eea8480f85694ca21badf75cde270b7c29bb47c02c4274e6"} Apr 22 17:53:57.761704 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:57.761665 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-21.ec2.internal" event={"ID":"18cf1a3f2e2dc1df47f20b03f0bf7e15","Type":"ContainerStarted","Data":"6522cd9ceee2a887a504415db3d247f438081349d44a493fc59c8018882e0e4d"} Apr 22 17:53:57.776299 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:57.776243 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-21.ec2.internal" podStartSLOduration=4.77622113 podStartE2EDuration="4.77622113s" podCreationTimestamp="2026-04-22 17:53:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:53:57.775708737 +0000 UTC m=+5.738997808" watchObservedRunningTime="2026-04-22 17:53:57.77622113 +0000 UTC m=+5.739510200" Apr 22 17:53:57.948721 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:57.948636 2527 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-rh7h6"] Apr 22 17:53:57.950810 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:57.950787 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rh7h6" Apr 22 17:53:57.950926 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:57.950853 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rh7h6" podUID="4424ac43-8016-4c60-8b3b-e1b48a277a63" Apr 22 17:53:58.000303 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:58.000268 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4424ac43-8016-4c60-8b3b-e1b48a277a63-original-pull-secret\") pod \"global-pull-secret-syncer-rh7h6\" (UID: \"4424ac43-8016-4c60-8b3b-e1b48a277a63\") " pod="kube-system/global-pull-secret-syncer-rh7h6" Apr 22 17:53:58.000471 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:58.000314 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4424ac43-8016-4c60-8b3b-e1b48a277a63-dbus\") pod \"global-pull-secret-syncer-rh7h6\" (UID: \"4424ac43-8016-4c60-8b3b-e1b48a277a63\") " pod="kube-system/global-pull-secret-syncer-rh7h6" Apr 22 17:53:58.000471 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:58.000342 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4424ac43-8016-4c60-8b3b-e1b48a277a63-kubelet-config\") pod \"global-pull-secret-syncer-rh7h6\" (UID: \"4424ac43-8016-4c60-8b3b-e1b48a277a63\") " pod="kube-system/global-pull-secret-syncer-rh7h6" Apr 22 17:53:58.100889 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:58.100853 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4424ac43-8016-4c60-8b3b-e1b48a277a63-original-pull-secret\") pod \"global-pull-secret-syncer-rh7h6\" (UID: \"4424ac43-8016-4c60-8b3b-e1b48a277a63\") " pod="kube-system/global-pull-secret-syncer-rh7h6" Apr 22 17:53:58.101059 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:58.100903 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/4424ac43-8016-4c60-8b3b-e1b48a277a63-dbus\") pod \"global-pull-secret-syncer-rh7h6\" (UID: \"4424ac43-8016-4c60-8b3b-e1b48a277a63\") " pod="kube-system/global-pull-secret-syncer-rh7h6" Apr 22 17:53:58.101059 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:58.100931 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4424ac43-8016-4c60-8b3b-e1b48a277a63-kubelet-config\") pod \"global-pull-secret-syncer-rh7h6\" (UID: \"4424ac43-8016-4c60-8b3b-e1b48a277a63\") " pod="kube-system/global-pull-secret-syncer-rh7h6" Apr 22 17:53:58.101059 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:58.101019 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4424ac43-8016-4c60-8b3b-e1b48a277a63-kubelet-config\") pod \"global-pull-secret-syncer-rh7h6\" (UID: \"4424ac43-8016-4c60-8b3b-e1b48a277a63\") " pod="kube-system/global-pull-secret-syncer-rh7h6" Apr 22 17:53:58.101221 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:58.101118 2527 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:53:58.101221 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:58.101176 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4424ac43-8016-4c60-8b3b-e1b48a277a63-original-pull-secret podName:4424ac43-8016-4c60-8b3b-e1b48a277a63 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:58.601157229 +0000 UTC m=+6.564446280 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4424ac43-8016-4c60-8b3b-e1b48a277a63-original-pull-secret") pod "global-pull-secret-syncer-rh7h6" (UID: "4424ac43-8016-4c60-8b3b-e1b48a277a63") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:53:58.101615 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:58.101591 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4424ac43-8016-4c60-8b3b-e1b48a277a63-dbus\") pod \"global-pull-secret-syncer-rh7h6\" (UID: \"4424ac43-8016-4c60-8b3b-e1b48a277a63\") " pod="kube-system/global-pull-secret-syncer-rh7h6" Apr 22 17:53:58.303379 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:58.303343 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75rwz\" (UniqueName: \"kubernetes.io/projected/758f2892-7f0d-40ae-a65b-7d7ce661548b-kube-api-access-75rwz\") pod \"network-check-target-qbfw5\" (UID: \"758f2892-7f0d-40ae-a65b-7d7ce661548b\") " pod="openshift-network-diagnostics/network-check-target-qbfw5" Apr 22 17:53:58.303549 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:58.303438 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs\") pod \"network-metrics-daemon-4pcg6\" (UID: \"2e2c7888-11c8-4eb9-af23-d748c79bd567\") " pod="openshift-multus/network-metrics-daemon-4pcg6" Apr 22 17:53:58.303647 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:58.303554 2527 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:58.303647 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:58.303627 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs 
podName:2e2c7888-11c8-4eb9-af23-d748c79bd567 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:02.303608486 +0000 UTC m=+10.266897537 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs") pod "network-metrics-daemon-4pcg6" (UID: "2e2c7888-11c8-4eb9-af23-d748c79bd567") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:58.303755 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:58.303711 2527 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:53:58.303755 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:58.303725 2527 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:53:58.303755 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:58.303738 2527 projected.go:194] Error preparing data for projected volume kube-api-access-75rwz for pod openshift-network-diagnostics/network-check-target-qbfw5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:58.303898 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:58.303770 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/758f2892-7f0d-40ae-a65b-7d7ce661548b-kube-api-access-75rwz podName:758f2892-7f0d-40ae-a65b-7d7ce661548b nodeName:}" failed. No retries permitted until 2026-04-22 17:54:02.303759243 +0000 UTC m=+10.267048298 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-75rwz" (UniqueName: "kubernetes.io/projected/758f2892-7f0d-40ae-a65b-7d7ce661548b-kube-api-access-75rwz") pod "network-check-target-qbfw5" (UID: "758f2892-7f0d-40ae-a65b-7d7ce661548b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:58.606556 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:58.606404 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4424ac43-8016-4c60-8b3b-e1b48a277a63-original-pull-secret\") pod \"global-pull-secret-syncer-rh7h6\" (UID: \"4424ac43-8016-4c60-8b3b-e1b48a277a63\") " pod="kube-system/global-pull-secret-syncer-rh7h6" Apr 22 17:53:58.606556 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:58.606532 2527 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:53:58.606775 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:58.606614 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4424ac43-8016-4c60-8b3b-e1b48a277a63-original-pull-secret podName:4424ac43-8016-4c60-8b3b-e1b48a277a63 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:59.606594947 +0000 UTC m=+7.569884004 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4424ac43-8016-4c60-8b3b-e1b48a277a63-original-pull-secret") pod "global-pull-secret-syncer-rh7h6" (UID: "4424ac43-8016-4c60-8b3b-e1b48a277a63") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:53:58.693733 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:58.693620 2527 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbfw5" Apr 22 17:53:58.693910 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:58.693754 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbfw5" podUID="758f2892-7f0d-40ae-a65b-7d7ce661548b" Apr 22 17:53:58.694064 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:58.694043 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4pcg6" Apr 22 17:53:58.694198 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:58.694171 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4pcg6" podUID="2e2c7888-11c8-4eb9-af23-d748c79bd567" Apr 22 17:53:59.615897 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:59.615850 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4424ac43-8016-4c60-8b3b-e1b48a277a63-original-pull-secret\") pod \"global-pull-secret-syncer-rh7h6\" (UID: \"4424ac43-8016-4c60-8b3b-e1b48a277a63\") " pod="kube-system/global-pull-secret-syncer-rh7h6" Apr 22 17:53:59.616334 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:59.615998 2527 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:53:59.616334 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:59.616069 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4424ac43-8016-4c60-8b3b-e1b48a277a63-original-pull-secret podName:4424ac43-8016-4c60-8b3b-e1b48a277a63 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:01.616050167 +0000 UTC m=+9.579339220 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4424ac43-8016-4c60-8b3b-e1b48a277a63-original-pull-secret") pod "global-pull-secret-syncer-rh7h6" (UID: "4424ac43-8016-4c60-8b3b-e1b48a277a63") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:53:59.694283 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:53:59.693765 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rh7h6" Apr 22 17:53:59.694283 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:53:59.693915 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rh7h6" podUID="4424ac43-8016-4c60-8b3b-e1b48a277a63" Apr 22 17:54:00.694200 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:00.694164 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4pcg6" Apr 22 17:54:00.694676 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:00.694298 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4pcg6" podUID="2e2c7888-11c8-4eb9-af23-d748c79bd567" Apr 22 17:54:00.694776 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:00.694740 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbfw5" Apr 22 17:54:00.694871 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:00.694854 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qbfw5" podUID="758f2892-7f0d-40ae-a65b-7d7ce661548b" Apr 22 17:54:01.632799 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:01.632238 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4424ac43-8016-4c60-8b3b-e1b48a277a63-original-pull-secret\") pod \"global-pull-secret-syncer-rh7h6\" (UID: \"4424ac43-8016-4c60-8b3b-e1b48a277a63\") " pod="kube-system/global-pull-secret-syncer-rh7h6" Apr 22 17:54:01.632799 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:01.632374 2527 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:54:01.632799 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:01.632438 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4424ac43-8016-4c60-8b3b-e1b48a277a63-original-pull-secret podName:4424ac43-8016-4c60-8b3b-e1b48a277a63 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:05.632421461 +0000 UTC m=+13.595710515 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4424ac43-8016-4c60-8b3b-e1b48a277a63-original-pull-secret") pod "global-pull-secret-syncer-rh7h6" (UID: "4424ac43-8016-4c60-8b3b-e1b48a277a63") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:54:01.694421 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:01.694385 2527 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rh7h6" Apr 22 17:54:01.694887 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:01.694527 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rh7h6" podUID="4424ac43-8016-4c60-8b3b-e1b48a277a63" Apr 22 17:54:02.338126 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:02.338059 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs\") pod \"network-metrics-daemon-4pcg6\" (UID: \"2e2c7888-11c8-4eb9-af23-d748c79bd567\") " pod="openshift-multus/network-metrics-daemon-4pcg6" Apr 22 17:54:02.338126 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:02.338113 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75rwz\" (UniqueName: \"kubernetes.io/projected/758f2892-7f0d-40ae-a65b-7d7ce661548b-kube-api-access-75rwz\") pod \"network-check-target-qbfw5\" (UID: \"758f2892-7f0d-40ae-a65b-7d7ce661548b\") " pod="openshift-network-diagnostics/network-check-target-qbfw5" Apr 22 17:54:02.338400 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:02.338235 2527 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:54:02.338400 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:02.338227 2527 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:54:02.338400 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:02.338255 2527 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:54:02.338400 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:02.338268 2527 projected.go:194] Error preparing data for projected volume kube-api-access-75rwz for pod openshift-network-diagnostics/network-check-target-qbfw5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:54:02.338400 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:02.338314 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs podName:2e2c7888-11c8-4eb9-af23-d748c79bd567 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:10.338292542 +0000 UTC m=+18.301581606 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs") pod "network-metrics-daemon-4pcg6" (UID: "2e2c7888-11c8-4eb9-af23-d748c79bd567") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:54:02.338400 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:02.338335 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/758f2892-7f0d-40ae-a65b-7d7ce661548b-kube-api-access-75rwz podName:758f2892-7f0d-40ae-a65b-7d7ce661548b nodeName:}" failed. No retries permitted until 2026-04-22 17:54:10.338323655 +0000 UTC m=+18.301612707 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-75rwz" (UniqueName: "kubernetes.io/projected/758f2892-7f0d-40ae-a65b-7d7ce661548b-kube-api-access-75rwz") pod "network-check-target-qbfw5" (UID: "758f2892-7f0d-40ae-a65b-7d7ce661548b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:54:02.695467 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:02.695388 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4pcg6" Apr 22 17:54:02.696038 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:02.695533 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4pcg6" podUID="2e2c7888-11c8-4eb9-af23-d748c79bd567" Apr 22 17:54:02.696038 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:02.695618 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbfw5" Apr 22 17:54:02.696038 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:02.695737 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbfw5" podUID="758f2892-7f0d-40ae-a65b-7d7ce661548b" Apr 22 17:54:03.694088 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:03.694053 2527 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rh7h6" Apr 22 17:54:03.694267 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:03.694178 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rh7h6" podUID="4424ac43-8016-4c60-8b3b-e1b48a277a63" Apr 22 17:54:04.694008 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:04.693764 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4pcg6" Apr 22 17:54:04.694008 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:04.693825 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbfw5" Apr 22 17:54:04.694008 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:04.693932 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4pcg6" podUID="2e2c7888-11c8-4eb9-af23-d748c79bd567" Apr 22 17:54:04.694512 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:04.694058 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qbfw5" podUID="758f2892-7f0d-40ae-a65b-7d7ce661548b" Apr 22 17:54:05.668066 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:05.668028 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4424ac43-8016-4c60-8b3b-e1b48a277a63-original-pull-secret\") pod \"global-pull-secret-syncer-rh7h6\" (UID: \"4424ac43-8016-4c60-8b3b-e1b48a277a63\") " pod="kube-system/global-pull-secret-syncer-rh7h6" Apr 22 17:54:05.668274 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:05.668153 2527 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:54:05.668274 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:05.668224 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4424ac43-8016-4c60-8b3b-e1b48a277a63-original-pull-secret podName:4424ac43-8016-4c60-8b3b-e1b48a277a63 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:13.668205258 +0000 UTC m=+21.631494309 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4424ac43-8016-4c60-8b3b-e1b48a277a63-original-pull-secret") pod "global-pull-secret-syncer-rh7h6" (UID: "4424ac43-8016-4c60-8b3b-e1b48a277a63") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:54:05.693725 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:05.693689 2527 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rh7h6"
Apr 22 17:54:05.693889 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:05.693822 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rh7h6" podUID="4424ac43-8016-4c60-8b3b-e1b48a277a63"
Apr 22 17:54:06.693713 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:06.693677 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4pcg6"
Apr 22 17:54:06.694181 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:06.693676 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbfw5"
Apr 22 17:54:06.694181 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:06.693821 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4pcg6" podUID="2e2c7888-11c8-4eb9-af23-d748c79bd567"
Apr 22 17:54:06.694181 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:06.693905 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbfw5" podUID="758f2892-7f0d-40ae-a65b-7d7ce661548b"
Apr 22 17:54:07.693699 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:07.693659 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rh7h6"
Apr 22 17:54:07.693884 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:07.693810 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rh7h6" podUID="4424ac43-8016-4c60-8b3b-e1b48a277a63"
Apr 22 17:54:08.696493 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:08.696457 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbfw5"
Apr 22 17:54:08.696919 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:08.696457 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4pcg6"
Apr 22 17:54:08.696919 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:08.696614 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbfw5" podUID="758f2892-7f0d-40ae-a65b-7d7ce661548b"
Apr 22 17:54:08.696919 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:08.696689 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4pcg6" podUID="2e2c7888-11c8-4eb9-af23-d748c79bd567"
Apr 22 17:54:09.694056 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:09.694025 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rh7h6"
Apr 22 17:54:09.694224 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:09.694125 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rh7h6" podUID="4424ac43-8016-4c60-8b3b-e1b48a277a63"
Apr 22 17:54:10.402240 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:10.402210 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs\") pod \"network-metrics-daemon-4pcg6\" (UID: \"2e2c7888-11c8-4eb9-af23-d748c79bd567\") " pod="openshift-multus/network-metrics-daemon-4pcg6"
Apr 22 17:54:10.402240 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:10.402248 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75rwz\" (UniqueName: \"kubernetes.io/projected/758f2892-7f0d-40ae-a65b-7d7ce661548b-kube-api-access-75rwz\") pod \"network-check-target-qbfw5\" (UID: \"758f2892-7f0d-40ae-a65b-7d7ce661548b\") " pod="openshift-network-diagnostics/network-check-target-qbfw5"
Apr 22 17:54:10.402816 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:10.402357 2527 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:54:10.402816 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:10.402422 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs podName:2e2c7888-11c8-4eb9-af23-d748c79bd567 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:26.402407535 +0000 UTC m=+34.365696589 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs") pod "network-metrics-daemon-4pcg6" (UID: "2e2c7888-11c8-4eb9-af23-d748c79bd567") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:54:10.402816 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:10.402362 2527 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:54:10.402816 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:10.402454 2527 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:54:10.402816 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:10.402466 2527 projected.go:194] Error preparing data for projected volume kube-api-access-75rwz for pod openshift-network-diagnostics/network-check-target-qbfw5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:54:10.402816 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:10.402517 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/758f2892-7f0d-40ae-a65b-7d7ce661548b-kube-api-access-75rwz podName:758f2892-7f0d-40ae-a65b-7d7ce661548b nodeName:}" failed. No retries permitted until 2026-04-22 17:54:26.402500457 +0000 UTC m=+34.365789536 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-75rwz" (UniqueName: "kubernetes.io/projected/758f2892-7f0d-40ae-a65b-7d7ce661548b-kube-api-access-75rwz") pod "network-check-target-qbfw5" (UID: "758f2892-7f0d-40ae-a65b-7d7ce661548b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:54:10.694016 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:10.693935 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbfw5"
Apr 22 17:54:10.694016 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:10.693973 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4pcg6"
Apr 22 17:54:10.694204 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:10.694046 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbfw5" podUID="758f2892-7f0d-40ae-a65b-7d7ce661548b"
Apr 22 17:54:10.694242 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:10.694192 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4pcg6" podUID="2e2c7888-11c8-4eb9-af23-d748c79bd567"
Apr 22 17:54:11.694078 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:11.694042 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rh7h6"
Apr 22 17:54:11.694538 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:11.694170 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rh7h6" podUID="4424ac43-8016-4c60-8b3b-e1b48a277a63"
Apr 22 17:54:12.695503 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:12.695115 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbfw5"
Apr 22 17:54:12.696024 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:12.695617 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbfw5" podUID="758f2892-7f0d-40ae-a65b-7d7ce661548b"
Apr 22 17:54:12.696024 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:12.695212 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4pcg6"
Apr 22 17:54:12.696024 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:12.695936 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4pcg6" podUID="2e2c7888-11c8-4eb9-af23-d748c79bd567"
Apr 22 17:54:12.793912 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:12.793855 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log"
Apr 22 17:54:12.794188 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:12.794168 2527 generic.go:358] "Generic (PLEG): container finished" podID="b7681f6e-b80a-4387-bdf9-03efc8cdbd1c" containerID="f40d6166a6f88a8ea4010df5a81da0f4102681ee311cae670412b3e87b8533a1" exitCode=1
Apr 22 17:54:12.794266 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:12.794239 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" event={"ID":"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c","Type":"ContainerStarted","Data":"c23e029c28ba582654c8b019adce5465a32855774ed5847656df1e8fdc8eb25b"}
Apr 22 17:54:12.794328 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:12.794265 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" event={"ID":"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c","Type":"ContainerStarted","Data":"a2ab35cdae1f62f7b108d99c27ee545144ab0cda2d0d72a13b1971b490e1e014"}
Apr 22 17:54:12.794328 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:12.794278 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" event={"ID":"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c","Type":"ContainerStarted","Data":"86e6d75a74e82f9c9fc3f1b6c758a53510edadfea72fb9ecc42081d32ceafbca"}
Apr 22 17:54:12.794328 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:12.794292 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" event={"ID":"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c","Type":"ContainerDied","Data":"f40d6166a6f88a8ea4010df5a81da0f4102681ee311cae670412b3e87b8533a1"}
Apr 22 17:54:12.794328 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:12.794306 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" event={"ID":"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c","Type":"ContainerStarted","Data":"5e512beecc649226b2459926838e41928b1e7dc24ae204e2b999d05bd3ee9f1d"}
Apr 22 17:54:12.795489 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:12.795434 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9z422" event={"ID":"22e8e337-8adf-4261-832e-41ba662e7747","Type":"ContainerStarted","Data":"022a68a4d86258fb9da0051969cde6258344528cf09b949ff436f77650a275f1"}
Apr 22 17:54:12.796802 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:12.796778 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qqd9n" event={"ID":"71ff672a-e8ff-4269-8be1-4e451a7046df","Type":"ContainerStarted","Data":"c6632567dafde59f6710c78b88577d521bf0b0990710b0b66130810f1f170d27"}
Apr 22 17:54:12.798144 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:12.798120 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hv7bb" event={"ID":"afaf22ec-e8aa-448d-8007-91b393b66c96","Type":"ContainerStarted","Data":"1b94e227e032980f986650cf71c0ee9290f57706c15f7459674365d156a053ea"}
Apr 22 17:54:12.799471 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:12.799451 2527 generic.go:358] "Generic (PLEG): container finished" podID="5e78c417-84fe-4436-91ea-459e917e9154" containerID="b4f7304a2eda3674be4a6692b2e004f611aea55189f79ced458e2df0abe53afd" exitCode=0
Apr 22 17:54:12.799560 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:12.799497 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q7h2l" event={"ID":"5e78c417-84fe-4436-91ea-459e917e9154","Type":"ContainerDied","Data":"b4f7304a2eda3674be4a6692b2e004f611aea55189f79ced458e2df0abe53afd"}
Apr 22 17:54:12.801090 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:12.801062 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-k5st4" event={"ID":"720f417e-e0c4-40a9-a38a-c8e1907dbe5b","Type":"ContainerStarted","Data":"a53dc97c9706c7fc3b71073e784558dfc9dc130749e729afa178123dbe7f5ebf"}
Apr 22 17:54:12.803038 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:12.803011 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2d8fs" event={"ID":"65197570-5734-4fef-aa1a-0880895dc55b","Type":"ContainerStarted","Data":"7ba59483a4c4623f7ae6e7b43817f25c8816e17ac3a42a57e3549a8fe1874ffe"}
Apr 22 17:54:12.805357 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:12.805328 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zgpsv" event={"ID":"bea09b4b-5a52-4d36-968d-10c68f944bee","Type":"ContainerStarted","Data":"9693f3f2cd21d9c10c65ba63ce6853a7f89e2197e66b864d3b347a2e5117ea43"}
Apr 22 17:54:12.840519 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:12.840478 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-zgpsv" podStartSLOduration=4.214898013 podStartE2EDuration="20.840463754s" podCreationTimestamp="2026-04-22 17:53:52 +0000 UTC" firstStartedPulling="2026-04-22 17:53:55.415288605 +0000 UTC m=+3.378577656" lastFinishedPulling="2026-04-22 17:54:12.040854343 +0000 UTC m=+20.004143397" observedRunningTime="2026-04-22 17:54:12.840000194 +0000 UTC m=+20.803289275" watchObservedRunningTime="2026-04-22 17:54:12.840463754 +0000 UTC m=+20.803752824"
Apr 22 17:54:12.854248 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:12.854208 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qqd9n" podStartSLOduration=1.615281781 podStartE2EDuration="17.854193303s" podCreationTimestamp="2026-04-22 17:53:55 +0000 UTC" firstStartedPulling="2026-04-22 17:53:55.801800435 +0000 UTC m=+3.765089500" lastFinishedPulling="2026-04-22 17:54:12.040711971 +0000 UTC m=+20.004001022" observedRunningTime="2026-04-22 17:54:12.853797955 +0000 UTC m=+20.817087024" watchObservedRunningTime="2026-04-22 17:54:12.854193303 +0000 UTC m=+20.817482374"
Apr 22 17:54:12.867552 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:12.867511 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-9z422" podStartSLOduration=4.248216669 podStartE2EDuration="20.867496015s" podCreationTimestamp="2026-04-22 17:53:52 +0000 UTC" firstStartedPulling="2026-04-22 17:53:55.42143953 +0000 UTC m=+3.384728596" lastFinishedPulling="2026-04-22 17:54:12.04071888 +0000 UTC m=+20.004007942" observedRunningTime="2026-04-22 17:54:12.867351328 +0000 UTC m=+20.830640397" watchObservedRunningTime="2026-04-22 17:54:12.867496015 +0000 UTC m=+20.830785084"
Apr 22 17:54:12.884958 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:12.884914 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-hv7bb" podStartSLOduration=4.128637754 podStartE2EDuration="20.884901044s" podCreationTimestamp="2026-04-22 17:53:52 +0000 UTC" firstStartedPulling="2026-04-22 17:53:55.424259366 +0000 UTC m=+3.387548419" lastFinishedPulling="2026-04-22 17:54:12.180522646 +0000 UTC m=+20.143811709" observedRunningTime="2026-04-22 17:54:12.884383938 +0000 UTC m=+20.847673035" watchObservedRunningTime="2026-04-22 17:54:12.884901044 +0000 UTC m=+20.848190114"
Apr 22 17:54:12.901508 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:12.901447 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-k5st4" podStartSLOduration=4.2701406 podStartE2EDuration="20.90142641s" podCreationTimestamp="2026-04-22 17:53:52 +0000 UTC" firstStartedPulling="2026-04-22 17:53:55.419101973 +0000 UTC m=+3.382391032" lastFinishedPulling="2026-04-22 17:54:12.050387795 +0000 UTC m=+20.013676842" observedRunningTime="2026-04-22 17:54:12.900399624 +0000 UTC m=+20.863688694" watchObservedRunningTime="2026-04-22 17:54:12.90142641 +0000 UTC m=+20.864715481"
Apr 22 17:54:13.301033 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:13.300991 2527 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 17:54:13.634787 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:13.634678 2527 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T17:54:13.301017942Z","UUID":"bf7b8e66-3b85-4d32-bb79-9bc13314be98","Handler":null,"Name":"","Endpoint":""}
Apr 22 17:54:13.637883 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:13.637861 2527 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 22 17:54:13.638007 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:13.637891 2527 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 22 17:54:13.694145 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:13.694113 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rh7h6"
Apr 22 17:54:13.694309 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:13.694246 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rh7h6" podUID="4424ac43-8016-4c60-8b3b-e1b48a277a63"
Apr 22 17:54:13.729646 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:13.729554 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4424ac43-8016-4c60-8b3b-e1b48a277a63-original-pull-secret\") pod \"global-pull-secret-syncer-rh7h6\" (UID: \"4424ac43-8016-4c60-8b3b-e1b48a277a63\") " pod="kube-system/global-pull-secret-syncer-rh7h6"
Apr 22 17:54:13.730445 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:13.729719 2527 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 17:54:13.730445 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:13.729798 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4424ac43-8016-4c60-8b3b-e1b48a277a63-original-pull-secret podName:4424ac43-8016-4c60-8b3b-e1b48a277a63 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:29.729780022 +0000 UTC m=+37.693069070 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4424ac43-8016-4c60-8b3b-e1b48a277a63-original-pull-secret") pod "global-pull-secret-syncer-rh7h6" (UID: "4424ac43-8016-4c60-8b3b-e1b48a277a63") : object "kube-system"/"original-pull-secret" not registered
Apr 22 17:54:13.809523 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:13.809476 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2d8fs" event={"ID":"65197570-5734-4fef-aa1a-0880895dc55b","Type":"ContainerStarted","Data":"27eec0e35d48aab62f45404c1d14f899d3a1910609cdf60f3e482788eba05454"}
Apr 22 17:54:13.812387 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:13.812357 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log"
Apr 22 17:54:13.812761 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:13.812735 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" event={"ID":"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c","Type":"ContainerStarted","Data":"e5ac28119bd1158f3e342bad9157e15d95c1902d03cdb3e9bc2c8db4d5ce40ea"}
Apr 22 17:54:13.814500 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:13.814476 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-9sxr8" event={"ID":"78f93c71-12e7-43ba-9997-9357c2fcdd1c","Type":"ContainerStarted","Data":"3ec7a7d7aedf89f86d04e2b33453d79ae3830601414a21acaea975d81388f872"}
Apr 22 17:54:13.831391 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:13.831336 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-9sxr8" podStartSLOduration=5.21297369 podStartE2EDuration="21.831321923s" podCreationTimestamp="2026-04-22 17:53:52 +0000 UTC" firstStartedPulling="2026-04-22 17:53:55.422781969 +0000 UTC m=+3.386071021" lastFinishedPulling="2026-04-22 17:54:12.0411302 +0000 UTC m=+20.004419254" observedRunningTime="2026-04-22 17:54:13.830917193 +0000 UTC m=+21.794206263" watchObservedRunningTime="2026-04-22 17:54:13.831321923 +0000 UTC m=+21.794610992"
Apr 22 17:54:14.696920 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:14.696845 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4pcg6"
Apr 22 17:54:14.697091 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:14.696846 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbfw5"
Apr 22 17:54:14.697091 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:14.696974 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4pcg6" podUID="2e2c7888-11c8-4eb9-af23-d748c79bd567"
Apr 22 17:54:14.697091 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:14.697013 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbfw5" podUID="758f2892-7f0d-40ae-a65b-7d7ce661548b"
Apr 22 17:54:14.818033 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:14.817991 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2d8fs" event={"ID":"65197570-5734-4fef-aa1a-0880895dc55b","Type":"ContainerStarted","Data":"65946f2cba1edd9be2ce3e16dc839e51c912ecdcdadc3a58670b986c40b0b29d"}
Apr 22 17:54:15.685503 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:15.685454 2527 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-9z422"
Apr 22 17:54:15.686212 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:15.686179 2527 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-9z422"
Apr 22 17:54:15.694007 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:15.693979 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rh7h6"
Apr 22 17:54:15.694124 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:15.694090 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rh7h6" podUID="4424ac43-8016-4c60-8b3b-e1b48a277a63"
Apr 22 17:54:15.702081 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:15.702040 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2d8fs" podStartSLOduration=4.733474512 podStartE2EDuration="23.702026393s" podCreationTimestamp="2026-04-22 17:53:52 +0000 UTC" firstStartedPulling="2026-04-22 17:53:55.415588497 +0000 UTC m=+3.378877545" lastFinishedPulling="2026-04-22 17:54:14.384140375 +0000 UTC m=+22.347429426" observedRunningTime="2026-04-22 17:54:14.838973503 +0000 UTC m=+22.802262576" watchObservedRunningTime="2026-04-22 17:54:15.702026393 +0000 UTC m=+23.665315500"
Apr 22 17:54:15.822867 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:15.822839 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log"
Apr 22 17:54:15.823294 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:15.823268 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" event={"ID":"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c","Type":"ContainerStarted","Data":"c6627ba4c8cebda781898fd12c54d8431298a35e211ffc81098c4aca9a926feb"}
Apr 22 17:54:16.694020 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:16.693753 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4pcg6"
Apr 22 17:54:16.694194 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:16.693812 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbfw5"
Apr 22 17:54:16.694194 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:16.694147 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4pcg6" podUID="2e2c7888-11c8-4eb9-af23-d748c79bd567"
Apr 22 17:54:16.694194 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:16.694172 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbfw5" podUID="758f2892-7f0d-40ae-a65b-7d7ce661548b"
Apr 22 17:54:17.693654 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:17.693619 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rh7h6"
Apr 22 17:54:17.694032 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:17.693735 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rh7h6" podUID="4424ac43-8016-4c60-8b3b-e1b48a277a63"
Apr 22 17:54:17.827484 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:17.827448 2527 generic.go:358] "Generic (PLEG): container finished" podID="5e78c417-84fe-4436-91ea-459e917e9154" containerID="2157767ca4525c658c79902c17cb59a885451c5d1c7c2adc54b4e75f0ffd4568" exitCode=0
Apr 22 17:54:17.827643 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:17.827529 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q7h2l" event={"ID":"5e78c417-84fe-4436-91ea-459e917e9154","Type":"ContainerDied","Data":"2157767ca4525c658c79902c17cb59a885451c5d1c7c2adc54b4e75f0ffd4568"}
Apr 22 17:54:17.830538 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:17.830521 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log"
Apr 22 17:54:17.830918 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:17.830872 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" event={"ID":"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c","Type":"ContainerStarted","Data":"37e24d7ebd6e07a694fde129862b55884ce5023902ebff18b9ff7d44ff8833b8"}
Apr 22 17:54:17.831251 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:17.831233 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:54:17.831376 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:17.831364 2527 scope.go:117] "RemoveContainer" containerID="f40d6166a6f88a8ea4010df5a81da0f4102681ee311cae670412b3e87b8533a1"
Apr 22 17:54:17.845578 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:17.845540 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:54:18.694102 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:18.693938 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbfw5"
Apr 22 17:54:18.694545 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:18.694017 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4pcg6"
Apr 22 17:54:18.694545 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:18.694174 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbfw5" podUID="758f2892-7f0d-40ae-a65b-7d7ce661548b"
Apr 22 17:54:18.694545 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:18.694321 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4pcg6" podUID="2e2c7888-11c8-4eb9-af23-d748c79bd567"
Apr 22 17:54:18.836558 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:18.836480 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log"
Apr 22 17:54:18.837007 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:18.836980 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" event={"ID":"b7681f6e-b80a-4387-bdf9-03efc8cdbd1c","Type":"ContainerStarted","Data":"9aba901ee078d3f837e35ca914686029a5b9240ec4be837a3c75a8235c8a0936"}
Apr 22 17:54:18.837425 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:18.837366 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:54:18.837519 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:18.837442 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:54:18.841558 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:18.841528 2527 generic.go:358] "Generic (PLEG): container finished" podID="5e78c417-84fe-4436-91ea-459e917e9154" containerID="a2947b41812b327bbc359964b05dcfa7d6247b83d721cae3fea79ba87505d335" exitCode=0
Apr 22 17:54:18.841682 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:18.841601 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q7h2l" event={"ID":"5e78c417-84fe-4436-91ea-459e917e9154","Type":"ContainerDied","Data":"a2947b41812b327bbc359964b05dcfa7d6247b83d721cae3fea79ba87505d335"}
Apr 22 17:54:18.853345 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:18.853320 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2hklb"
Apr 22 17:54:18.871187 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:18.871137 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" podStartSLOduration=10.030251844 podStartE2EDuration="26.871123486s" podCreationTimestamp="2026-04-22 17:53:52 +0000 UTC" firstStartedPulling="2026-04-22 17:53:55.414422964 +0000 UTC m=+3.377712026" lastFinishedPulling="2026-04-22 17:54:12.25529462 +0000 UTC m=+20.218583668" observedRunningTime="2026-04-22 17:54:18.869919167 +0000 UTC m=+26.833208237" watchObservedRunningTime="2026-04-22 17:54:18.871123486 +0000 UTC m=+26.834412553"
Apr 22 17:54:19.084522 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:19.084487 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rh7h6"]
Apr 22 17:54:19.084707 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:19.084645 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rh7h6"
Apr 22 17:54:19.084796 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:19.084754 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rh7h6" podUID="4424ac43-8016-4c60-8b3b-e1b48a277a63"
Apr 22 17:54:19.085186 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:19.085166 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4pcg6"]
Apr 22 17:54:19.085291 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:19.085261 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4pcg6"
Apr 22 17:54:19.085372 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:19.085353 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4pcg6" podUID="2e2c7888-11c8-4eb9-af23-d748c79bd567"
Apr 22 17:54:19.085872 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:19.085848 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qbfw5"]
Apr 22 17:54:19.085960 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:19.085947 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbfw5"
Apr 22 17:54:19.086057 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:19.086035 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-qbfw5" podUID="758f2892-7f0d-40ae-a65b-7d7ce661548b" Apr 22 17:54:19.845015 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:19.844985 2527 generic.go:358] "Generic (PLEG): container finished" podID="5e78c417-84fe-4436-91ea-459e917e9154" containerID="dfeffe76afab58706e18d54b25692bbc8554a07d1f00b338157d2a3cb7872006" exitCode=0 Apr 22 17:54:19.845598 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:19.845086 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q7h2l" event={"ID":"5e78c417-84fe-4436-91ea-459e917e9154","Type":"ContainerDied","Data":"dfeffe76afab58706e18d54b25692bbc8554a07d1f00b338157d2a3cb7872006"} Apr 22 17:54:20.694174 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:20.694142 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbfw5" Apr 22 17:54:20.694416 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:20.694147 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rh7h6" Apr 22 17:54:20.694416 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:20.694273 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbfw5" podUID="758f2892-7f0d-40ae-a65b-7d7ce661548b" Apr 22 17:54:20.694416 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:20.694148 2527 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4pcg6" Apr 22 17:54:20.694416 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:20.694347 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rh7h6" podUID="4424ac43-8016-4c60-8b3b-e1b48a277a63" Apr 22 17:54:20.694416 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:20.694409 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4pcg6" podUID="2e2c7888-11c8-4eb9-af23-d748c79bd567" Apr 22 17:54:21.040409 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:21.040369 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-9z422" Apr 22 17:54:21.040888 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:21.040515 2527 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 17:54:21.041065 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:21.041049 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-9z422" Apr 22 17:54:22.695375 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:22.695162 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rh7h6" Apr 22 17:54:22.695880 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:22.695220 2527 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbfw5" Apr 22 17:54:22.695880 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:22.695462 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rh7h6" podUID="4424ac43-8016-4c60-8b3b-e1b48a277a63" Apr 22 17:54:22.695880 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:22.695262 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4pcg6" Apr 22 17:54:22.695880 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:22.695593 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbfw5" podUID="758f2892-7f0d-40ae-a65b-7d7ce661548b" Apr 22 17:54:22.695880 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:22.695697 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4pcg6" podUID="2e2c7888-11c8-4eb9-af23-d748c79bd567" Apr 22 17:54:24.694197 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:24.694157 2527 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbfw5" Apr 22 17:54:24.694678 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:24.694280 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbfw5" podUID="758f2892-7f0d-40ae-a65b-7d7ce661548b" Apr 22 17:54:24.694678 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:24.694286 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4pcg6" Apr 22 17:54:24.694678 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:24.694321 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rh7h6" Apr 22 17:54:24.694678 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:24.694378 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4pcg6" podUID="2e2c7888-11c8-4eb9-af23-d748c79bd567" Apr 22 17:54:24.694678 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:24.694453 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rh7h6" podUID="4424ac43-8016-4c60-8b3b-e1b48a277a63" Apr 22 17:54:25.338445 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.338420 2527 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-21.ec2.internal" event="NodeReady" Apr 22 17:54:25.338709 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.338593 2527 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 17:54:25.399975 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.399931 2527 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-24hgm"] Apr 22 17:54:25.424686 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.424642 2527 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-jlfdb"] Apr 22 17:54:25.424846 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.424828 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-24hgm" Apr 22 17:54:25.427336 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.427311 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 17:54:25.427481 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.427442 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 17:54:25.427761 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.427729 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-q8m4n\"" Apr 22 17:54:25.438491 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.438467 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jlfdb"] Apr 22 17:54:25.438605 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.438496 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-24hgm"] Apr 22 
17:54:25.438675 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.438612 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jlfdb" Apr 22 17:54:25.441387 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.441262 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-w7jfh\"" Apr 22 17:54:25.441387 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.441331 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 17:54:25.441590 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.441436 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 17:54:25.441902 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.441694 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 17:54:25.518193 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.518157 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3668ec4c-282c-4be6-a5d4-90efe74d754b-config-volume\") pod \"dns-default-24hgm\" (UID: \"3668ec4c-282c-4be6-a5d4-90efe74d754b\") " pod="openshift-dns/dns-default-24hgm" Apr 22 17:54:25.518193 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.518193 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls\") pod \"dns-default-24hgm\" (UID: \"3668ec4c-282c-4be6-a5d4-90efe74d754b\") " pod="openshift-dns/dns-default-24hgm" Apr 22 17:54:25.518497 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.518212 2527 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3668ec4c-282c-4be6-a5d4-90efe74d754b-tmp-dir\") pod \"dns-default-24hgm\" (UID: \"3668ec4c-282c-4be6-a5d4-90efe74d754b\") " pod="openshift-dns/dns-default-24hgm" Apr 22 17:54:25.518497 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.518283 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl4zd\" (UniqueName: \"kubernetes.io/projected/19df5bb9-b377-4d57-ac3d-bdf63a378385-kube-api-access-fl4zd\") pod \"ingress-canary-jlfdb\" (UID: \"19df5bb9-b377-4d57-ac3d-bdf63a378385\") " pod="openshift-ingress-canary/ingress-canary-jlfdb" Apr 22 17:54:25.518497 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.518344 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert\") pod \"ingress-canary-jlfdb\" (UID: \"19df5bb9-b377-4d57-ac3d-bdf63a378385\") " pod="openshift-ingress-canary/ingress-canary-jlfdb" Apr 22 17:54:25.518497 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.518390 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7cmn\" (UniqueName: \"kubernetes.io/projected/3668ec4c-282c-4be6-a5d4-90efe74d754b-kube-api-access-g7cmn\") pod \"dns-default-24hgm\" (UID: \"3668ec4c-282c-4be6-a5d4-90efe74d754b\") " pod="openshift-dns/dns-default-24hgm" Apr 22 17:54:25.618762 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.618684 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert\") pod \"ingress-canary-jlfdb\" (UID: \"19df5bb9-b377-4d57-ac3d-bdf63a378385\") " pod="openshift-ingress-canary/ingress-canary-jlfdb" Apr 22 17:54:25.618762 ip-10-0-135-21 kubenswrapper[2527]: I0422 
17:54:25.618729 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g7cmn\" (UniqueName: \"kubernetes.io/projected/3668ec4c-282c-4be6-a5d4-90efe74d754b-kube-api-access-g7cmn\") pod \"dns-default-24hgm\" (UID: \"3668ec4c-282c-4be6-a5d4-90efe74d754b\") " pod="openshift-dns/dns-default-24hgm" Apr 22 17:54:25.619007 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:25.618841 2527 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:54:25.619007 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.618920 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3668ec4c-282c-4be6-a5d4-90efe74d754b-config-volume\") pod \"dns-default-24hgm\" (UID: \"3668ec4c-282c-4be6-a5d4-90efe74d754b\") " pod="openshift-dns/dns-default-24hgm" Apr 22 17:54:25.619007 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:25.618962 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert podName:19df5bb9-b377-4d57-ac3d-bdf63a378385 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:26.118935748 +0000 UTC m=+34.082224803 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert") pod "ingress-canary-jlfdb" (UID: "19df5bb9-b377-4d57-ac3d-bdf63a378385") : secret "canary-serving-cert" not found Apr 22 17:54:25.619007 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.618994 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls\") pod \"dns-default-24hgm\" (UID: \"3668ec4c-282c-4be6-a5d4-90efe74d754b\") " pod="openshift-dns/dns-default-24hgm" Apr 22 17:54:25.619159 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.619016 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3668ec4c-282c-4be6-a5d4-90efe74d754b-tmp-dir\") pod \"dns-default-24hgm\" (UID: \"3668ec4c-282c-4be6-a5d4-90efe74d754b\") " pod="openshift-dns/dns-default-24hgm" Apr 22 17:54:25.619159 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.619036 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fl4zd\" (UniqueName: \"kubernetes.io/projected/19df5bb9-b377-4d57-ac3d-bdf63a378385-kube-api-access-fl4zd\") pod \"ingress-canary-jlfdb\" (UID: \"19df5bb9-b377-4d57-ac3d-bdf63a378385\") " pod="openshift-ingress-canary/ingress-canary-jlfdb" Apr 22 17:54:25.619159 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:25.619121 2527 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:54:25.619266 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:25.619185 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls podName:3668ec4c-282c-4be6-a5d4-90efe74d754b nodeName:}" failed. 
No retries permitted until 2026-04-22 17:54:26.119164019 +0000 UTC m=+34.082453067 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls") pod "dns-default-24hgm" (UID: "3668ec4c-282c-4be6-a5d4-90efe74d754b") : secret "dns-default-metrics-tls" not found Apr 22 17:54:25.619266 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.619241 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3668ec4c-282c-4be6-a5d4-90efe74d754b-tmp-dir\") pod \"dns-default-24hgm\" (UID: \"3668ec4c-282c-4be6-a5d4-90efe74d754b\") " pod="openshift-dns/dns-default-24hgm" Apr 22 17:54:25.619397 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.619380 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3668ec4c-282c-4be6-a5d4-90efe74d754b-config-volume\") pod \"dns-default-24hgm\" (UID: \"3668ec4c-282c-4be6-a5d4-90efe74d754b\") " pod="openshift-dns/dns-default-24hgm" Apr 22 17:54:25.629653 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.629634 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7cmn\" (UniqueName: \"kubernetes.io/projected/3668ec4c-282c-4be6-a5d4-90efe74d754b-kube-api-access-g7cmn\") pod \"dns-default-24hgm\" (UID: \"3668ec4c-282c-4be6-a5d4-90efe74d754b\") " pod="openshift-dns/dns-default-24hgm" Apr 22 17:54:25.629772 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:25.629700 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl4zd\" (UniqueName: \"kubernetes.io/projected/19df5bb9-b377-4d57-ac3d-bdf63a378385-kube-api-access-fl4zd\") pod \"ingress-canary-jlfdb\" (UID: \"19df5bb9-b377-4d57-ac3d-bdf63a378385\") " pod="openshift-ingress-canary/ingress-canary-jlfdb" Apr 22 17:54:26.123162 ip-10-0-135-21 kubenswrapper[2527]: I0422 
17:54:26.123127 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert\") pod \"ingress-canary-jlfdb\" (UID: \"19df5bb9-b377-4d57-ac3d-bdf63a378385\") " pod="openshift-ingress-canary/ingress-canary-jlfdb" Apr 22 17:54:26.123590 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:26.123279 2527 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:54:26.123590 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:26.123319 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls\") pod \"dns-default-24hgm\" (UID: \"3668ec4c-282c-4be6-a5d4-90efe74d754b\") " pod="openshift-dns/dns-default-24hgm" Apr 22 17:54:26.123590 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:26.123382 2527 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:54:26.123590 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:26.123385 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert podName:19df5bb9-b377-4d57-ac3d-bdf63a378385 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:27.12336482 +0000 UTC m=+35.086653870 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert") pod "ingress-canary-jlfdb" (UID: "19df5bb9-b377-4d57-ac3d-bdf63a378385") : secret "canary-serving-cert" not found Apr 22 17:54:26.123590 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:26.123426 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls podName:3668ec4c-282c-4be6-a5d4-90efe74d754b nodeName:}" failed. No retries permitted until 2026-04-22 17:54:27.123417101 +0000 UTC m=+35.086706152 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls") pod "dns-default-24hgm" (UID: "3668ec4c-282c-4be6-a5d4-90efe74d754b") : secret "dns-default-metrics-tls" not found Apr 22 17:54:26.425958 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:26.425861 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs\") pod \"network-metrics-daemon-4pcg6\" (UID: \"2e2c7888-11c8-4eb9-af23-d748c79bd567\") " pod="openshift-multus/network-metrics-daemon-4pcg6" Apr 22 17:54:26.425958 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:26.425907 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75rwz\" (UniqueName: \"kubernetes.io/projected/758f2892-7f0d-40ae-a65b-7d7ce661548b-kube-api-access-75rwz\") pod \"network-check-target-qbfw5\" (UID: \"758f2892-7f0d-40ae-a65b-7d7ce661548b\") " pod="openshift-network-diagnostics/network-check-target-qbfw5" Apr 22 17:54:26.426179 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:26.426008 2527 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:54:26.426179 ip-10-0-135-21 
kubenswrapper[2527]: E0422 17:54:26.426076 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs podName:2e2c7888-11c8-4eb9-af23-d748c79bd567 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:58.426059895 +0000 UTC m=+66.389348943 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs") pod "network-metrics-daemon-4pcg6" (UID: "2e2c7888-11c8-4eb9-af23-d748c79bd567") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:54:26.426179 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:26.426016 2527 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:54:26.426179 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:26.426114 2527 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:54:26.426179 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:26.426124 2527 projected.go:194] Error preparing data for projected volume kube-api-access-75rwz for pod openshift-network-diagnostics/network-check-target-qbfw5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:54:26.426179 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:26.426164 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/758f2892-7f0d-40ae-a65b-7d7ce661548b-kube-api-access-75rwz podName:758f2892-7f0d-40ae-a65b-7d7ce661548b nodeName:}" failed. No retries permitted until 2026-04-22 17:54:58.426154133 +0000 UTC m=+66.389443182 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-75rwz" (UniqueName: "kubernetes.io/projected/758f2892-7f0d-40ae-a65b-7d7ce661548b-kube-api-access-75rwz") pod "network-check-target-qbfw5" (UID: "758f2892-7f0d-40ae-a65b-7d7ce661548b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:54:26.694305 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:26.694203 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbfw5" Apr 22 17:54:26.694588 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:26.694214 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4pcg6" Apr 22 17:54:26.694588 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:26.694214 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rh7h6" Apr 22 17:54:26.698704 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:26.698615 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bvxbv\"" Apr 22 17:54:26.698704 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:26.698646 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 17:54:26.698704 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:26.698649 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 17:54:26.698704 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:26.698648 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-lcf4z\"" Apr 22 17:54:26.698974 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:26.698648 2527 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 17:54:26.698974 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:26.698940 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 17:54:26.859055 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:26.859018 2527 generic.go:358] "Generic (PLEG): container finished" podID="5e78c417-84fe-4436-91ea-459e917e9154" containerID="413171dd2179cab6c2c686849bed769aeca5e5b808edbd0dd74e4edd48ce3ab6" exitCode=0 Apr 22 17:54:26.859267 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:26.859079 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q7h2l" event={"ID":"5e78c417-84fe-4436-91ea-459e917e9154","Type":"ContainerDied","Data":"413171dd2179cab6c2c686849bed769aeca5e5b808edbd0dd74e4edd48ce3ab6"} Apr 22 17:54:27.129650 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:27.129409 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls\") pod \"dns-default-24hgm\" (UID: \"3668ec4c-282c-4be6-a5d4-90efe74d754b\") " pod="openshift-dns/dns-default-24hgm" Apr 22 17:54:27.130056 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:27.129694 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert\") pod \"ingress-canary-jlfdb\" (UID: \"19df5bb9-b377-4d57-ac3d-bdf63a378385\") " pod="openshift-ingress-canary/ingress-canary-jlfdb" Apr 22 17:54:27.130056 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:27.129554 2527 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:54:27.130056 ip-10-0-135-21 kubenswrapper[2527]: E0422 
17:54:27.129787 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls podName:3668ec4c-282c-4be6-a5d4-90efe74d754b nodeName:}" failed. No retries permitted until 2026-04-22 17:54:29.129759931 +0000 UTC m=+37.093048992 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls") pod "dns-default-24hgm" (UID: "3668ec4c-282c-4be6-a5d4-90efe74d754b") : secret "dns-default-metrics-tls" not found Apr 22 17:54:27.130056 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:27.129806 2527 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:54:27.130056 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:27.129847 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert podName:19df5bb9-b377-4d57-ac3d-bdf63a378385 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:29.129833852 +0000 UTC m=+37.093122903 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert") pod "ingress-canary-jlfdb" (UID: "19df5bb9-b377-4d57-ac3d-bdf63a378385") : secret "canary-serving-cert" not found Apr 22 17:54:27.863024 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:27.862985 2527 generic.go:358] "Generic (PLEG): container finished" podID="5e78c417-84fe-4436-91ea-459e917e9154" containerID="5da3bcc61a209a28199a229935bd2dcb2077fadace1675a70f7cd0d286de57bd" exitCode=0 Apr 22 17:54:27.863024 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:27.863028 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q7h2l" event={"ID":"5e78c417-84fe-4436-91ea-459e917e9154","Type":"ContainerDied","Data":"5da3bcc61a209a28199a229935bd2dcb2077fadace1675a70f7cd0d286de57bd"} Apr 22 17:54:28.867177 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:28.867146 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q7h2l" event={"ID":"5e78c417-84fe-4436-91ea-459e917e9154","Type":"ContainerStarted","Data":"6dc322e5e8530b42f586570f21ddc07a2c9a8faec6529907a3dfdf9bbc6332de"} Apr 22 17:54:28.892836 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:28.892741 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-q7h2l" podStartSLOduration=6.420542928 podStartE2EDuration="36.892722284s" podCreationTimestamp="2026-04-22 17:53:52 +0000 UTC" firstStartedPulling="2026-04-22 17:53:55.423634096 +0000 UTC m=+3.386923154" lastFinishedPulling="2026-04-22 17:54:25.895813462 +0000 UTC m=+33.859102510" observedRunningTime="2026-04-22 17:54:28.891082899 +0000 UTC m=+36.854371982" watchObservedRunningTime="2026-04-22 17:54:28.892722284 +0000 UTC m=+36.856011356" Apr 22 17:54:29.148606 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:29.148496 2527 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert\") pod \"ingress-canary-jlfdb\" (UID: \"19df5bb9-b377-4d57-ac3d-bdf63a378385\") " pod="openshift-ingress-canary/ingress-canary-jlfdb" Apr 22 17:54:29.148767 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:29.148607 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls\") pod \"dns-default-24hgm\" (UID: \"3668ec4c-282c-4be6-a5d4-90efe74d754b\") " pod="openshift-dns/dns-default-24hgm" Apr 22 17:54:29.148885 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:29.148653 2527 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:54:29.149008 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:29.148991 2527 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:54:29.149083 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:29.149071 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls podName:3668ec4c-282c-4be6-a5d4-90efe74d754b nodeName:}" failed. No retries permitted until 2026-04-22 17:54:33.149041123 +0000 UTC m=+41.112330173 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls") pod "dns-default-24hgm" (UID: "3668ec4c-282c-4be6-a5d4-90efe74d754b") : secret "dns-default-metrics-tls" not found Apr 22 17:54:29.151404 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:29.149144 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert podName:19df5bb9-b377-4d57-ac3d-bdf63a378385 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:54:33.14912938 +0000 UTC m=+41.112418494 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert") pod "ingress-canary-jlfdb" (UID: "19df5bb9-b377-4d57-ac3d-bdf63a378385") : secret "canary-serving-cert" not found Apr 22 17:54:29.752685 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:29.752641 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4424ac43-8016-4c60-8b3b-e1b48a277a63-original-pull-secret\") pod \"global-pull-secret-syncer-rh7h6\" (UID: \"4424ac43-8016-4c60-8b3b-e1b48a277a63\") " pod="kube-system/global-pull-secret-syncer-rh7h6" Apr 22 17:54:29.755890 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:29.755871 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4424ac43-8016-4c60-8b3b-e1b48a277a63-original-pull-secret\") pod \"global-pull-secret-syncer-rh7h6\" (UID: \"4424ac43-8016-4c60-8b3b-e1b48a277a63\") " pod="kube-system/global-pull-secret-syncer-rh7h6" Apr 22 17:54:30.016137 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:30.016106 2527 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rh7h6" Apr 22 17:54:30.174706 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:30.174671 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rh7h6"] Apr 22 17:54:30.178430 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:54:30.178406 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4424ac43_8016_4c60_8b3b_e1b48a277a63.slice/crio-7308f401ff0f41be6477a7903caf48797ac90b502acbbc353999e540a7fbb1a6 WatchSource:0}: Error finding container 7308f401ff0f41be6477a7903caf48797ac90b502acbbc353999e540a7fbb1a6: Status 404 returned error can't find the container with id 7308f401ff0f41be6477a7903caf48797ac90b502acbbc353999e540a7fbb1a6 Apr 22 17:54:30.872118 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:30.872085 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rh7h6" event={"ID":"4424ac43-8016-4c60-8b3b-e1b48a277a63","Type":"ContainerStarted","Data":"7308f401ff0f41be6477a7903caf48797ac90b502acbbc353999e540a7fbb1a6"} Apr 22 17:54:33.181360 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:33.181317 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls\") pod \"dns-default-24hgm\" (UID: \"3668ec4c-282c-4be6-a5d4-90efe74d754b\") " pod="openshift-dns/dns-default-24hgm" Apr 22 17:54:33.182050 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:33.181375 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert\") pod \"ingress-canary-jlfdb\" (UID: \"19df5bb9-b377-4d57-ac3d-bdf63a378385\") " pod="openshift-ingress-canary/ingress-canary-jlfdb" Apr 22 17:54:33.182050 ip-10-0-135-21 kubenswrapper[2527]: E0422 
17:54:33.181481 2527 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:54:33.182050 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:33.181483 2527 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:54:33.182050 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:33.181538 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert podName:19df5bb9-b377-4d57-ac3d-bdf63a378385 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:41.181520919 +0000 UTC m=+49.144809987 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert") pod "ingress-canary-jlfdb" (UID: "19df5bb9-b377-4d57-ac3d-bdf63a378385") : secret "canary-serving-cert" not found Apr 22 17:54:33.182050 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:33.181553 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls podName:3668ec4c-282c-4be6-a5d4-90efe74d754b nodeName:}" failed. No retries permitted until 2026-04-22 17:54:41.181546418 +0000 UTC m=+49.144835467 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls") pod "dns-default-24hgm" (UID: "3668ec4c-282c-4be6-a5d4-90efe74d754b") : secret "dns-default-metrics-tls" not found Apr 22 17:54:34.881466 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:34.881427 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rh7h6" event={"ID":"4424ac43-8016-4c60-8b3b-e1b48a277a63","Type":"ContainerStarted","Data":"8de67ebf0861844c9b0dea64817531710abf676caaea1ea91da964d6317901c9"} Apr 22 17:54:34.899517 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:34.899472 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-rh7h6" podStartSLOduration=34.224579094 podStartE2EDuration="37.899458253s" podCreationTimestamp="2026-04-22 17:53:57 +0000 UTC" firstStartedPulling="2026-04-22 17:54:30.179841187 +0000 UTC m=+38.143130236" lastFinishedPulling="2026-04-22 17:54:33.854720343 +0000 UTC m=+41.818009395" observedRunningTime="2026-04-22 17:54:34.898797724 +0000 UTC m=+42.862086795" watchObservedRunningTime="2026-04-22 17:54:34.899458253 +0000 UTC m=+42.862747323" Apr 22 17:54:41.236275 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:41.236231 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls\") pod \"dns-default-24hgm\" (UID: \"3668ec4c-282c-4be6-a5d4-90efe74d754b\") " pod="openshift-dns/dns-default-24hgm" Apr 22 17:54:41.236740 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:41.236296 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert\") pod \"ingress-canary-jlfdb\" (UID: \"19df5bb9-b377-4d57-ac3d-bdf63a378385\") " 
pod="openshift-ingress-canary/ingress-canary-jlfdb" Apr 22 17:54:41.236740 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:41.236368 2527 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:54:41.236740 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:41.236396 2527 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:54:41.236740 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:41.236451 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls podName:3668ec4c-282c-4be6-a5d4-90efe74d754b nodeName:}" failed. No retries permitted until 2026-04-22 17:54:57.236433647 +0000 UTC m=+65.199722695 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls") pod "dns-default-24hgm" (UID: "3668ec4c-282c-4be6-a5d4-90efe74d754b") : secret "dns-default-metrics-tls" not found Apr 22 17:54:41.236740 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:41.236466 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert podName:19df5bb9-b377-4d57-ac3d-bdf63a378385 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:57.236458766 +0000 UTC m=+65.199747814 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert") pod "ingress-canary-jlfdb" (UID: "19df5bb9-b377-4d57-ac3d-bdf63a378385") : secret "canary-serving-cert" not found Apr 22 17:54:46.723032 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.722996 2527 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b8fcd7789-l7p6r"] Apr 22 17:54:46.762351 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.762323 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b8fcd7789-l7p6r"] Apr 22 17:54:46.762351 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.762351 2527 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk"] Apr 22 17:54:46.762546 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.762462 2527 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b8fcd7789-l7p6r" Apr 22 17:54:46.765226 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.765195 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 17:54:46.765382 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.765228 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 17:54:46.765382 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.765271 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 22 17:54:46.765480 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.765401 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 17:54:46.766344 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.766324 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-qk25r\"" Apr 22 17:54:46.774687 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.774668 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3024a732-80d1-47c9-b5ca-1c6ef741dce0-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6b8fcd7789-l7p6r\" (UID: \"3024a732-80d1-47c9-b5ca-1c6ef741dce0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b8fcd7789-l7p6r" Apr 22 17:54:46.774794 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.774730 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-vzczr\" (UniqueName: \"kubernetes.io/projected/3024a732-80d1-47c9-b5ca-1c6ef741dce0-kube-api-access-vzczr\") pod \"managed-serviceaccount-addon-agent-6b8fcd7789-l7p6r\" (UID: \"3024a732-80d1-47c9-b5ca-1c6ef741dce0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b8fcd7789-l7p6r" Apr 22 17:54:46.780523 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.780492 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk"] Apr 22 17:54:46.780629 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.780616 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" Apr 22 17:54:46.783347 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.783325 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 22 17:54:46.783429 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.783380 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 22 17:54:46.783429 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.783408 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 22 17:54:46.783501 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.783474 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 22 17:54:46.876009 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.875967 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vzczr\" (UniqueName: 
\"kubernetes.io/projected/3024a732-80d1-47c9-b5ca-1c6ef741dce0-kube-api-access-vzczr\") pod \"managed-serviceaccount-addon-agent-6b8fcd7789-l7p6r\" (UID: \"3024a732-80d1-47c9-b5ca-1c6ef741dce0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b8fcd7789-l7p6r" Apr 22 17:54:46.876009 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.876007 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/94c6a03c-6ae0-4d10-8642-739528147536-hub\") pod \"cluster-proxy-proxy-agent-5c4979f6c-vmjhk\" (UID: \"94c6a03c-6ae0-4d10-8642-739528147536\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" Apr 22 17:54:46.876219 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.876027 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/94c6a03c-6ae0-4d10-8642-739528147536-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5c4979f6c-vmjhk\" (UID: \"94c6a03c-6ae0-4d10-8642-739528147536\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" Apr 22 17:54:46.876219 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.876046 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftwdz\" (UniqueName: \"kubernetes.io/projected/94c6a03c-6ae0-4d10-8642-739528147536-kube-api-access-ftwdz\") pod \"cluster-proxy-proxy-agent-5c4979f6c-vmjhk\" (UID: \"94c6a03c-6ae0-4d10-8642-739528147536\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" Apr 22 17:54:46.876219 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.876091 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: 
\"kubernetes.io/secret/94c6a03c-6ae0-4d10-8642-739528147536-ca\") pod \"cluster-proxy-proxy-agent-5c4979f6c-vmjhk\" (UID: \"94c6a03c-6ae0-4d10-8642-739528147536\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" Apr 22 17:54:46.876219 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.876111 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3024a732-80d1-47c9-b5ca-1c6ef741dce0-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6b8fcd7789-l7p6r\" (UID: \"3024a732-80d1-47c9-b5ca-1c6ef741dce0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b8fcd7789-l7p6r" Apr 22 17:54:46.876219 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.876155 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/94c6a03c-6ae0-4d10-8642-739528147536-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5c4979f6c-vmjhk\" (UID: \"94c6a03c-6ae0-4d10-8642-739528147536\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" Apr 22 17:54:46.876370 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.876340 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/94c6a03c-6ae0-4d10-8642-739528147536-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5c4979f6c-vmjhk\" (UID: \"94c6a03c-6ae0-4d10-8642-739528147536\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" Apr 22 17:54:46.879419 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.879395 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3024a732-80d1-47c9-b5ca-1c6ef741dce0-hub-kubeconfig\") pod 
\"managed-serviceaccount-addon-agent-6b8fcd7789-l7p6r\" (UID: \"3024a732-80d1-47c9-b5ca-1c6ef741dce0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b8fcd7789-l7p6r" Apr 22 17:54:46.884776 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.884751 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzczr\" (UniqueName: \"kubernetes.io/projected/3024a732-80d1-47c9-b5ca-1c6ef741dce0-kube-api-access-vzczr\") pod \"managed-serviceaccount-addon-agent-6b8fcd7789-l7p6r\" (UID: \"3024a732-80d1-47c9-b5ca-1c6ef741dce0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b8fcd7789-l7p6r" Apr 22 17:54:46.977101 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.977002 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/94c6a03c-6ae0-4d10-8642-739528147536-ca\") pod \"cluster-proxy-proxy-agent-5c4979f6c-vmjhk\" (UID: \"94c6a03c-6ae0-4d10-8642-739528147536\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" Apr 22 17:54:46.977101 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.977051 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/94c6a03c-6ae0-4d10-8642-739528147536-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5c4979f6c-vmjhk\" (UID: \"94c6a03c-6ae0-4d10-8642-739528147536\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" Apr 22 17:54:46.977332 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.977113 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/94c6a03c-6ae0-4d10-8642-739528147536-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5c4979f6c-vmjhk\" (UID: \"94c6a03c-6ae0-4d10-8642-739528147536\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" Apr 22 17:54:46.977332 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.977157 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/94c6a03c-6ae0-4d10-8642-739528147536-hub\") pod \"cluster-proxy-proxy-agent-5c4979f6c-vmjhk\" (UID: \"94c6a03c-6ae0-4d10-8642-739528147536\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" Apr 22 17:54:46.977332 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.977180 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/94c6a03c-6ae0-4d10-8642-739528147536-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5c4979f6c-vmjhk\" (UID: \"94c6a03c-6ae0-4d10-8642-739528147536\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" Apr 22 17:54:46.977332 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.977201 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ftwdz\" (UniqueName: \"kubernetes.io/projected/94c6a03c-6ae0-4d10-8642-739528147536-kube-api-access-ftwdz\") pod \"cluster-proxy-proxy-agent-5c4979f6c-vmjhk\" (UID: \"94c6a03c-6ae0-4d10-8642-739528147536\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" Apr 22 17:54:46.977967 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.977939 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/94c6a03c-6ae0-4d10-8642-739528147536-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5c4979f6c-vmjhk\" (UID: \"94c6a03c-6ae0-4d10-8642-739528147536\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" Apr 22 17:54:46.979539 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.979504 
2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/94c6a03c-6ae0-4d10-8642-739528147536-ca\") pod \"cluster-proxy-proxy-agent-5c4979f6c-vmjhk\" (UID: \"94c6a03c-6ae0-4d10-8642-739528147536\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" Apr 22 17:54:46.979654 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.979620 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/94c6a03c-6ae0-4d10-8642-739528147536-hub\") pod \"cluster-proxy-proxy-agent-5c4979f6c-vmjhk\" (UID: \"94c6a03c-6ae0-4d10-8642-739528147536\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" Apr 22 17:54:46.979994 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.979968 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/94c6a03c-6ae0-4d10-8642-739528147536-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5c4979f6c-vmjhk\" (UID: \"94c6a03c-6ae0-4d10-8642-739528147536\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" Apr 22 17:54:46.979994 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.979986 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/94c6a03c-6ae0-4d10-8642-739528147536-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5c4979f6c-vmjhk\" (UID: \"94c6a03c-6ae0-4d10-8642-739528147536\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" Apr 22 17:54:46.985674 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:46.985652 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftwdz\" (UniqueName: \"kubernetes.io/projected/94c6a03c-6ae0-4d10-8642-739528147536-kube-api-access-ftwdz\") pod 
\"cluster-proxy-proxy-agent-5c4979f6c-vmjhk\" (UID: \"94c6a03c-6ae0-4d10-8642-739528147536\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" Apr 22 17:54:47.087704 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:47.087674 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b8fcd7789-l7p6r" Apr 22 17:54:47.094455 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:47.094425 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" Apr 22 17:54:47.234460 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:47.234383 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk"] Apr 22 17:54:47.238080 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:54:47.238049 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94c6a03c_6ae0_4d10_8642_739528147536.slice/crio-36ab723d3b3811b40c4fe7ba9e1893e8e7c482bea130dca244cd782847b60f23 WatchSource:0}: Error finding container 36ab723d3b3811b40c4fe7ba9e1893e8e7c482bea130dca244cd782847b60f23: Status 404 returned error can't find the container with id 36ab723d3b3811b40c4fe7ba9e1893e8e7c482bea130dca244cd782847b60f23 Apr 22 17:54:47.250453 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:47.250430 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b8fcd7789-l7p6r"] Apr 22 17:54:47.254215 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:54:47.254191 2527 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3024a732_80d1_47c9_b5ca_1c6ef741dce0.slice/crio-11f3276351deca85ce5bf0674e28001092a9b35fe4d8171c862bf40f51b5e31b WatchSource:0}: Error finding container 11f3276351deca85ce5bf0674e28001092a9b35fe4d8171c862bf40f51b5e31b: Status 404 returned error can't find the container with id 11f3276351deca85ce5bf0674e28001092a9b35fe4d8171c862bf40f51b5e31b Apr 22 17:54:47.908145 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:47.908108 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" event={"ID":"94c6a03c-6ae0-4d10-8642-739528147536","Type":"ContainerStarted","Data":"36ab723d3b3811b40c4fe7ba9e1893e8e7c482bea130dca244cd782847b60f23"} Apr 22 17:54:47.909303 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:47.909272 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b8fcd7789-l7p6r" event={"ID":"3024a732-80d1-47c9-b5ca-1c6ef741dce0","Type":"ContainerStarted","Data":"11f3276351deca85ce5bf0674e28001092a9b35fe4d8171c862bf40f51b5e31b"} Apr 22 17:54:50.857825 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:50.857743 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2hklb" Apr 22 17:54:50.918064 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:50.918029 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" event={"ID":"94c6a03c-6ae0-4d10-8642-739528147536","Type":"ContainerStarted","Data":"7dc8759442b09f5b03d9f6292bb3b55821ed648760185b7929a630892cee1557"} Apr 22 17:54:50.919251 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:50.919226 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b8fcd7789-l7p6r" 
event={"ID":"3024a732-80d1-47c9-b5ca-1c6ef741dce0","Type":"ContainerStarted","Data":"05024c40daeb9049355c198e30d0814c06827e3cdb9ede35d9811eff58328f42"} Apr 22 17:54:50.935558 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:50.935507 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b8fcd7789-l7p6r" podStartSLOduration=1.6357428010000001 podStartE2EDuration="4.935493123s" podCreationTimestamp="2026-04-22 17:54:46 +0000 UTC" firstStartedPulling="2026-04-22 17:54:47.255980037 +0000 UTC m=+55.219269085" lastFinishedPulling="2026-04-22 17:54:50.55573036 +0000 UTC m=+58.519019407" observedRunningTime="2026-04-22 17:54:50.935020881 +0000 UTC m=+58.898309945" watchObservedRunningTime="2026-04-22 17:54:50.935493123 +0000 UTC m=+58.898782193" Apr 22 17:54:52.925072 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:52.924974 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" event={"ID":"94c6a03c-6ae0-4d10-8642-739528147536","Type":"ContainerStarted","Data":"cb80d4be8acba0ded916f7547cb27db09de673d7eddbe7e8985a96cb82ebc584"} Apr 22 17:54:52.925072 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:52.925017 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" event={"ID":"94c6a03c-6ae0-4d10-8642-739528147536","Type":"ContainerStarted","Data":"16fcd3420e4efa761de1d311780197c9b34aa32a9631991c065d3d440b6fff41"} Apr 22 17:54:52.944365 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:52.944314 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" podStartSLOduration=1.580254582 podStartE2EDuration="6.944297665s" podCreationTimestamp="2026-04-22 17:54:46 +0000 UTC" firstStartedPulling="2026-04-22 17:54:47.239465441 
+0000 UTC m=+55.202754493" lastFinishedPulling="2026-04-22 17:54:52.603508524 +0000 UTC m=+60.566797576" observedRunningTime="2026-04-22 17:54:52.94328688 +0000 UTC m=+60.906575950" watchObservedRunningTime="2026-04-22 17:54:52.944297665 +0000 UTC m=+60.907586734" Apr 22 17:54:57.256032 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:57.255988 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert\") pod \"ingress-canary-jlfdb\" (UID: \"19df5bb9-b377-4d57-ac3d-bdf63a378385\") " pod="openshift-ingress-canary/ingress-canary-jlfdb" Apr 22 17:54:57.256414 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:57.256058 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls\") pod \"dns-default-24hgm\" (UID: \"3668ec4c-282c-4be6-a5d4-90efe74d754b\") " pod="openshift-dns/dns-default-24hgm" Apr 22 17:54:57.256414 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:57.256132 2527 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:54:57.256414 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:57.256144 2527 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:54:57.256414 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:57.256194 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert podName:19df5bb9-b377-4d57-ac3d-bdf63a378385 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:29.2561778 +0000 UTC m=+97.219466848 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert") pod "ingress-canary-jlfdb" (UID: "19df5bb9-b377-4d57-ac3d-bdf63a378385") : secret "canary-serving-cert" not found Apr 22 17:54:57.256414 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:57.256208 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls podName:3668ec4c-282c-4be6-a5d4-90efe74d754b nodeName:}" failed. No retries permitted until 2026-04-22 17:55:29.256202956 +0000 UTC m=+97.219492004 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls") pod "dns-default-24hgm" (UID: "3668ec4c-282c-4be6-a5d4-90efe74d754b") : secret "dns-default-metrics-tls" not found Apr 22 17:54:58.464720 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:58.464669 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75rwz\" (UniqueName: \"kubernetes.io/projected/758f2892-7f0d-40ae-a65b-7d7ce661548b-kube-api-access-75rwz\") pod \"network-check-target-qbfw5\" (UID: \"758f2892-7f0d-40ae-a65b-7d7ce661548b\") " pod="openshift-network-diagnostics/network-check-target-qbfw5" Apr 22 17:54:58.465133 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:58.464785 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs\") pod \"network-metrics-daemon-4pcg6\" (UID: \"2e2c7888-11c8-4eb9-af23-d748c79bd567\") " pod="openshift-multus/network-metrics-daemon-4pcg6" Apr 22 17:54:58.467733 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:58.467709 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 17:54:58.467864 ip-10-0-135-21 
kubenswrapper[2527]: I0422 17:54:58.467740 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 17:54:58.475951 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:58.475924 2527 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 17:54:58.476044 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:54:58.476009 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs podName:2e2c7888-11c8-4eb9-af23-d748c79bd567 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:02.475987389 +0000 UTC m=+130.439276447 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs") pod "network-metrics-daemon-4pcg6" (UID: "2e2c7888-11c8-4eb9-af23-d748c79bd567") : secret "metrics-daemon-secret" not found Apr 22 17:54:58.478394 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:58.478377 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 17:54:58.488345 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:58.488323 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75rwz\" (UniqueName: \"kubernetes.io/projected/758f2892-7f0d-40ae-a65b-7d7ce661548b-kube-api-access-75rwz\") pod \"network-check-target-qbfw5\" (UID: \"758f2892-7f0d-40ae-a65b-7d7ce661548b\") " pod="openshift-network-diagnostics/network-check-target-qbfw5" Apr 22 17:54:58.508767 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:58.508742 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-lcf4z\"" Apr 22 17:54:58.516900 ip-10-0-135-21 kubenswrapper[2527]: I0422 
17:54:58.516879 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbfw5" Apr 22 17:54:58.628283 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:58.628256 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qbfw5"] Apr 22 17:54:58.631549 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:54:58.631519 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod758f2892_7f0d_40ae_a65b_7d7ce661548b.slice/crio-fb460d79bc1b016bd413ac8d437e1127c5497b4c9f34b72d37fb1c8c9f9691d4 WatchSource:0}: Error finding container fb460d79bc1b016bd413ac8d437e1127c5497b4c9f34b72d37fb1c8c9f9691d4: Status 404 returned error can't find the container with id fb460d79bc1b016bd413ac8d437e1127c5497b4c9f34b72d37fb1c8c9f9691d4 Apr 22 17:54:58.936303 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:54:58.936269 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qbfw5" event={"ID":"758f2892-7f0d-40ae-a65b-7d7ce661548b","Type":"ContainerStarted","Data":"fb460d79bc1b016bd413ac8d437e1127c5497b4c9f34b72d37fb1c8c9f9691d4"} Apr 22 17:55:01.947211 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:55:01.947175 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qbfw5" event={"ID":"758f2892-7f0d-40ae-a65b-7d7ce661548b","Type":"ContainerStarted","Data":"498b424056ffddcc2acda47a655152867e8f93ad275b12ea43b31913d0898608"} Apr 22 17:55:01.947558 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:55:01.947325 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-qbfw5" Apr 22 17:55:01.965956 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:55:01.965910 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-diagnostics/network-check-target-qbfw5" podStartSLOduration=67.437566917 podStartE2EDuration="1m9.965895758s" podCreationTimestamp="2026-04-22 17:53:52 +0000 UTC" firstStartedPulling="2026-04-22 17:54:58.633172249 +0000 UTC m=+66.596461297" lastFinishedPulling="2026-04-22 17:55:01.161501076 +0000 UTC m=+69.124790138" observedRunningTime="2026-04-22 17:55:01.965429556 +0000 UTC m=+69.928718640" watchObservedRunningTime="2026-04-22 17:55:01.965895758 +0000 UTC m=+69.929184828" Apr 22 17:55:29.278726 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:55:29.278684 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls\") pod \"dns-default-24hgm\" (UID: \"3668ec4c-282c-4be6-a5d4-90efe74d754b\") " pod="openshift-dns/dns-default-24hgm" Apr 22 17:55:29.279129 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:55:29.278739 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert\") pod \"ingress-canary-jlfdb\" (UID: \"19df5bb9-b377-4d57-ac3d-bdf63a378385\") " pod="openshift-ingress-canary/ingress-canary-jlfdb" Apr 22 17:55:29.279129 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:55:29.278839 2527 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:55:29.279129 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:55:29.278913 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls podName:3668ec4c-282c-4be6-a5d4-90efe74d754b nodeName:}" failed. No retries permitted until 2026-04-22 17:56:33.278891691 +0000 UTC m=+161.242180738 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls") pod "dns-default-24hgm" (UID: "3668ec4c-282c-4be6-a5d4-90efe74d754b") : secret "dns-default-metrics-tls" not found Apr 22 17:55:29.279129 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:55:29.278851 2527 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:55:29.279129 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:55:29.278989 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert podName:19df5bb9-b377-4d57-ac3d-bdf63a378385 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:33.278977594 +0000 UTC m=+161.242266643 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert") pod "ingress-canary-jlfdb" (UID: "19df5bb9-b377-4d57-ac3d-bdf63a378385") : secret "canary-serving-cert" not found Apr 22 17:55:32.951915 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:55:32.951886 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qbfw5" Apr 22 17:56:02.513053 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:02.513009 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs\") pod \"network-metrics-daemon-4pcg6\" (UID: \"2e2c7888-11c8-4eb9-af23-d748c79bd567\") " pod="openshift-multus/network-metrics-daemon-4pcg6" Apr 22 17:56:02.513525 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:56:02.513153 2527 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 17:56:02.513525 ip-10-0-135-21 kubenswrapper[2527]: E0422 
17:56:02.513228 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs podName:2e2c7888-11c8-4eb9-af23-d748c79bd567 nodeName:}" failed. No retries permitted until 2026-04-22 17:58:04.513210011 +0000 UTC m=+252.476499059 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs") pod "network-metrics-daemon-4pcg6" (UID: "2e2c7888-11c8-4eb9-af23-d748c79bd567") : secret "metrics-daemon-secret" not found Apr 22 17:56:15.465393 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:15.465366 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qqd9n_71ff672a-e8ff-4269-8be1-4e451a7046df/dns-node-resolver/0.log" Apr 22 17:56:16.265855 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:16.265825 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zgpsv_bea09b4b-5a52-4d36-968d-10c68f944bee/node-ca/0.log" Apr 22 17:56:28.435509 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:56:28.435443 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-24hgm" podUID="3668ec4c-282c-4be6-a5d4-90efe74d754b" Apr 22 17:56:28.448604 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:56:28.448554 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-jlfdb" podUID="19df5bb9-b377-4d57-ac3d-bdf63a378385" Apr 22 17:56:29.147824 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:29.147795 2527 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-24hgm" Apr 22 17:56:29.711495 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:56:29.711445 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-4pcg6" podUID="2e2c7888-11c8-4eb9-af23-d748c79bd567" Apr 22 17:56:33.332664 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:33.332606 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert\") pod \"ingress-canary-jlfdb\" (UID: \"19df5bb9-b377-4d57-ac3d-bdf63a378385\") " pod="openshift-ingress-canary/ingress-canary-jlfdb" Apr 22 17:56:33.333076 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:33.332711 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls\") pod \"dns-default-24hgm\" (UID: \"3668ec4c-282c-4be6-a5d4-90efe74d754b\") " pod="openshift-dns/dns-default-24hgm" Apr 22 17:56:33.333076 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:56:33.332749 2527 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:56:33.333076 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:56:33.332804 2527 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:56:33.333076 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:56:33.332827 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert podName:19df5bb9-b377-4d57-ac3d-bdf63a378385 nodeName:}" failed. No retries permitted until 2026-04-22 17:58:35.332807722 +0000 UTC m=+283.296096779 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert") pod "ingress-canary-jlfdb" (UID: "19df5bb9-b377-4d57-ac3d-bdf63a378385") : secret "canary-serving-cert" not found Apr 22 17:56:33.333076 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:56:33.332847 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls podName:3668ec4c-282c-4be6-a5d4-90efe74d754b nodeName:}" failed. No retries permitted until 2026-04-22 17:58:35.33283865 +0000 UTC m=+283.296127711 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls") pod "dns-default-24hgm" (UID: "3668ec4c-282c-4be6-a5d4-90efe74d754b") : secret "dns-default-metrics-tls" not found Apr 22 17:56:40.694200 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:40.694158 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jlfdb" Apr 22 17:56:44.694367 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:44.694281 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4pcg6" Apr 22 17:56:48.265913 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.265871 2527 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-97npl"] Apr 22 17:56:48.267807 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.267789 2527 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-97npl" Apr 22 17:56:48.271693 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.271669 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 17:56:48.276198 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.276171 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-cl79w\"" Apr 22 17:56:48.276310 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.276205 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 17:56:48.276310 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.276257 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 17:56:48.276485 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.276470 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 17:56:48.285627 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.285603 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-97npl"] Apr 22 17:56:48.341983 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.341940 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cbbe9b18-4b0b-48db-aed3-04d33d5da7ba-data-volume\") pod \"insights-runtime-extractor-97npl\" (UID: \"cbbe9b18-4b0b-48db-aed3-04d33d5da7ba\") " pod="openshift-insights/insights-runtime-extractor-97npl" Apr 22 17:56:48.341983 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.341984 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cbbe9b18-4b0b-48db-aed3-04d33d5da7ba-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-97npl\" (UID: \"cbbe9b18-4b0b-48db-aed3-04d33d5da7ba\") " pod="openshift-insights/insights-runtime-extractor-97npl" Apr 22 17:56:48.342197 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.342016 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cbbe9b18-4b0b-48db-aed3-04d33d5da7ba-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-97npl\" (UID: \"cbbe9b18-4b0b-48db-aed3-04d33d5da7ba\") " pod="openshift-insights/insights-runtime-extractor-97npl" Apr 22 17:56:48.342197 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.342040 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4hqg\" (UniqueName: \"kubernetes.io/projected/cbbe9b18-4b0b-48db-aed3-04d33d5da7ba-kube-api-access-f4hqg\") pod \"insights-runtime-extractor-97npl\" (UID: \"cbbe9b18-4b0b-48db-aed3-04d33d5da7ba\") " pod="openshift-insights/insights-runtime-extractor-97npl" Apr 22 17:56:48.342197 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.342180 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cbbe9b18-4b0b-48db-aed3-04d33d5da7ba-crio-socket\") pod \"insights-runtime-extractor-97npl\" (UID: \"cbbe9b18-4b0b-48db-aed3-04d33d5da7ba\") " pod="openshift-insights/insights-runtime-extractor-97npl" Apr 22 17:56:48.374241 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.374208 2527 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7zlkb"] Apr 22 17:56:48.376155 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.376141 2527 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7zlkb" Apr 22 17:56:48.379595 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.379552 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-dxdwn\"" Apr 22 17:56:48.379764 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.379748 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 22 17:56:48.400051 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.400021 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7zlkb"] Apr 22 17:56:48.443385 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.443353 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cbbe9b18-4b0b-48db-aed3-04d33d5da7ba-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-97npl\" (UID: \"cbbe9b18-4b0b-48db-aed3-04d33d5da7ba\") " pod="openshift-insights/insights-runtime-extractor-97npl" Apr 22 17:56:48.443385 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.443387 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f4hqg\" (UniqueName: \"kubernetes.io/projected/cbbe9b18-4b0b-48db-aed3-04d33d5da7ba-kube-api-access-f4hqg\") pod \"insights-runtime-extractor-97npl\" (UID: \"cbbe9b18-4b0b-48db-aed3-04d33d5da7ba\") " pod="openshift-insights/insights-runtime-extractor-97npl" Apr 22 17:56:48.443630 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.443460 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/359d1847-f848-44fb-bc84-c8d778791fa3-tls-certificates\") pod 
\"prometheus-operator-admission-webhook-57cf98b594-7zlkb\" (UID: \"359d1847-f848-44fb-bc84-c8d778791fa3\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7zlkb" Apr 22 17:56:48.443630 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.443485 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cbbe9b18-4b0b-48db-aed3-04d33d5da7ba-crio-socket\") pod \"insights-runtime-extractor-97npl\" (UID: \"cbbe9b18-4b0b-48db-aed3-04d33d5da7ba\") " pod="openshift-insights/insights-runtime-extractor-97npl" Apr 22 17:56:48.443630 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.443512 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cbbe9b18-4b0b-48db-aed3-04d33d5da7ba-data-volume\") pod \"insights-runtime-extractor-97npl\" (UID: \"cbbe9b18-4b0b-48db-aed3-04d33d5da7ba\") " pod="openshift-insights/insights-runtime-extractor-97npl" Apr 22 17:56:48.443630 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.443588 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cbbe9b18-4b0b-48db-aed3-04d33d5da7ba-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-97npl\" (UID: \"cbbe9b18-4b0b-48db-aed3-04d33d5da7ba\") " pod="openshift-insights/insights-runtime-extractor-97npl" Apr 22 17:56:48.443630 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.443606 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cbbe9b18-4b0b-48db-aed3-04d33d5da7ba-crio-socket\") pod \"insights-runtime-extractor-97npl\" (UID: \"cbbe9b18-4b0b-48db-aed3-04d33d5da7ba\") " pod="openshift-insights/insights-runtime-extractor-97npl" Apr 22 17:56:48.443849 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.443831 2527 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cbbe9b18-4b0b-48db-aed3-04d33d5da7ba-data-volume\") pod \"insights-runtime-extractor-97npl\" (UID: \"cbbe9b18-4b0b-48db-aed3-04d33d5da7ba\") " pod="openshift-insights/insights-runtime-extractor-97npl" Apr 22 17:56:48.443957 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.443942 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cbbe9b18-4b0b-48db-aed3-04d33d5da7ba-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-97npl\" (UID: \"cbbe9b18-4b0b-48db-aed3-04d33d5da7ba\") " pod="openshift-insights/insights-runtime-extractor-97npl" Apr 22 17:56:48.446262 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.446246 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cbbe9b18-4b0b-48db-aed3-04d33d5da7ba-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-97npl\" (UID: \"cbbe9b18-4b0b-48db-aed3-04d33d5da7ba\") " pod="openshift-insights/insights-runtime-extractor-97npl" Apr 22 17:56:48.455504 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.455475 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4hqg\" (UniqueName: \"kubernetes.io/projected/cbbe9b18-4b0b-48db-aed3-04d33d5da7ba-kube-api-access-f4hqg\") pod \"insights-runtime-extractor-97npl\" (UID: \"cbbe9b18-4b0b-48db-aed3-04d33d5da7ba\") " pod="openshift-insights/insights-runtime-extractor-97npl" Apr 22 17:56:48.544001 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.543918 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/359d1847-f848-44fb-bc84-c8d778791fa3-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-7zlkb\" (UID: 
\"359d1847-f848-44fb-bc84-c8d778791fa3\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7zlkb" Apr 22 17:56:48.546271 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.546241 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/359d1847-f848-44fb-bc84-c8d778791fa3-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-7zlkb\" (UID: \"359d1847-f848-44fb-bc84-c8d778791fa3\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7zlkb" Apr 22 17:56:48.578915 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.578880 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-97npl" Apr 22 17:56:48.684662 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.684633 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7zlkb" Apr 22 17:56:48.697761 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.697741 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-97npl"] Apr 22 17:56:48.700235 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:56:48.700203 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbbe9b18_4b0b_48db_aed3_04d33d5da7ba.slice/crio-3207bbd196a7a91053ab781f37dce159605fd7f53bba638b0baabe0381bbcb73 WatchSource:0}: Error finding container 3207bbd196a7a91053ab781f37dce159605fd7f53bba638b0baabe0381bbcb73: Status 404 returned error can't find the container with id 3207bbd196a7a91053ab781f37dce159605fd7f53bba638b0baabe0381bbcb73 Apr 22 17:56:48.808950 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:48.808875 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7zlkb"] Apr 22 17:56:48.812334 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:56:48.812298 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod359d1847_f848_44fb_bc84_c8d778791fa3.slice/crio-53e1d2fa064511df719ef42e0819485d355a6bc28cd357483ee923371c6f6e77 WatchSource:0}: Error finding container 53e1d2fa064511df719ef42e0819485d355a6bc28cd357483ee923371c6f6e77: Status 404 returned error can't find the container with id 53e1d2fa064511df719ef42e0819485d355a6bc28cd357483ee923371c6f6e77 Apr 22 17:56:49.196509 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:49.196415 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7zlkb" event={"ID":"359d1847-f848-44fb-bc84-c8d778791fa3","Type":"ContainerStarted","Data":"53e1d2fa064511df719ef42e0819485d355a6bc28cd357483ee923371c6f6e77"} Apr 22 17:56:49.197914 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:49.197875 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-97npl" event={"ID":"cbbe9b18-4b0b-48db-aed3-04d33d5da7ba","Type":"ContainerStarted","Data":"cb57ac7f2113b4cdb2fc7cd8430dc599e64c247951126dcef496015ebbf31b37"} Apr 22 17:56:49.197914 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:49.197910 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-97npl" event={"ID":"cbbe9b18-4b0b-48db-aed3-04d33d5da7ba","Type":"ContainerStarted","Data":"3207bbd196a7a91053ab781f37dce159605fd7f53bba638b0baabe0381bbcb73"} Apr 22 17:56:50.201984 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:50.201949 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-97npl" 
event={"ID":"cbbe9b18-4b0b-48db-aed3-04d33d5da7ba","Type":"ContainerStarted","Data":"9d7c46dd72e11e8c5ded7ecf0e2924b74bfe425817e70b1537a9d9b745717973"} Apr 22 17:56:50.203343 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:50.203315 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7zlkb" event={"ID":"359d1847-f848-44fb-bc84-c8d778791fa3","Type":"ContainerStarted","Data":"39abfab11274f8c41b0060edad35bdade9a3f3fb36c75081afa394aca38a6809"} Apr 22 17:56:50.203554 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:50.203526 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7zlkb" Apr 22 17:56:50.210344 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:50.210312 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7zlkb" Apr 22 17:56:50.236938 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:50.236879 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7zlkb" podStartSLOduration=1.324358132 podStartE2EDuration="2.236863624s" podCreationTimestamp="2026-04-22 17:56:48 +0000 UTC" firstStartedPulling="2026-04-22 17:56:48.814342768 +0000 UTC m=+176.777631816" lastFinishedPulling="2026-04-22 17:56:49.726848246 +0000 UTC m=+177.690137308" observedRunningTime="2026-04-22 17:56:50.236847695 +0000 UTC m=+178.200136765" watchObservedRunningTime="2026-04-22 17:56:50.236863624 +0000 UTC m=+178.200152701" Apr 22 17:56:51.207186 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:51.207154 2527 generic.go:358] "Generic (PLEG): container finished" podID="3024a732-80d1-47c9-b5ca-1c6ef741dce0" containerID="05024c40daeb9049355c198e30d0814c06827e3cdb9ede35d9811eff58328f42" exitCode=255 Apr 22 17:56:51.207657 ip-10-0-135-21 
kubenswrapper[2527]: I0422 17:56:51.207222 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b8fcd7789-l7p6r" event={"ID":"3024a732-80d1-47c9-b5ca-1c6ef741dce0","Type":"ContainerDied","Data":"05024c40daeb9049355c198e30d0814c06827e3cdb9ede35d9811eff58328f42"} Apr 22 17:56:51.207657 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:51.207538 2527 scope.go:117] "RemoveContainer" containerID="05024c40daeb9049355c198e30d0814c06827e3cdb9ede35d9811eff58328f42" Apr 22 17:56:51.209089 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:51.209057 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-97npl" event={"ID":"cbbe9b18-4b0b-48db-aed3-04d33d5da7ba","Type":"ContainerStarted","Data":"714d024f9ac0ad98c41edca8a7643889b3cff635283e66bdf278b00d98553176"} Apr 22 17:56:51.258910 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:51.258868 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-97npl" podStartSLOduration=1.3121886790000001 podStartE2EDuration="3.25884974s" podCreationTimestamp="2026-04-22 17:56:48 +0000 UTC" firstStartedPulling="2026-04-22 17:56:48.754810208 +0000 UTC m=+176.718099259" lastFinishedPulling="2026-04-22 17:56:50.701471271 +0000 UTC m=+178.664760320" observedRunningTime="2026-04-22 17:56:51.257812513 +0000 UTC m=+179.221101627" watchObservedRunningTime="2026-04-22 17:56:51.25884974 +0000 UTC m=+179.222138860" Apr 22 17:56:52.214669 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:52.214627 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b8fcd7789-l7p6r" event={"ID":"3024a732-80d1-47c9-b5ca-1c6ef741dce0","Type":"ContainerStarted","Data":"0fefc5fd0eb8bfc93d554a7bae0f4e45146174e3d37da8aea34a265e53135834"} Apr 22 17:56:56.676198 ip-10-0-135-21 kubenswrapper[2527]: I0422 
17:56:56.676159 2527 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-b9dt5"] Apr 22 17:56:56.678342 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.678322 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9dt5" Apr 22 17:56:56.680833 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.680811 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 22 17:56:56.680979 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.680951 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 22 17:56:56.681092 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.681076 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-qjtbz\"" Apr 22 17:56:56.682056 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.682032 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 17:56:56.682056 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.682040 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 17:56:56.682056 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.682051 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 17:56:56.697545 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.697523 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-b9dt5"] Apr 22 17:56:56.705680 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.705659 2527 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-monitoring/node-exporter-955cw"] Apr 22 17:56:56.707516 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.707504 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-955cw" Apr 22 17:56:56.710011 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.709992 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 17:56:56.710136 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.710013 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7s4kf\"" Apr 22 17:56:56.710203 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.710148 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 17:56:56.710311 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.710290 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 17:56:56.809660 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.809627 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cca0b693-77d9-4958-aa18-7035cae26b0f-metrics-client-ca\") pod \"node-exporter-955cw\" (UID: \"cca0b693-77d9-4958-aa18-7035cae26b0f\") " pod="openshift-monitoring/node-exporter-955cw" Apr 22 17:56:56.809660 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.809664 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cca0b693-77d9-4958-aa18-7035cae26b0f-root\") pod \"node-exporter-955cw\" (UID: \"cca0b693-77d9-4958-aa18-7035cae26b0f\") " pod="openshift-monitoring/node-exporter-955cw" Apr 22 
17:56:56.809873 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.809682 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cca0b693-77d9-4958-aa18-7035cae26b0f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-955cw\" (UID: \"cca0b693-77d9-4958-aa18-7035cae26b0f\") " pod="openshift-monitoring/node-exporter-955cw" Apr 22 17:56:56.809873 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.809745 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e5593f1c-81f8-4b31-99a7-9d09d22babb0-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-b9dt5\" (UID: \"e5593f1c-81f8-4b31-99a7-9d09d22babb0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9dt5" Apr 22 17:56:56.809873 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.809795 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cca0b693-77d9-4958-aa18-7035cae26b0f-node-exporter-textfile\") pod \"node-exporter-955cw\" (UID: \"cca0b693-77d9-4958-aa18-7035cae26b0f\") " pod="openshift-monitoring/node-exporter-955cw" Apr 22 17:56:56.809873 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.809836 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cca0b693-77d9-4958-aa18-7035cae26b0f-node-exporter-wtmp\") pod \"node-exporter-955cw\" (UID: \"cca0b693-77d9-4958-aa18-7035cae26b0f\") " pod="openshift-monitoring/node-exporter-955cw" Apr 22 17:56:56.809873 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.809868 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/cca0b693-77d9-4958-aa18-7035cae26b0f-sys\") pod \"node-exporter-955cw\" (UID: \"cca0b693-77d9-4958-aa18-7035cae26b0f\") " pod="openshift-monitoring/node-exporter-955cw" Apr 22 17:56:56.810023 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.809916 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr894\" (UniqueName: \"kubernetes.io/projected/e5593f1c-81f8-4b31-99a7-9d09d22babb0-kube-api-access-kr894\") pod \"openshift-state-metrics-9d44df66c-b9dt5\" (UID: \"e5593f1c-81f8-4b31-99a7-9d09d22babb0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9dt5" Apr 22 17:56:56.810023 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.809938 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cca0b693-77d9-4958-aa18-7035cae26b0f-node-exporter-accelerators-collector-config\") pod \"node-exporter-955cw\" (UID: \"cca0b693-77d9-4958-aa18-7035cae26b0f\") " pod="openshift-monitoring/node-exporter-955cw" Apr 22 17:56:56.810023 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.809953 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7h4b\" (UniqueName: \"kubernetes.io/projected/cca0b693-77d9-4958-aa18-7035cae26b0f-kube-api-access-p7h4b\") pod \"node-exporter-955cw\" (UID: \"cca0b693-77d9-4958-aa18-7035cae26b0f\") " pod="openshift-monitoring/node-exporter-955cw" Apr 22 17:56:56.810023 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.809972 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cca0b693-77d9-4958-aa18-7035cae26b0f-node-exporter-tls\") pod \"node-exporter-955cw\" (UID: \"cca0b693-77d9-4958-aa18-7035cae26b0f\") " pod="openshift-monitoring/node-exporter-955cw" 
Apr 22 17:56:56.810023 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.810017 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e5593f1c-81f8-4b31-99a7-9d09d22babb0-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-b9dt5\" (UID: \"e5593f1c-81f8-4b31-99a7-9d09d22babb0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9dt5" Apr 22 17:56:56.810212 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.810046 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e5593f1c-81f8-4b31-99a7-9d09d22babb0-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-b9dt5\" (UID: \"e5593f1c-81f8-4b31-99a7-9d09d22babb0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9dt5" Apr 22 17:56:56.911139 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.911085 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cca0b693-77d9-4958-aa18-7035cae26b0f-node-exporter-textfile\") pod \"node-exporter-955cw\" (UID: \"cca0b693-77d9-4958-aa18-7035cae26b0f\") " pod="openshift-monitoring/node-exporter-955cw" Apr 22 17:56:56.911139 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.911146 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cca0b693-77d9-4958-aa18-7035cae26b0f-node-exporter-wtmp\") pod \"node-exporter-955cw\" (UID: \"cca0b693-77d9-4958-aa18-7035cae26b0f\") " pod="openshift-monitoring/node-exporter-955cw" Apr 22 17:56:56.911357 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.911181 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/cca0b693-77d9-4958-aa18-7035cae26b0f-sys\") pod \"node-exporter-955cw\" (UID: \"cca0b693-77d9-4958-aa18-7035cae26b0f\") " pod="openshift-monitoring/node-exporter-955cw" Apr 22 17:56:56.911357 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.911215 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kr894\" (UniqueName: \"kubernetes.io/projected/e5593f1c-81f8-4b31-99a7-9d09d22babb0-kube-api-access-kr894\") pod \"openshift-state-metrics-9d44df66c-b9dt5\" (UID: \"e5593f1c-81f8-4b31-99a7-9d09d22babb0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9dt5" Apr 22 17:56:56.911357 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.911245 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cca0b693-77d9-4958-aa18-7035cae26b0f-node-exporter-accelerators-collector-config\") pod \"node-exporter-955cw\" (UID: \"cca0b693-77d9-4958-aa18-7035cae26b0f\") " pod="openshift-monitoring/node-exporter-955cw" Apr 22 17:56:56.911357 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.911274 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7h4b\" (UniqueName: \"kubernetes.io/projected/cca0b693-77d9-4958-aa18-7035cae26b0f-kube-api-access-p7h4b\") pod \"node-exporter-955cw\" (UID: \"cca0b693-77d9-4958-aa18-7035cae26b0f\") " pod="openshift-monitoring/node-exporter-955cw" Apr 22 17:56:56.911357 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.911300 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cca0b693-77d9-4958-aa18-7035cae26b0f-node-exporter-tls\") pod \"node-exporter-955cw\" (UID: \"cca0b693-77d9-4958-aa18-7035cae26b0f\") " pod="openshift-monitoring/node-exporter-955cw" Apr 22 17:56:56.911357 ip-10-0-135-21 kubenswrapper[2527]: I0422 
17:56:56.911314 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cca0b693-77d9-4958-aa18-7035cae26b0f-sys\") pod \"node-exporter-955cw\" (UID: \"cca0b693-77d9-4958-aa18-7035cae26b0f\") " pod="openshift-monitoring/node-exporter-955cw" Apr 22 17:56:56.911357 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.911329 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e5593f1c-81f8-4b31-99a7-9d09d22babb0-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-b9dt5\" (UID: \"e5593f1c-81f8-4b31-99a7-9d09d22babb0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9dt5" Apr 22 17:56:56.911357 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.911353 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cca0b693-77d9-4958-aa18-7035cae26b0f-node-exporter-wtmp\") pod \"node-exporter-955cw\" (UID: \"cca0b693-77d9-4958-aa18-7035cae26b0f\") " pod="openshift-monitoring/node-exporter-955cw" Apr 22 17:56:56.911769 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.911369 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e5593f1c-81f8-4b31-99a7-9d09d22babb0-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-b9dt5\" (UID: \"e5593f1c-81f8-4b31-99a7-9d09d22babb0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9dt5" Apr 22 17:56:56.911769 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.911431 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cca0b693-77d9-4958-aa18-7035cae26b0f-metrics-client-ca\") pod \"node-exporter-955cw\" (UID: 
\"cca0b693-77d9-4958-aa18-7035cae26b0f\") " pod="openshift-monitoring/node-exporter-955cw" Apr 22 17:56:56.911769 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.911467 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cca0b693-77d9-4958-aa18-7035cae26b0f-root\") pod \"node-exporter-955cw\" (UID: \"cca0b693-77d9-4958-aa18-7035cae26b0f\") " pod="openshift-monitoring/node-exporter-955cw" Apr 22 17:56:56.911769 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.911508 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cca0b693-77d9-4958-aa18-7035cae26b0f-node-exporter-textfile\") pod \"node-exporter-955cw\" (UID: \"cca0b693-77d9-4958-aa18-7035cae26b0f\") " pod="openshift-monitoring/node-exporter-955cw" Apr 22 17:56:56.911769 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.911512 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cca0b693-77d9-4958-aa18-7035cae26b0f-root\") pod \"node-exporter-955cw\" (UID: \"cca0b693-77d9-4958-aa18-7035cae26b0f\") " pod="openshift-monitoring/node-exporter-955cw" Apr 22 17:56:56.911769 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.911553 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cca0b693-77d9-4958-aa18-7035cae26b0f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-955cw\" (UID: \"cca0b693-77d9-4958-aa18-7035cae26b0f\") " pod="openshift-monitoring/node-exporter-955cw" Apr 22 17:56:56.911769 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.911611 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e5593f1c-81f8-4b31-99a7-9d09d22babb0-openshift-state-metrics-tls\") pod 
\"openshift-state-metrics-9d44df66c-b9dt5\" (UID: \"e5593f1c-81f8-4b31-99a7-9d09d22babb0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9dt5" Apr 22 17:56:56.911769 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:56:56.911699 2527 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 22 17:56:56.911769 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:56:56.911762 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5593f1c-81f8-4b31-99a7-9d09d22babb0-openshift-state-metrics-tls podName:e5593f1c-81f8-4b31-99a7-9d09d22babb0 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:57.411743488 +0000 UTC m=+185.375032539 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/e5593f1c-81f8-4b31-99a7-9d09d22babb0-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-b9dt5" (UID: "e5593f1c-81f8-4b31-99a7-9d09d22babb0") : secret "openshift-state-metrics-tls" not found Apr 22 17:56:56.912194 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:56:56.911778 2527 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 17:56:56.912194 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:56:56.911860 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cca0b693-77d9-4958-aa18-7035cae26b0f-node-exporter-tls podName:cca0b693-77d9-4958-aa18-7035cae26b0f nodeName:}" failed. No retries permitted until 2026-04-22 17:56:57.411836428 +0000 UTC m=+185.375125479 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/cca0b693-77d9-4958-aa18-7035cae26b0f-node-exporter-tls") pod "node-exporter-955cw" (UID: "cca0b693-77d9-4958-aa18-7035cae26b0f") : secret "node-exporter-tls" not found Apr 22 17:56:56.912194 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.911893 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cca0b693-77d9-4958-aa18-7035cae26b0f-node-exporter-accelerators-collector-config\") pod \"node-exporter-955cw\" (UID: \"cca0b693-77d9-4958-aa18-7035cae26b0f\") " pod="openshift-monitoring/node-exporter-955cw" Apr 22 17:56:56.912194 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.911983 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cca0b693-77d9-4958-aa18-7035cae26b0f-metrics-client-ca\") pod \"node-exporter-955cw\" (UID: \"cca0b693-77d9-4958-aa18-7035cae26b0f\") " pod="openshift-monitoring/node-exporter-955cw" Apr 22 17:56:56.912194 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.912108 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e5593f1c-81f8-4b31-99a7-9d09d22babb0-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-b9dt5\" (UID: \"e5593f1c-81f8-4b31-99a7-9d09d22babb0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9dt5" Apr 22 17:56:56.913842 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.913805 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cca0b693-77d9-4958-aa18-7035cae26b0f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-955cw\" (UID: \"cca0b693-77d9-4958-aa18-7035cae26b0f\") " pod="openshift-monitoring/node-exporter-955cw" Apr 22 
17:56:56.913939 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.913844 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e5593f1c-81f8-4b31-99a7-9d09d22babb0-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-b9dt5\" (UID: \"e5593f1c-81f8-4b31-99a7-9d09d22babb0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9dt5" Apr 22 17:56:56.923028 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.923001 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7h4b\" (UniqueName: \"kubernetes.io/projected/cca0b693-77d9-4958-aa18-7035cae26b0f-kube-api-access-p7h4b\") pod \"node-exporter-955cw\" (UID: \"cca0b693-77d9-4958-aa18-7035cae26b0f\") " pod="openshift-monitoring/node-exporter-955cw" Apr 22 17:56:56.924778 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:56.924757 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr894\" (UniqueName: \"kubernetes.io/projected/e5593f1c-81f8-4b31-99a7-9d09d22babb0-kube-api-access-kr894\") pod \"openshift-state-metrics-9d44df66c-b9dt5\" (UID: \"e5593f1c-81f8-4b31-99a7-9d09d22babb0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9dt5" Apr 22 17:56:57.416693 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:57.416661 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e5593f1c-81f8-4b31-99a7-9d09d22babb0-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-b9dt5\" (UID: \"e5593f1c-81f8-4b31-99a7-9d09d22babb0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9dt5" Apr 22 17:56:57.416862 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:57.416741 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/cca0b693-77d9-4958-aa18-7035cae26b0f-node-exporter-tls\") pod \"node-exporter-955cw\" (UID: \"cca0b693-77d9-4958-aa18-7035cae26b0f\") " pod="openshift-monitoring/node-exporter-955cw" Apr 22 17:56:57.418994 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:57.418965 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cca0b693-77d9-4958-aa18-7035cae26b0f-node-exporter-tls\") pod \"node-exporter-955cw\" (UID: \"cca0b693-77d9-4958-aa18-7035cae26b0f\") " pod="openshift-monitoring/node-exporter-955cw" Apr 22 17:56:57.419109 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:57.419043 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e5593f1c-81f8-4b31-99a7-9d09d22babb0-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-b9dt5\" (UID: \"e5593f1c-81f8-4b31-99a7-9d09d22babb0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9dt5" Apr 22 17:56:57.587333 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:57.587286 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9dt5" Apr 22 17:56:57.616488 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:57.616459 2527 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-955cw" Apr 22 17:56:57.625640 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:56:57.625594 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcca0b693_77d9_4958_aa18_7035cae26b0f.slice/crio-e1f854bc0e4f5e1aa105e48e0e3c5c0193d1140690e394e037f628ac7671d83d WatchSource:0}: Error finding container e1f854bc0e4f5e1aa105e48e0e3c5c0193d1140690e394e037f628ac7671d83d: Status 404 returned error can't find the container with id e1f854bc0e4f5e1aa105e48e0e3c5c0193d1140690e394e037f628ac7671d83d Apr 22 17:56:57.705223 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:57.705140 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-b9dt5"] Apr 22 17:56:57.707973 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:56:57.707931 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5593f1c_81f8_4b31_99a7_9d09d22babb0.slice/crio-9f9e4dd2f92785d0ea57386b2cb490f6602e64db260e2d0a911976575a183c4d WatchSource:0}: Error finding container 9f9e4dd2f92785d0ea57386b2cb490f6602e64db260e2d0a911976575a183c4d: Status 404 returned error can't find the container with id 9f9e4dd2f92785d0ea57386b2cb490f6602e64db260e2d0a911976575a183c4d Apr 22 17:56:58.232581 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:58.232531 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9dt5" event={"ID":"e5593f1c-81f8-4b31-99a7-9d09d22babb0","Type":"ContainerStarted","Data":"ede8b96345a260482dd7f4a61b66c56c2e2cd21b53af599bee149ef5cac2d5e9"} Apr 22 17:56:58.232777 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:58.232595 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9dt5" 
event={"ID":"e5593f1c-81f8-4b31-99a7-9d09d22babb0","Type":"ContainerStarted","Data":"41f3f2b73ea9a394a8e47d8f32b4f1caa871d548baffbb21228831f355ac3327"} Apr 22 17:56:58.232777 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:58.232611 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9dt5" event={"ID":"e5593f1c-81f8-4b31-99a7-9d09d22babb0","Type":"ContainerStarted","Data":"9f9e4dd2f92785d0ea57386b2cb490f6602e64db260e2d0a911976575a183c4d"} Apr 22 17:56:58.233625 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:58.233595 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-955cw" event={"ID":"cca0b693-77d9-4958-aa18-7035cae26b0f","Type":"ContainerStarted","Data":"e1f854bc0e4f5e1aa105e48e0e3c5c0193d1140690e394e037f628ac7671d83d"} Apr 22 17:56:59.238336 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:59.238246 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9dt5" event={"ID":"e5593f1c-81f8-4b31-99a7-9d09d22babb0","Type":"ContainerStarted","Data":"a7d4434d40311345c01872204f55d4b5f6043ce1400fa0b502462a51787d487e"} Apr 22 17:56:59.239712 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:59.239688 2527 generic.go:358] "Generic (PLEG): container finished" podID="cca0b693-77d9-4958-aa18-7035cae26b0f" containerID="754b1f84d27ee91e27f1a44697c3db48ac5417dd06a9bafb81f298b0b8317d1f" exitCode=0 Apr 22 17:56:59.239809 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:59.239724 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-955cw" event={"ID":"cca0b693-77d9-4958-aa18-7035cae26b0f","Type":"ContainerDied","Data":"754b1f84d27ee91e27f1a44697c3db48ac5417dd06a9bafb81f298b0b8317d1f"} Apr 22 17:56:59.272523 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:56:59.272484 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9dt5" podStartSLOduration=2.271404407 podStartE2EDuration="3.272469845s" podCreationTimestamp="2026-04-22 17:56:56 +0000 UTC" firstStartedPulling="2026-04-22 17:56:57.812256252 +0000 UTC m=+185.775545300" lastFinishedPulling="2026-04-22 17:56:58.813321689 +0000 UTC m=+186.776610738" observedRunningTime="2026-04-22 17:56:59.271406026 +0000 UTC m=+187.234695105" watchObservedRunningTime="2026-04-22 17:56:59.272469845 +0000 UTC m=+187.235758914" Apr 22 17:57:00.243887 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:00.243852 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-955cw" event={"ID":"cca0b693-77d9-4958-aa18-7035cae26b0f","Type":"ContainerStarted","Data":"a86ba831666749f9935ac430aee97d1ddfc16bb5a77242e5daf709b5510907af"} Apr 22 17:57:00.243887 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:00.243894 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-955cw" event={"ID":"cca0b693-77d9-4958-aa18-7035cae26b0f","Type":"ContainerStarted","Data":"f03d555d0737bd23d4ec1838770382a59eb6edd1fc97e1e2cd807f7dd011ad9b"} Apr 22 17:57:00.265872 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:00.265816 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-955cw" podStartSLOduration=3.600041803 podStartE2EDuration="4.265803605s" podCreationTimestamp="2026-04-22 17:56:56 +0000 UTC" firstStartedPulling="2026-04-22 17:56:57.627364482 +0000 UTC m=+185.590653534" lastFinishedPulling="2026-04-22 17:56:58.293126272 +0000 UTC m=+186.256415336" observedRunningTime="2026-04-22 17:57:00.263942845 +0000 UTC m=+188.227231915" watchObservedRunningTime="2026-04-22 17:57:00.265803605 +0000 UTC m=+188.229092720" Apr 22 17:57:01.112491 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.112458 2527 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/metrics-server-9b5c68cb9-8dr7k"] Apr 22 17:57:01.114277 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.114262 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" Apr 22 17:57:01.121484 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.121459 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 22 17:57:01.121630 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.121488 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 22 17:57:01.121630 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.121521 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-992sf79296d5g\"" Apr 22 17:57:01.121630 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.121525 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 22 17:57:01.121837 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.121631 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 17:57:01.121931 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.121887 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-nw2kf\"" Apr 22 17:57:01.128145 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.128122 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-9b5c68cb9-8dr7k"] Apr 22 17:57:01.252691 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.252657 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/7c9a4cc5-cc72-4918-aef8-6fb7a25e791a-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-9b5c68cb9-8dr7k\" (UID: \"7c9a4cc5-cc72-4918-aef8-6fb7a25e791a\") " pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" Apr 22 17:57:01.253035 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.252703 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/7c9a4cc5-cc72-4918-aef8-6fb7a25e791a-audit-log\") pod \"metrics-server-9b5c68cb9-8dr7k\" (UID: \"7c9a4cc5-cc72-4918-aef8-6fb7a25e791a\") " pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" Apr 22 17:57:01.253035 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.252724 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/7c9a4cc5-cc72-4918-aef8-6fb7a25e791a-metrics-server-audit-profiles\") pod \"metrics-server-9b5c68cb9-8dr7k\" (UID: \"7c9a4cc5-cc72-4918-aef8-6fb7a25e791a\") " pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" Apr 22 17:57:01.253035 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.252780 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/7c9a4cc5-cc72-4918-aef8-6fb7a25e791a-secret-metrics-server-tls\") pod \"metrics-server-9b5c68cb9-8dr7k\" (UID: \"7c9a4cc5-cc72-4918-aef8-6fb7a25e791a\") " pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" Apr 22 17:57:01.253035 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.252800 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/7c9a4cc5-cc72-4918-aef8-6fb7a25e791a-secret-metrics-server-client-certs\") pod \"metrics-server-9b5c68cb9-8dr7k\" (UID: 
\"7c9a4cc5-cc72-4918-aef8-6fb7a25e791a\") " pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" Apr 22 17:57:01.253035 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.252859 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z8n2\" (UniqueName: \"kubernetes.io/projected/7c9a4cc5-cc72-4918-aef8-6fb7a25e791a-kube-api-access-9z8n2\") pod \"metrics-server-9b5c68cb9-8dr7k\" (UID: \"7c9a4cc5-cc72-4918-aef8-6fb7a25e791a\") " pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" Apr 22 17:57:01.253035 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.252910 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c9a4cc5-cc72-4918-aef8-6fb7a25e791a-client-ca-bundle\") pod \"metrics-server-9b5c68cb9-8dr7k\" (UID: \"7c9a4cc5-cc72-4918-aef8-6fb7a25e791a\") " pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" Apr 22 17:57:01.353676 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.353640 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9z8n2\" (UniqueName: \"kubernetes.io/projected/7c9a4cc5-cc72-4918-aef8-6fb7a25e791a-kube-api-access-9z8n2\") pod \"metrics-server-9b5c68cb9-8dr7k\" (UID: \"7c9a4cc5-cc72-4918-aef8-6fb7a25e791a\") " pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" Apr 22 17:57:01.353981 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.353959 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c9a4cc5-cc72-4918-aef8-6fb7a25e791a-client-ca-bundle\") pod \"metrics-server-9b5c68cb9-8dr7k\" (UID: \"7c9a4cc5-cc72-4918-aef8-6fb7a25e791a\") " pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" Apr 22 17:57:01.354067 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.353994 2527 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c9a4cc5-cc72-4918-aef8-6fb7a25e791a-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-9b5c68cb9-8dr7k\" (UID: \"7c9a4cc5-cc72-4918-aef8-6fb7a25e791a\") " pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" Apr 22 17:57:01.354067 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.354038 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/7c9a4cc5-cc72-4918-aef8-6fb7a25e791a-audit-log\") pod \"metrics-server-9b5c68cb9-8dr7k\" (UID: \"7c9a4cc5-cc72-4918-aef8-6fb7a25e791a\") " pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" Apr 22 17:57:01.354067 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.354057 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/7c9a4cc5-cc72-4918-aef8-6fb7a25e791a-metrics-server-audit-profiles\") pod \"metrics-server-9b5c68cb9-8dr7k\" (UID: \"7c9a4cc5-cc72-4918-aef8-6fb7a25e791a\") " pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" Apr 22 17:57:01.354518 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.354251 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/7c9a4cc5-cc72-4918-aef8-6fb7a25e791a-secret-metrics-server-tls\") pod \"metrics-server-9b5c68cb9-8dr7k\" (UID: \"7c9a4cc5-cc72-4918-aef8-6fb7a25e791a\") " pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" Apr 22 17:57:01.354518 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.354292 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/7c9a4cc5-cc72-4918-aef8-6fb7a25e791a-secret-metrics-server-client-certs\") pod 
\"metrics-server-9b5c68cb9-8dr7k\" (UID: \"7c9a4cc5-cc72-4918-aef8-6fb7a25e791a\") " pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" Apr 22 17:57:01.354706 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.354548 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/7c9a4cc5-cc72-4918-aef8-6fb7a25e791a-audit-log\") pod \"metrics-server-9b5c68cb9-8dr7k\" (UID: \"7c9a4cc5-cc72-4918-aef8-6fb7a25e791a\") " pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" Apr 22 17:57:01.355002 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.354966 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c9a4cc5-cc72-4918-aef8-6fb7a25e791a-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-9b5c68cb9-8dr7k\" (UID: \"7c9a4cc5-cc72-4918-aef8-6fb7a25e791a\") " pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" Apr 22 17:57:01.355124 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.355097 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/7c9a4cc5-cc72-4918-aef8-6fb7a25e791a-metrics-server-audit-profiles\") pod \"metrics-server-9b5c68cb9-8dr7k\" (UID: \"7c9a4cc5-cc72-4918-aef8-6fb7a25e791a\") " pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" Apr 22 17:57:01.356453 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.356436 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c9a4cc5-cc72-4918-aef8-6fb7a25e791a-client-ca-bundle\") pod \"metrics-server-9b5c68cb9-8dr7k\" (UID: \"7c9a4cc5-cc72-4918-aef8-6fb7a25e791a\") " pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" Apr 22 17:57:01.356644 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.356630 2527 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/7c9a4cc5-cc72-4918-aef8-6fb7a25e791a-secret-metrics-server-tls\") pod \"metrics-server-9b5c68cb9-8dr7k\" (UID: \"7c9a4cc5-cc72-4918-aef8-6fb7a25e791a\") " pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" Apr 22 17:57:01.356779 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.356764 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/7c9a4cc5-cc72-4918-aef8-6fb7a25e791a-secret-metrics-server-client-certs\") pod \"metrics-server-9b5c68cb9-8dr7k\" (UID: \"7c9a4cc5-cc72-4918-aef8-6fb7a25e791a\") " pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" Apr 22 17:57:01.362005 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.361984 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z8n2\" (UniqueName: \"kubernetes.io/projected/7c9a4cc5-cc72-4918-aef8-6fb7a25e791a-kube-api-access-9z8n2\") pod \"metrics-server-9b5c68cb9-8dr7k\" (UID: \"7c9a4cc5-cc72-4918-aef8-6fb7a25e791a\") " pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" Apr 22 17:57:01.422773 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.422706 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" Apr 22 17:57:01.452780 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.452737 2527 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-kn55x"] Apr 22 17:57:01.455898 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.455875 2527 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kn55x" Apr 22 17:57:01.458378 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.458295 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 22 17:57:01.458612 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.458540 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-55qdk\"" Apr 22 17:57:01.464836 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.464812 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-kn55x"] Apr 22 17:57:01.540455 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.540421 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-9b5c68cb9-8dr7k"] Apr 22 17:57:01.543029 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:57:01.543004 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c9a4cc5_cc72_4918_aef8_6fb7a25e791a.slice/crio-52d1e594183304d52909d4c1b2b27c2150650cb439822714756879c807e8418d WatchSource:0}: Error finding container 52d1e594183304d52909d4c1b2b27c2150650cb439822714756879c807e8418d: Status 404 returned error can't find the container with id 52d1e594183304d52909d4c1b2b27c2150650cb439822714756879c807e8418d Apr 22 17:57:01.556118 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.556098 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f83efb09-abff-4bea-897d-73a5f78393b6-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-kn55x\" (UID: \"f83efb09-abff-4bea-897d-73a5f78393b6\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kn55x" Apr 22 17:57:01.656705 ip-10-0-135-21 kubenswrapper[2527]: I0422 
17:57:01.656670 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f83efb09-abff-4bea-897d-73a5f78393b6-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-kn55x\" (UID: \"f83efb09-abff-4bea-897d-73a5f78393b6\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kn55x" Apr 22 17:57:01.658924 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.658900 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f83efb09-abff-4bea-897d-73a5f78393b6-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-kn55x\" (UID: \"f83efb09-abff-4bea-897d-73a5f78393b6\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kn55x" Apr 22 17:57:01.765605 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.765557 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kn55x" Apr 22 17:57:01.876689 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:01.876659 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-kn55x"] Apr 22 17:57:01.879460 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:57:01.879432 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf83efb09_abff_4bea_897d_73a5f78393b6.slice/crio-7ff2e22d411e7f6926d66d4f47ebdc8ee955d8b942e040be07f987a2f1af273b WatchSource:0}: Error finding container 7ff2e22d411e7f6926d66d4f47ebdc8ee955d8b942e040be07f987a2f1af273b: Status 404 returned error can't find the container with id 7ff2e22d411e7f6926d66d4f47ebdc8ee955d8b942e040be07f987a2f1af273b Apr 22 17:57:02.252059 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:02.251965 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" 
event={"ID":"7c9a4cc5-cc72-4918-aef8-6fb7a25e791a","Type":"ContainerStarted","Data":"52d1e594183304d52909d4c1b2b27c2150650cb439822714756879c807e8418d"} Apr 22 17:57:02.253048 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:02.253022 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kn55x" event={"ID":"f83efb09-abff-4bea-897d-73a5f78393b6","Type":"ContainerStarted","Data":"7ff2e22d411e7f6926d66d4f47ebdc8ee955d8b942e040be07f987a2f1af273b"} Apr 22 17:57:03.257192 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:03.257148 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" event={"ID":"7c9a4cc5-cc72-4918-aef8-6fb7a25e791a","Type":"ContainerStarted","Data":"9ec435059122b0d8b54fecb7ca3c0221d7829a4a2f1bd8409d35a22b90be00cd"} Apr 22 17:57:03.258402 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:03.258378 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kn55x" event={"ID":"f83efb09-abff-4bea-897d-73a5f78393b6","Type":"ContainerStarted","Data":"a4d29c315bed11720d4bc9a21c62286b880c3d3752d3db62cdf504b6566ac3a2"} Apr 22 17:57:03.258626 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:03.258607 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kn55x" Apr 22 17:57:03.263632 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:03.263614 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kn55x" Apr 22 17:57:03.274558 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:03.274518 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" podStartSLOduration=0.702689945 podStartE2EDuration="2.274505793s" podCreationTimestamp="2026-04-22 17:57:01 +0000 UTC" firstStartedPulling="2026-04-22 
17:57:01.544813656 +0000 UTC m=+189.508102705" lastFinishedPulling="2026-04-22 17:57:03.11662949 +0000 UTC m=+191.079918553" observedRunningTime="2026-04-22 17:57:03.274174569 +0000 UTC m=+191.237463640" watchObservedRunningTime="2026-04-22 17:57:03.274505793 +0000 UTC m=+191.237794863" Apr 22 17:57:03.289655 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:03.289612 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kn55x" podStartSLOduration=1.053888773 podStartE2EDuration="2.289555855s" podCreationTimestamp="2026-04-22 17:57:01 +0000 UTC" firstStartedPulling="2026-04-22 17:57:01.881269199 +0000 UTC m=+189.844558250" lastFinishedPulling="2026-04-22 17:57:03.116936269 +0000 UTC m=+191.080225332" observedRunningTime="2026-04-22 17:57:03.288643269 +0000 UTC m=+191.251932361" watchObservedRunningTime="2026-04-22 17:57:03.289555855 +0000 UTC m=+191.252844925" Apr 22 17:57:13.264048 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.264015 2527 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-55f785678b-b5h2l"] Apr 22 17:57:13.265924 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.265907 2527 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55f785678b-b5h2l" Apr 22 17:57:13.268431 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.268401 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 17:57:13.268431 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.268422 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 17:57:13.269510 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.269487 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-hpgsb\"" Apr 22 17:57:13.269649 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.269534 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 17:57:13.269649 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.269584 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 17:57:13.269649 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.269610 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 17:57:13.269866 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.269846 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 17:57:13.269951 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.269901 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 17:57:13.279270 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.279246 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55f785678b-b5h2l"] Apr 22 17:57:13.443059 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.443019 
2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/760bf03a-a8bd-4488-b030-306a78ce874e-console-config\") pod \"console-55f785678b-b5h2l\" (UID: \"760bf03a-a8bd-4488-b030-306a78ce874e\") " pod="openshift-console/console-55f785678b-b5h2l" Apr 22 17:57:13.443205 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.443083 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/760bf03a-a8bd-4488-b030-306a78ce874e-service-ca\") pod \"console-55f785678b-b5h2l\" (UID: \"760bf03a-a8bd-4488-b030-306a78ce874e\") " pod="openshift-console/console-55f785678b-b5h2l" Apr 22 17:57:13.443205 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.443112 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/760bf03a-a8bd-4488-b030-306a78ce874e-oauth-serving-cert\") pod \"console-55f785678b-b5h2l\" (UID: \"760bf03a-a8bd-4488-b030-306a78ce874e\") " pod="openshift-console/console-55f785678b-b5h2l" Apr 22 17:57:13.443205 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.443147 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/760bf03a-a8bd-4488-b030-306a78ce874e-console-oauth-config\") pod \"console-55f785678b-b5h2l\" (UID: \"760bf03a-a8bd-4488-b030-306a78ce874e\") " pod="openshift-console/console-55f785678b-b5h2l" Apr 22 17:57:13.443205 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.443172 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/760bf03a-a8bd-4488-b030-306a78ce874e-console-serving-cert\") pod \"console-55f785678b-b5h2l\" (UID: 
\"760bf03a-a8bd-4488-b030-306a78ce874e\") " pod="openshift-console/console-55f785678b-b5h2l" Apr 22 17:57:13.443352 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.443243 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6csxw\" (UniqueName: \"kubernetes.io/projected/760bf03a-a8bd-4488-b030-306a78ce874e-kube-api-access-6csxw\") pod \"console-55f785678b-b5h2l\" (UID: \"760bf03a-a8bd-4488-b030-306a78ce874e\") " pod="openshift-console/console-55f785678b-b5h2l" Apr 22 17:57:13.543895 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.543815 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/760bf03a-a8bd-4488-b030-306a78ce874e-console-config\") pod \"console-55f785678b-b5h2l\" (UID: \"760bf03a-a8bd-4488-b030-306a78ce874e\") " pod="openshift-console/console-55f785678b-b5h2l" Apr 22 17:57:13.543895 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.543864 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/760bf03a-a8bd-4488-b030-306a78ce874e-service-ca\") pod \"console-55f785678b-b5h2l\" (UID: \"760bf03a-a8bd-4488-b030-306a78ce874e\") " pod="openshift-console/console-55f785678b-b5h2l" Apr 22 17:57:13.543895 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.543886 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/760bf03a-a8bd-4488-b030-306a78ce874e-oauth-serving-cert\") pod \"console-55f785678b-b5h2l\" (UID: \"760bf03a-a8bd-4488-b030-306a78ce874e\") " pod="openshift-console/console-55f785678b-b5h2l" Apr 22 17:57:13.544155 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.543938 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/760bf03a-a8bd-4488-b030-306a78ce874e-console-oauth-config\") pod \"console-55f785678b-b5h2l\" (UID: \"760bf03a-a8bd-4488-b030-306a78ce874e\") " pod="openshift-console/console-55f785678b-b5h2l" Apr 22 17:57:13.544155 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.543960 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/760bf03a-a8bd-4488-b030-306a78ce874e-console-serving-cert\") pod \"console-55f785678b-b5h2l\" (UID: \"760bf03a-a8bd-4488-b030-306a78ce874e\") " pod="openshift-console/console-55f785678b-b5h2l" Apr 22 17:57:13.544262 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.544174 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6csxw\" (UniqueName: \"kubernetes.io/projected/760bf03a-a8bd-4488-b030-306a78ce874e-kube-api-access-6csxw\") pod \"console-55f785678b-b5h2l\" (UID: \"760bf03a-a8bd-4488-b030-306a78ce874e\") " pod="openshift-console/console-55f785678b-b5h2l" Apr 22 17:57:13.544677 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.544644 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/760bf03a-a8bd-4488-b030-306a78ce874e-oauth-serving-cert\") pod \"console-55f785678b-b5h2l\" (UID: \"760bf03a-a8bd-4488-b030-306a78ce874e\") " pod="openshift-console/console-55f785678b-b5h2l" Apr 22 17:57:13.544677 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.544664 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/760bf03a-a8bd-4488-b030-306a78ce874e-console-config\") pod \"console-55f785678b-b5h2l\" (UID: \"760bf03a-a8bd-4488-b030-306a78ce874e\") " pod="openshift-console/console-55f785678b-b5h2l" Apr 22 17:57:13.544677 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.544672 2527 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/760bf03a-a8bd-4488-b030-306a78ce874e-service-ca\") pod \"console-55f785678b-b5h2l\" (UID: \"760bf03a-a8bd-4488-b030-306a78ce874e\") " pod="openshift-console/console-55f785678b-b5h2l" Apr 22 17:57:13.546524 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.546498 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/760bf03a-a8bd-4488-b030-306a78ce874e-console-oauth-config\") pod \"console-55f785678b-b5h2l\" (UID: \"760bf03a-a8bd-4488-b030-306a78ce874e\") " pod="openshift-console/console-55f785678b-b5h2l" Apr 22 17:57:13.546646 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.546611 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/760bf03a-a8bd-4488-b030-306a78ce874e-console-serving-cert\") pod \"console-55f785678b-b5h2l\" (UID: \"760bf03a-a8bd-4488-b030-306a78ce874e\") " pod="openshift-console/console-55f785678b-b5h2l" Apr 22 17:57:13.551739 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.551719 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6csxw\" (UniqueName: \"kubernetes.io/projected/760bf03a-a8bd-4488-b030-306a78ce874e-kube-api-access-6csxw\") pod \"console-55f785678b-b5h2l\" (UID: \"760bf03a-a8bd-4488-b030-306a78ce874e\") " pod="openshift-console/console-55f785678b-b5h2l" Apr 22 17:57:13.575214 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.575173 2527 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55f785678b-b5h2l" Apr 22 17:57:13.688486 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:13.688456 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55f785678b-b5h2l"] Apr 22 17:57:13.691405 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:57:13.691379 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod760bf03a_a8bd_4488_b030_306a78ce874e.slice/crio-8d3ee39da2537133fee1094c04483faf124fd1351650fbec3675dd831ff6a2ed WatchSource:0}: Error finding container 8d3ee39da2537133fee1094c04483faf124fd1351650fbec3675dd831ff6a2ed: Status 404 returned error can't find the container with id 8d3ee39da2537133fee1094c04483faf124fd1351650fbec3675dd831ff6a2ed Apr 22 17:57:14.286211 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:14.286168 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55f785678b-b5h2l" event={"ID":"760bf03a-a8bd-4488-b030-306a78ce874e","Type":"ContainerStarted","Data":"8d3ee39da2537133fee1094c04483faf124fd1351650fbec3675dd831ff6a2ed"} Apr 22 17:57:17.294880 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:17.294844 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55f785678b-b5h2l" event={"ID":"760bf03a-a8bd-4488-b030-306a78ce874e","Type":"ContainerStarted","Data":"c58ea49fae3ed97692c2a34d26178499c367fde9d1f62fa5a73840fcd11369fe"} Apr 22 17:57:17.320320 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:17.320268 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-55f785678b-b5h2l" podStartSLOduration=1.7656995420000001 podStartE2EDuration="4.320252119s" podCreationTimestamp="2026-04-22 17:57:13 +0000 UTC" firstStartedPulling="2026-04-22 17:57:13.693319043 +0000 UTC m=+201.656608094" lastFinishedPulling="2026-04-22 17:57:16.247871624 +0000 UTC m=+204.211160671" 
observedRunningTime="2026-04-22 17:57:17.319535333 +0000 UTC m=+205.282824404" watchObservedRunningTime="2026-04-22 17:57:17.320252119 +0000 UTC m=+205.283541189" Apr 22 17:57:21.423441 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:21.423395 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" Apr 22 17:57:21.423441 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:21.423448 2527 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" Apr 22 17:57:23.576146 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:23.576085 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-55f785678b-b5h2l" Apr 22 17:57:23.576146 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:23.576150 2527 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-55f785678b-b5h2l" Apr 22 17:57:23.581046 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:23.581025 2527 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-55f785678b-b5h2l" Apr 22 17:57:24.316691 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:24.316659 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-55f785678b-b5h2l" Apr 22 17:57:26.106811 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:26.106779 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qqd9n_71ff672a-e8ff-4269-8be1-4e451a7046df/dns-node-resolver/0.log" Apr 22 17:57:32.677380 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:32.677346 2527 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55f785678b-b5h2l"] Apr 22 17:57:41.428966 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:41.428938 2527 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" Apr 22 17:57:41.432721 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:41.432697 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-9b5c68cb9-8dr7k" Apr 22 17:57:47.095364 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:47.095291 2527 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" podUID="94c6a03c-6ae0-4d10-8642-739528147536" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 17:57:57.095709 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:57.095670 2527 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" podUID="94c6a03c-6ae0-4d10-8642-739528147536" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 17:57:57.696029 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:57.695970 2527 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-55f785678b-b5h2l" podUID="760bf03a-a8bd-4488-b030-306a78ce874e" containerName="console" containerID="cri-o://c58ea49fae3ed97692c2a34d26178499c367fde9d1f62fa5a73840fcd11369fe" gracePeriod=15 Apr 22 17:57:57.937989 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:57.937967 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55f785678b-b5h2l_760bf03a-a8bd-4488-b030-306a78ce874e/console/0.log" Apr 22 17:57:57.938100 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:57.938037 2527 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55f785678b-b5h2l" Apr 22 17:57:57.979613 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:57.979521 2527 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6csxw\" (UniqueName: \"kubernetes.io/projected/760bf03a-a8bd-4488-b030-306a78ce874e-kube-api-access-6csxw\") pod \"760bf03a-a8bd-4488-b030-306a78ce874e\" (UID: \"760bf03a-a8bd-4488-b030-306a78ce874e\") " Apr 22 17:57:57.979613 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:57.979587 2527 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/760bf03a-a8bd-4488-b030-306a78ce874e-console-config\") pod \"760bf03a-a8bd-4488-b030-306a78ce874e\" (UID: \"760bf03a-a8bd-4488-b030-306a78ce874e\") " Apr 22 17:57:57.979820 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:57.979632 2527 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/760bf03a-a8bd-4488-b030-306a78ce874e-service-ca\") pod \"760bf03a-a8bd-4488-b030-306a78ce874e\" (UID: \"760bf03a-a8bd-4488-b030-306a78ce874e\") " Apr 22 17:57:57.979820 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:57.979652 2527 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/760bf03a-a8bd-4488-b030-306a78ce874e-console-oauth-config\") pod \"760bf03a-a8bd-4488-b030-306a78ce874e\" (UID: \"760bf03a-a8bd-4488-b030-306a78ce874e\") " Apr 22 17:57:57.979820 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:57.979701 2527 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/760bf03a-a8bd-4488-b030-306a78ce874e-console-serving-cert\") pod \"760bf03a-a8bd-4488-b030-306a78ce874e\" (UID: \"760bf03a-a8bd-4488-b030-306a78ce874e\") " Apr 22 17:57:57.979820 ip-10-0-135-21 
kubenswrapper[2527]: I0422 17:57:57.979730 2527 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/760bf03a-a8bd-4488-b030-306a78ce874e-oauth-serving-cert\") pod \"760bf03a-a8bd-4488-b030-306a78ce874e\" (UID: \"760bf03a-a8bd-4488-b030-306a78ce874e\") " Apr 22 17:57:57.980022 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:57.979992 2527 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/760bf03a-a8bd-4488-b030-306a78ce874e-console-config" (OuterVolumeSpecName: "console-config") pod "760bf03a-a8bd-4488-b030-306a78ce874e" (UID: "760bf03a-a8bd-4488-b030-306a78ce874e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:57:57.980109 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:57.980087 2527 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/760bf03a-a8bd-4488-b030-306a78ce874e-service-ca" (OuterVolumeSpecName: "service-ca") pod "760bf03a-a8bd-4488-b030-306a78ce874e" (UID: "760bf03a-a8bd-4488-b030-306a78ce874e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:57:57.980203 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:57.980182 2527 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/760bf03a-a8bd-4488-b030-306a78ce874e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "760bf03a-a8bd-4488-b030-306a78ce874e" (UID: "760bf03a-a8bd-4488-b030-306a78ce874e"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:57:57.981883 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:57.981852 2527 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/760bf03a-a8bd-4488-b030-306a78ce874e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "760bf03a-a8bd-4488-b030-306a78ce874e" (UID: "760bf03a-a8bd-4488-b030-306a78ce874e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:57:57.981883 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:57.981860 2527 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/760bf03a-a8bd-4488-b030-306a78ce874e-kube-api-access-6csxw" (OuterVolumeSpecName: "kube-api-access-6csxw") pod "760bf03a-a8bd-4488-b030-306a78ce874e" (UID: "760bf03a-a8bd-4488-b030-306a78ce874e"). InnerVolumeSpecName "kube-api-access-6csxw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:57:57.982029 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:57.981869 2527 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/760bf03a-a8bd-4488-b030-306a78ce874e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "760bf03a-a8bd-4488-b030-306a78ce874e" (UID: "760bf03a-a8bd-4488-b030-306a78ce874e"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:57:58.080661 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:58.080620 2527 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/760bf03a-a8bd-4488-b030-306a78ce874e-console-serving-cert\") on node \"ip-10-0-135-21.ec2.internal\" DevicePath \"\"" Apr 22 17:57:58.080661 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:58.080653 2527 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/760bf03a-a8bd-4488-b030-306a78ce874e-oauth-serving-cert\") on node \"ip-10-0-135-21.ec2.internal\" DevicePath \"\"" Apr 22 17:57:58.080661 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:58.080663 2527 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6csxw\" (UniqueName: \"kubernetes.io/projected/760bf03a-a8bd-4488-b030-306a78ce874e-kube-api-access-6csxw\") on node \"ip-10-0-135-21.ec2.internal\" DevicePath \"\"" Apr 22 17:57:58.080883 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:58.080673 2527 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/760bf03a-a8bd-4488-b030-306a78ce874e-console-config\") on node \"ip-10-0-135-21.ec2.internal\" DevicePath \"\"" Apr 22 17:57:58.080883 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:58.080683 2527 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/760bf03a-a8bd-4488-b030-306a78ce874e-service-ca\") on node \"ip-10-0-135-21.ec2.internal\" DevicePath \"\"" Apr 22 17:57:58.080883 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:58.080691 2527 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/760bf03a-a8bd-4488-b030-306a78ce874e-console-oauth-config\") on node \"ip-10-0-135-21.ec2.internal\" DevicePath \"\"" Apr 22 17:57:58.404815 ip-10-0-135-21 
kubenswrapper[2527]: I0422 17:57:58.404789 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55f785678b-b5h2l_760bf03a-a8bd-4488-b030-306a78ce874e/console/0.log" Apr 22 17:57:58.405287 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:58.404828 2527 generic.go:358] "Generic (PLEG): container finished" podID="760bf03a-a8bd-4488-b030-306a78ce874e" containerID="c58ea49fae3ed97692c2a34d26178499c367fde9d1f62fa5a73840fcd11369fe" exitCode=2 Apr 22 17:57:58.405287 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:58.404875 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55f785678b-b5h2l" event={"ID":"760bf03a-a8bd-4488-b030-306a78ce874e","Type":"ContainerDied","Data":"c58ea49fae3ed97692c2a34d26178499c367fde9d1f62fa5a73840fcd11369fe"} Apr 22 17:57:58.405287 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:58.404898 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55f785678b-b5h2l" event={"ID":"760bf03a-a8bd-4488-b030-306a78ce874e","Type":"ContainerDied","Data":"8d3ee39da2537133fee1094c04483faf124fd1351650fbec3675dd831ff6a2ed"} Apr 22 17:57:58.405287 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:58.404926 2527 scope.go:117] "RemoveContainer" containerID="c58ea49fae3ed97692c2a34d26178499c367fde9d1f62fa5a73840fcd11369fe" Apr 22 17:57:58.405287 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:58.404941 2527 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55f785678b-b5h2l" Apr 22 17:57:58.412916 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:58.412896 2527 scope.go:117] "RemoveContainer" containerID="c58ea49fae3ed97692c2a34d26178499c367fde9d1f62fa5a73840fcd11369fe" Apr 22 17:57:58.413203 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:57:58.413181 2527 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c58ea49fae3ed97692c2a34d26178499c367fde9d1f62fa5a73840fcd11369fe\": container with ID starting with c58ea49fae3ed97692c2a34d26178499c367fde9d1f62fa5a73840fcd11369fe not found: ID does not exist" containerID="c58ea49fae3ed97692c2a34d26178499c367fde9d1f62fa5a73840fcd11369fe" Apr 22 17:57:58.413261 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:58.413211 2527 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c58ea49fae3ed97692c2a34d26178499c367fde9d1f62fa5a73840fcd11369fe"} err="failed to get container status \"c58ea49fae3ed97692c2a34d26178499c367fde9d1f62fa5a73840fcd11369fe\": rpc error: code = NotFound desc = could not find container \"c58ea49fae3ed97692c2a34d26178499c367fde9d1f62fa5a73840fcd11369fe\": container with ID starting with c58ea49fae3ed97692c2a34d26178499c367fde9d1f62fa5a73840fcd11369fe not found: ID does not exist" Apr 22 17:57:58.425661 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:58.425637 2527 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55f785678b-b5h2l"] Apr 22 17:57:58.428866 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:58.428843 2527 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-55f785678b-b5h2l"] Apr 22 17:57:58.697750 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:57:58.697673 2527 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="760bf03a-a8bd-4488-b030-306a78ce874e" 
path="/var/lib/kubelet/pods/760bf03a-a8bd-4488-b030-306a78ce874e/volumes" Apr 22 17:58:04.529703 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:04.529061 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs\") pod \"network-metrics-daemon-4pcg6\" (UID: \"2e2c7888-11c8-4eb9-af23-d748c79bd567\") " pod="openshift-multus/network-metrics-daemon-4pcg6" Apr 22 17:58:04.532731 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:04.532598 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e2c7888-11c8-4eb9-af23-d748c79bd567-metrics-certs\") pod \"network-metrics-daemon-4pcg6\" (UID: \"2e2c7888-11c8-4eb9-af23-d748c79bd567\") " pod="openshift-multus/network-metrics-daemon-4pcg6" Apr 22 17:58:04.797651 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:04.797552 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bvxbv\"" Apr 22 17:58:04.805521 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:04.805500 2527 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4pcg6" Apr 22 17:58:04.919708 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:04.919677 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4pcg6"] Apr 22 17:58:04.922105 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:58:04.922078 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e2c7888_11c8_4eb9_af23_d748c79bd567.slice/crio-9516d15d54622fa2022fbece49e64c1072f819b68d1da85baaefa860d5c4e3b5 WatchSource:0}: Error finding container 9516d15d54622fa2022fbece49e64c1072f819b68d1da85baaefa860d5c4e3b5: Status 404 returned error can't find the container with id 9516d15d54622fa2022fbece49e64c1072f819b68d1da85baaefa860d5c4e3b5 Apr 22 17:58:05.426010 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:05.425978 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4pcg6" event={"ID":"2e2c7888-11c8-4eb9-af23-d748c79bd567","Type":"ContainerStarted","Data":"9516d15d54622fa2022fbece49e64c1072f819b68d1da85baaefa860d5c4e3b5"} Apr 22 17:58:06.431102 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:06.431069 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4pcg6" event={"ID":"2e2c7888-11c8-4eb9-af23-d748c79bd567","Type":"ContainerStarted","Data":"45a5f5993117aed8bbf207963f1f585323241afba84ae02680e797f966a3f0c5"} Apr 22 17:58:06.431102 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:06.431104 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4pcg6" event={"ID":"2e2c7888-11c8-4eb9-af23-d748c79bd567","Type":"ContainerStarted","Data":"b2dabbc1e501ef60fc5b6f187c15dad1e89a5b9bb83d27fe0f7247f5d9df7d63"} Apr 22 17:58:06.448533 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:06.448484 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-4pcg6" podStartSLOduration=253.387556833 podStartE2EDuration="4m14.448469787s" podCreationTimestamp="2026-04-22 17:53:52 +0000 UTC" firstStartedPulling="2026-04-22 17:58:04.923947108 +0000 UTC m=+252.887236170" lastFinishedPulling="2026-04-22 17:58:05.984860074 +0000 UTC m=+253.948149124" observedRunningTime="2026-04-22 17:58:06.447380474 +0000 UTC m=+254.410669545" watchObservedRunningTime="2026-04-22 17:58:06.448469787 +0000 UTC m=+254.411758857" Apr 22 17:58:07.096155 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:07.096115 2527 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" podUID="94c6a03c-6ae0-4d10-8642-739528147536" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 17:58:07.096316 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:07.096179 2527 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" Apr 22 17:58:07.096657 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:07.096638 2527 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"cb80d4be8acba0ded916f7547cb27db09de673d7eddbe7e8985a96cb82ebc584"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 22 17:58:07.096709 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:07.096677 2527 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" podUID="94c6a03c-6ae0-4d10-8642-739528147536" containerName="service-proxy" containerID="cri-o://cb80d4be8acba0ded916f7547cb27db09de673d7eddbe7e8985a96cb82ebc584" gracePeriod=30 Apr 22 17:58:07.435968 
ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:07.435884 2527 generic.go:358] "Generic (PLEG): container finished" podID="94c6a03c-6ae0-4d10-8642-739528147536" containerID="cb80d4be8acba0ded916f7547cb27db09de673d7eddbe7e8985a96cb82ebc584" exitCode=2 Apr 22 17:58:07.435968 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:07.435944 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" event={"ID":"94c6a03c-6ae0-4d10-8642-739528147536","Type":"ContainerDied","Data":"cb80d4be8acba0ded916f7547cb27db09de673d7eddbe7e8985a96cb82ebc584"} Apr 22 17:58:07.436341 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:07.435982 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c4979f6c-vmjhk" event={"ID":"94c6a03c-6ae0-4d10-8642-739528147536","Type":"ContainerStarted","Data":"b4dede6d736893a4e2de209884be9183b81f2920ef73185b7a93b5f12ded91a4"} Apr 22 17:58:25.630905 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.630829 2527 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-cc6595f5d-zt47n"] Apr 22 17:58:25.631299 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.631128 2527 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="760bf03a-a8bd-4488-b030-306a78ce874e" containerName="console" Apr 22 17:58:25.631299 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.631141 2527 state_mem.go:107] "Deleted CPUSet assignment" podUID="760bf03a-a8bd-4488-b030-306a78ce874e" containerName="console" Apr 22 17:58:25.631299 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.631178 2527 memory_manager.go:356] "RemoveStaleState removing state" podUID="760bf03a-a8bd-4488-b030-306a78ce874e" containerName="console" Apr 22 17:58:25.634405 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.634389 2527 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cc6595f5d-zt47n" Apr 22 17:58:25.641392 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.641369 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 17:58:25.641529 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.641379 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 17:58:25.641529 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.641456 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 17:58:25.641529 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.641443 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 17:58:25.641742 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.641441 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 17:58:25.642424 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.642407 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-hpgsb\"" Apr 22 17:58:25.642508 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.642443 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 17:58:25.642508 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.642467 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 17:58:25.652440 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.652412 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cc6595f5d-zt47n"] Apr 22 17:58:25.654488 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.654470 2527 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 17:58:25.693715 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.693688 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-service-ca\") pod \"console-cc6595f5d-zt47n\" (UID: \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\") " pod="openshift-console/console-cc6595f5d-zt47n" Apr 22 17:58:25.693893 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.693726 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-console-oauth-config\") pod \"console-cc6595f5d-zt47n\" (UID: \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\") " pod="openshift-console/console-cc6595f5d-zt47n" Apr 22 17:58:25.693893 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.693754 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-console-serving-cert\") pod \"console-cc6595f5d-zt47n\" (UID: \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\") " pod="openshift-console/console-cc6595f5d-zt47n" Apr 22 17:58:25.693893 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.693783 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6w9h\" (UniqueName: \"kubernetes.io/projected/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-kube-api-access-s6w9h\") pod \"console-cc6595f5d-zt47n\" (UID: \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\") " pod="openshift-console/console-cc6595f5d-zt47n" Apr 22 17:58:25.693893 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.693800 2527 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-console-config\") pod \"console-cc6595f5d-zt47n\" (UID: \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\") " pod="openshift-console/console-cc6595f5d-zt47n" Apr 22 17:58:25.694097 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.693917 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-oauth-serving-cert\") pod \"console-cc6595f5d-zt47n\" (UID: \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\") " pod="openshift-console/console-cc6595f5d-zt47n" Apr 22 17:58:25.694097 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.693972 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-trusted-ca-bundle\") pod \"console-cc6595f5d-zt47n\" (UID: \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\") " pod="openshift-console/console-cc6595f5d-zt47n" Apr 22 17:58:25.795354 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.795304 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-oauth-serving-cert\") pod \"console-cc6595f5d-zt47n\" (UID: \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\") " pod="openshift-console/console-cc6595f5d-zt47n" Apr 22 17:58:25.795516 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.795371 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-trusted-ca-bundle\") pod \"console-cc6595f5d-zt47n\" (UID: \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\") " pod="openshift-console/console-cc6595f5d-zt47n" 
Apr 22 17:58:25.795516 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.795420 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-service-ca\") pod \"console-cc6595f5d-zt47n\" (UID: \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\") " pod="openshift-console/console-cc6595f5d-zt47n" Apr 22 17:58:25.795516 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.795444 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-console-oauth-config\") pod \"console-cc6595f5d-zt47n\" (UID: \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\") " pod="openshift-console/console-cc6595f5d-zt47n" Apr 22 17:58:25.795516 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.795468 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-console-serving-cert\") pod \"console-cc6595f5d-zt47n\" (UID: \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\") " pod="openshift-console/console-cc6595f5d-zt47n" Apr 22 17:58:25.795516 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.795491 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s6w9h\" (UniqueName: \"kubernetes.io/projected/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-kube-api-access-s6w9h\") pod \"console-cc6595f5d-zt47n\" (UID: \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\") " pod="openshift-console/console-cc6595f5d-zt47n" Apr 22 17:58:25.795806 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.795683 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-console-config\") pod \"console-cc6595f5d-zt47n\" (UID: \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\") 
" pod="openshift-console/console-cc6595f5d-zt47n" Apr 22 17:58:25.796144 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.796121 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-service-ca\") pod \"console-cc6595f5d-zt47n\" (UID: \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\") " pod="openshift-console/console-cc6595f5d-zt47n" Apr 22 17:58:25.796256 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.796160 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-oauth-serving-cert\") pod \"console-cc6595f5d-zt47n\" (UID: \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\") " pod="openshift-console/console-cc6595f5d-zt47n" Apr 22 17:58:25.796256 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.796236 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-trusted-ca-bundle\") pod \"console-cc6595f5d-zt47n\" (UID: \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\") " pod="openshift-console/console-cc6595f5d-zt47n" Apr 22 17:58:25.796399 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.796376 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-console-config\") pod \"console-cc6595f5d-zt47n\" (UID: \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\") " pod="openshift-console/console-cc6595f5d-zt47n" Apr 22 17:58:25.797886 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.797857 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-console-oauth-config\") pod \"console-cc6595f5d-zt47n\" (UID: \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\") 
" pod="openshift-console/console-cc6595f5d-zt47n" Apr 22 17:58:25.798129 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.798111 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-console-serving-cert\") pod \"console-cc6595f5d-zt47n\" (UID: \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\") " pod="openshift-console/console-cc6595f5d-zt47n" Apr 22 17:58:25.803109 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.803083 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6w9h\" (UniqueName: \"kubernetes.io/projected/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-kube-api-access-s6w9h\") pod \"console-cc6595f5d-zt47n\" (UID: \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\") " pod="openshift-console/console-cc6595f5d-zt47n" Apr 22 17:58:25.943778 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:25.943687 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cc6595f5d-zt47n" Apr 22 17:58:26.060509 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:26.060481 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cc6595f5d-zt47n"] Apr 22 17:58:26.063179 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:58:26.063147 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2a2adc5_787a_47a6_ac4d_120c8ac49a9e.slice/crio-ab9f7203967222a6670c8b1d5c86a1416d9d81d7d50d64f00dcf1021f2257855 WatchSource:0}: Error finding container ab9f7203967222a6670c8b1d5c86a1416d9d81d7d50d64f00dcf1021f2257855: Status 404 returned error can't find the container with id ab9f7203967222a6670c8b1d5c86a1416d9d81d7d50d64f00dcf1021f2257855 Apr 22 17:58:26.486547 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:26.486512 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cc6595f5d-zt47n" 
event={"ID":"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e","Type":"ContainerStarted","Data":"c9f0447ef23c0de7063aa6ec80b1164262a352a929c98f29f70ae755aa5d55cf"} Apr 22 17:58:26.486547 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:26.486550 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cc6595f5d-zt47n" event={"ID":"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e","Type":"ContainerStarted","Data":"ab9f7203967222a6670c8b1d5c86a1416d9d81d7d50d64f00dcf1021f2257855"} Apr 22 17:58:26.507410 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:26.507359 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-cc6595f5d-zt47n" podStartSLOduration=1.507345767 podStartE2EDuration="1.507345767s" podCreationTimestamp="2026-04-22 17:58:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:58:26.5067034 +0000 UTC m=+274.469992507" watchObservedRunningTime="2026-04-22 17:58:26.507345767 +0000 UTC m=+274.470634837" Apr 22 17:58:32.148440 ip-10-0-135-21 kubenswrapper[2527]: E0422 17:58:32.148399 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-24hgm" podUID="3668ec4c-282c-4be6-a5d4-90efe74d754b" Apr 22 17:58:32.500994 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:32.500917 2527 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-24hgm" Apr 22 17:58:35.372531 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:35.372498 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert\") pod \"ingress-canary-jlfdb\" (UID: \"19df5bb9-b377-4d57-ac3d-bdf63a378385\") " pod="openshift-ingress-canary/ingress-canary-jlfdb" Apr 22 17:58:35.373010 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:35.372588 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls\") pod \"dns-default-24hgm\" (UID: \"3668ec4c-282c-4be6-a5d4-90efe74d754b\") " pod="openshift-dns/dns-default-24hgm" Apr 22 17:58:35.374962 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:35.374930 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3668ec4c-282c-4be6-a5d4-90efe74d754b-metrics-tls\") pod \"dns-default-24hgm\" (UID: \"3668ec4c-282c-4be6-a5d4-90efe74d754b\") " pod="openshift-dns/dns-default-24hgm" Apr 22 17:58:35.375062 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:35.374988 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19df5bb9-b377-4d57-ac3d-bdf63a378385-cert\") pod \"ingress-canary-jlfdb\" (UID: \"19df5bb9-b377-4d57-ac3d-bdf63a378385\") " pod="openshift-ingress-canary/ingress-canary-jlfdb" Apr 22 17:58:35.507143 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:35.507117 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-q8m4n\"" Apr 22 17:58:35.511781 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:35.511759 2527 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-24hgm" Apr 22 17:58:35.597635 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:35.597611 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-w7jfh\"" Apr 22 17:58:35.605521 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:35.605495 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jlfdb" Apr 22 17:58:35.629284 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:35.629199 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-24hgm"] Apr 22 17:58:35.632902 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:58:35.632861 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3668ec4c_282c_4be6_a5d4_90efe74d754b.slice/crio-79bcf32ee7461e7314958c3cc488cef31899dd85411150edeecc6307074ce091 WatchSource:0}: Error finding container 79bcf32ee7461e7314958c3cc488cef31899dd85411150edeecc6307074ce091: Status 404 returned error can't find the container with id 79bcf32ee7461e7314958c3cc488cef31899dd85411150edeecc6307074ce091 Apr 22 17:58:35.723954 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:35.723921 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jlfdb"] Apr 22 17:58:35.727367 ip-10-0-135-21 kubenswrapper[2527]: W0422 17:58:35.727334 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19df5bb9_b377_4d57_ac3d_bdf63a378385.slice/crio-a69df1a9b2892af16a88fd4808c0f2c8c3fa555073b0c11514aa896051f88968 WatchSource:0}: Error finding container a69df1a9b2892af16a88fd4808c0f2c8c3fa555073b0c11514aa896051f88968: Status 404 returned error can't find the container with id a69df1a9b2892af16a88fd4808c0f2c8c3fa555073b0c11514aa896051f88968 Apr 22 17:58:35.944144 
ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:35.944059 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-cc6595f5d-zt47n" Apr 22 17:58:35.944144 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:35.944103 2527 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-cc6595f5d-zt47n" Apr 22 17:58:35.948704 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:35.948682 2527 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-cc6595f5d-zt47n" Apr 22 17:58:36.513393 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:36.513342 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-24hgm" event={"ID":"3668ec4c-282c-4be6-a5d4-90efe74d754b","Type":"ContainerStarted","Data":"79bcf32ee7461e7314958c3cc488cef31899dd85411150edeecc6307074ce091"} Apr 22 17:58:36.514982 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:36.514949 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jlfdb" event={"ID":"19df5bb9-b377-4d57-ac3d-bdf63a378385","Type":"ContainerStarted","Data":"a69df1a9b2892af16a88fd4808c0f2c8c3fa555073b0c11514aa896051f88968"} Apr 22 17:58:36.520925 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:36.520898 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-cc6595f5d-zt47n" Apr 22 17:58:38.523496 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:38.523459 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-24hgm" event={"ID":"3668ec4c-282c-4be6-a5d4-90efe74d754b","Type":"ContainerStarted","Data":"4abb762e9c567f4222213c374a69595d1a63b19a2a395d4b520dc1042f464019"} Apr 22 17:58:38.523496 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:38.523498 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-24hgm" 
event={"ID":"3668ec4c-282c-4be6-a5d4-90efe74d754b","Type":"ContainerStarted","Data":"4733371aa7d25d40a529aeb075937d237d7c1c98480619a4955893e7389bc93a"} Apr 22 17:58:38.523996 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:38.523585 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-24hgm" Apr 22 17:58:38.524842 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:38.524820 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jlfdb" event={"ID":"19df5bb9-b377-4d57-ac3d-bdf63a378385","Type":"ContainerStarted","Data":"8d2daedf89eaac74f1d1fb1b701ea8857e799e99149ac9fe3e62aad61d160d64"} Apr 22 17:58:38.542767 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:38.542717 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-24hgm" podStartSLOduration=251.600177127 podStartE2EDuration="4m13.542701106s" podCreationTimestamp="2026-04-22 17:54:25 +0000 UTC" firstStartedPulling="2026-04-22 17:58:35.634661586 +0000 UTC m=+283.597950634" lastFinishedPulling="2026-04-22 17:58:37.577185555 +0000 UTC m=+285.540474613" observedRunningTime="2026-04-22 17:58:38.54214883 +0000 UTC m=+286.505437899" watchObservedRunningTime="2026-04-22 17:58:38.542701106 +0000 UTC m=+286.505990177" Apr 22 17:58:38.558424 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:38.558384 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-jlfdb" podStartSLOduration=251.70712801 podStartE2EDuration="4m13.558374213s" podCreationTimestamp="2026-04-22 17:54:25 +0000 UTC" firstStartedPulling="2026-04-22 17:58:35.729105852 +0000 UTC m=+283.692394899" lastFinishedPulling="2026-04-22 17:58:37.580352043 +0000 UTC m=+285.543641102" observedRunningTime="2026-04-22 17:58:38.557612776 +0000 UTC m=+286.520901847" watchObservedRunningTime="2026-04-22 17:58:38.558374213 +0000 UTC m=+286.521663283" Apr 22 17:58:48.530255 
ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:48.530224 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-24hgm" Apr 22 17:58:52.568591 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:52.568550 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log" Apr 22 17:58:52.569155 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:52.569136 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log" Apr 22 17:58:52.571728 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:58:52.571712 2527 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 17:59:46.035387 ip-10-0-135-21 kubenswrapper[2527]: I0422 17:59:46.035304 2527 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-cc6595f5d-zt47n"] Apr 22 18:00:11.054034 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.053992 2527 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-cc6595f5d-zt47n" podUID="e2a2adc5-787a-47a6-ac4d-120c8ac49a9e" containerName="console" containerID="cri-o://c9f0447ef23c0de7063aa6ec80b1164262a352a929c98f29f70ae755aa5d55cf" gracePeriod=15 Apr 22 18:00:11.281986 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.281962 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-cc6595f5d-zt47n_e2a2adc5-787a-47a6-ac4d-120c8ac49a9e/console/0.log" Apr 22 18:00:11.282107 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.282021 2527 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cc6595f5d-zt47n" Apr 22 18:00:11.372084 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.371982 2527 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-oauth-serving-cert\") pod \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\" (UID: \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\") " Apr 22 18:00:11.372084 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.372050 2527 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6w9h\" (UniqueName: \"kubernetes.io/projected/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-kube-api-access-s6w9h\") pod \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\" (UID: \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\") " Apr 22 18:00:11.372302 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.372104 2527 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-console-config\") pod \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\" (UID: \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\") " Apr 22 18:00:11.372302 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.372128 2527 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-service-ca\") pod \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\" (UID: \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\") " Apr 22 18:00:11.372302 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.372149 2527 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-console-serving-cert\") pod \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\" (UID: \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\") " Apr 22 18:00:11.372302 ip-10-0-135-21 
kubenswrapper[2527]: I0422 18:00:11.372172 2527 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-trusted-ca-bundle\") pod \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\" (UID: \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\") " Apr 22 18:00:11.372302 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.372204 2527 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-console-oauth-config\") pod \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\" (UID: \"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e\") " Apr 22 18:00:11.372544 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.372520 2527 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-console-config" (OuterVolumeSpecName: "console-config") pod "e2a2adc5-787a-47a6-ac4d-120c8ac49a9e" (UID: "e2a2adc5-787a-47a6-ac4d-120c8ac49a9e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:00:11.372635 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.372538 2527 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e2a2adc5-787a-47a6-ac4d-120c8ac49a9e" (UID: "e2a2adc5-787a-47a6-ac4d-120c8ac49a9e"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:00:11.372635 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.372556 2527 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-service-ca" (OuterVolumeSpecName: "service-ca") pod "e2a2adc5-787a-47a6-ac4d-120c8ac49a9e" (UID: "e2a2adc5-787a-47a6-ac4d-120c8ac49a9e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:00:11.372853 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.372824 2527 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e2a2adc5-787a-47a6-ac4d-120c8ac49a9e" (UID: "e2a2adc5-787a-47a6-ac4d-120c8ac49a9e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:00:11.374955 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.374404 2527 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e2a2adc5-787a-47a6-ac4d-120c8ac49a9e" (UID: "e2a2adc5-787a-47a6-ac4d-120c8ac49a9e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:00:11.374955 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.374440 2527 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-kube-api-access-s6w9h" (OuterVolumeSpecName: "kube-api-access-s6w9h") pod "e2a2adc5-787a-47a6-ac4d-120c8ac49a9e" (UID: "e2a2adc5-787a-47a6-ac4d-120c8ac49a9e"). InnerVolumeSpecName "kube-api-access-s6w9h". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:00:11.375189 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.374964 2527 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e2a2adc5-787a-47a6-ac4d-120c8ac49a9e" (UID: "e2a2adc5-787a-47a6-ac4d-120c8ac49a9e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:00:11.473218 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.473175 2527 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-console-oauth-config\") on node \"ip-10-0-135-21.ec2.internal\" DevicePath \"\"" Apr 22 18:00:11.473218 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.473208 2527 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-oauth-serving-cert\") on node \"ip-10-0-135-21.ec2.internal\" DevicePath \"\"" Apr 22 18:00:11.473218 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.473219 2527 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s6w9h\" (UniqueName: \"kubernetes.io/projected/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-kube-api-access-s6w9h\") on node \"ip-10-0-135-21.ec2.internal\" DevicePath \"\"" Apr 22 18:00:11.473218 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.473228 2527 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-console-config\") on node \"ip-10-0-135-21.ec2.internal\" DevicePath \"\"" Apr 22 18:00:11.473480 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.473237 2527 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-service-ca\") on node \"ip-10-0-135-21.ec2.internal\" DevicePath \"\"" Apr 22 18:00:11.473480 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.473245 2527 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-console-serving-cert\") on node \"ip-10-0-135-21.ec2.internal\" DevicePath \"\"" Apr 22 18:00:11.473480 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.473253 2527 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e-trusted-ca-bundle\") on node \"ip-10-0-135-21.ec2.internal\" DevicePath \"\"" Apr 22 18:00:11.760631 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.760607 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-cc6595f5d-zt47n_e2a2adc5-787a-47a6-ac4d-120c8ac49a9e/console/0.log" Apr 22 18:00:11.760774 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.760645 2527 generic.go:358] "Generic (PLEG): container finished" podID="e2a2adc5-787a-47a6-ac4d-120c8ac49a9e" containerID="c9f0447ef23c0de7063aa6ec80b1164262a352a929c98f29f70ae755aa5d55cf" exitCode=2 Apr 22 18:00:11.760774 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.760674 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cc6595f5d-zt47n" event={"ID":"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e","Type":"ContainerDied","Data":"c9f0447ef23c0de7063aa6ec80b1164262a352a929c98f29f70ae755aa5d55cf"} Apr 22 18:00:11.760774 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.760714 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cc6595f5d-zt47n" event={"ID":"e2a2adc5-787a-47a6-ac4d-120c8ac49a9e","Type":"ContainerDied","Data":"ab9f7203967222a6670c8b1d5c86a1416d9d81d7d50d64f00dcf1021f2257855"} Apr 22 18:00:11.760774 ip-10-0-135-21 
kubenswrapper[2527]: I0422 18:00:11.760719 2527 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cc6595f5d-zt47n" Apr 22 18:00:11.760774 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.760728 2527 scope.go:117] "RemoveContainer" containerID="c9f0447ef23c0de7063aa6ec80b1164262a352a929c98f29f70ae755aa5d55cf" Apr 22 18:00:11.768416 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.768394 2527 scope.go:117] "RemoveContainer" containerID="c9f0447ef23c0de7063aa6ec80b1164262a352a929c98f29f70ae755aa5d55cf" Apr 22 18:00:11.768761 ip-10-0-135-21 kubenswrapper[2527]: E0422 18:00:11.768741 2527 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9f0447ef23c0de7063aa6ec80b1164262a352a929c98f29f70ae755aa5d55cf\": container with ID starting with c9f0447ef23c0de7063aa6ec80b1164262a352a929c98f29f70ae755aa5d55cf not found: ID does not exist" containerID="c9f0447ef23c0de7063aa6ec80b1164262a352a929c98f29f70ae755aa5d55cf" Apr 22 18:00:11.768812 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.768768 2527 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9f0447ef23c0de7063aa6ec80b1164262a352a929c98f29f70ae755aa5d55cf"} err="failed to get container status \"c9f0447ef23c0de7063aa6ec80b1164262a352a929c98f29f70ae755aa5d55cf\": rpc error: code = NotFound desc = could not find container \"c9f0447ef23c0de7063aa6ec80b1164262a352a929c98f29f70ae755aa5d55cf\": container with ID starting with c9f0447ef23c0de7063aa6ec80b1164262a352a929c98f29f70ae755aa5d55cf not found: ID does not exist" Apr 22 18:00:11.781131 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.781108 2527 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-cc6595f5d-zt47n"] Apr 22 18:00:11.784372 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:11.784350 2527 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-cc6595f5d-zt47n"] Apr 22 18:00:12.698290 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:12.698251 2527 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2a2adc5-787a-47a6-ac4d-120c8ac49a9e" path="/var/lib/kubelet/pods/e2a2adc5-787a-47a6-ac4d-120c8ac49a9e/volumes" Apr 22 18:00:55.481306 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:55.481276 2527 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6"] Apr 22 18:00:55.481716 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:55.481517 2527 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e2a2adc5-787a-47a6-ac4d-120c8ac49a9e" containerName="console" Apr 22 18:00:55.481716 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:55.481529 2527 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2a2adc5-787a-47a6-ac4d-120c8ac49a9e" containerName="console" Apr 22 18:00:55.481716 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:55.481585 2527 memory_manager.go:356] "RemoveStaleState removing state" podUID="e2a2adc5-787a-47a6-ac4d-120c8ac49a9e" containerName="console" Apr 22 18:00:55.483510 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:55.483494 2527 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6" Apr 22 18:00:55.485903 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:55.485875 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-mflwr\"" Apr 22 18:00:55.485903 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:55.485893 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 18:00:55.486996 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:55.486979 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 18:00:55.494134 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:55.494109 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6"] Apr 22 18:00:55.676721 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:55.676683 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng8vn\" (UniqueName: \"kubernetes.io/projected/d05e451a-217c-4be9-80e4-3e2840bc2e55-kube-api-access-ng8vn\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6\" (UID: \"d05e451a-217c-4be9-80e4-3e2840bc2e55\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6" Apr 22 18:00:55.676900 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:55.676733 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d05e451a-217c-4be9-80e4-3e2840bc2e55-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6\" (UID: \"d05e451a-217c-4be9-80e4-3e2840bc2e55\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6" Apr 22 18:00:55.676900 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:55.676788 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d05e451a-217c-4be9-80e4-3e2840bc2e55-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6\" (UID: \"d05e451a-217c-4be9-80e4-3e2840bc2e55\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6" Apr 22 18:00:55.777773 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:55.777741 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d05e451a-217c-4be9-80e4-3e2840bc2e55-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6\" (UID: \"d05e451a-217c-4be9-80e4-3e2840bc2e55\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6" Apr 22 18:00:55.777919 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:55.777789 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ng8vn\" (UniqueName: \"kubernetes.io/projected/d05e451a-217c-4be9-80e4-3e2840bc2e55-kube-api-access-ng8vn\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6\" (UID: \"d05e451a-217c-4be9-80e4-3e2840bc2e55\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6" Apr 22 18:00:55.777919 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:55.777825 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d05e451a-217c-4be9-80e4-3e2840bc2e55-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6\" (UID: \"d05e451a-217c-4be9-80e4-3e2840bc2e55\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6" Apr 22 18:00:55.778148 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:55.778131 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d05e451a-217c-4be9-80e4-3e2840bc2e55-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6\" (UID: \"d05e451a-217c-4be9-80e4-3e2840bc2e55\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6" Apr 22 18:00:55.778183 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:55.778172 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d05e451a-217c-4be9-80e4-3e2840bc2e55-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6\" (UID: \"d05e451a-217c-4be9-80e4-3e2840bc2e55\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6" Apr 22 18:00:55.786721 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:55.786696 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng8vn\" (UniqueName: \"kubernetes.io/projected/d05e451a-217c-4be9-80e4-3e2840bc2e55-kube-api-access-ng8vn\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6\" (UID: \"d05e451a-217c-4be9-80e4-3e2840bc2e55\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6" Apr 22 18:00:55.792472 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:55.792452 2527 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6" Apr 22 18:00:55.912483 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:55.912457 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6"] Apr 22 18:00:55.914994 ip-10-0-135-21 kubenswrapper[2527]: W0422 18:00:55.914965 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd05e451a_217c_4be9_80e4_3e2840bc2e55.slice/crio-0f53e1be9e287f8f80be0acf474705f91d35ceb43dbe6aea83276d32fc53d6ed WatchSource:0}: Error finding container 0f53e1be9e287f8f80be0acf474705f91d35ceb43dbe6aea83276d32fc53d6ed: Status 404 returned error can't find the container with id 0f53e1be9e287f8f80be0acf474705f91d35ceb43dbe6aea83276d32fc53d6ed Apr 22 18:00:55.916795 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:55.916778 2527 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:00:56.881842 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:00:56.881793 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6" event={"ID":"d05e451a-217c-4be9-80e4-3e2840bc2e55","Type":"ContainerStarted","Data":"0f53e1be9e287f8f80be0acf474705f91d35ceb43dbe6aea83276d32fc53d6ed"} Apr 22 18:01:00.895743 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:00.895707 2527 generic.go:358] "Generic (PLEG): container finished" podID="d05e451a-217c-4be9-80e4-3e2840bc2e55" containerID="c120a0332d5b52acad1c26d368db453fa97b973e247276c6e81ec19a3358e16b" exitCode=0 Apr 22 18:01:00.896200 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:00.895791 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6" 
event={"ID":"d05e451a-217c-4be9-80e4-3e2840bc2e55","Type":"ContainerDied","Data":"c120a0332d5b52acad1c26d368db453fa97b973e247276c6e81ec19a3358e16b"} Apr 22 18:01:03.906175 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:03.906135 2527 generic.go:358] "Generic (PLEG): container finished" podID="d05e451a-217c-4be9-80e4-3e2840bc2e55" containerID="65ebab7f8839c138301f0a22bc52236ff3946504c8cfde66e364d3adb06adf8d" exitCode=0 Apr 22 18:01:03.906581 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:03.906250 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6" event={"ID":"d05e451a-217c-4be9-80e4-3e2840bc2e55","Type":"ContainerDied","Data":"65ebab7f8839c138301f0a22bc52236ff3946504c8cfde66e364d3adb06adf8d"} Apr 22 18:01:10.928499 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:10.928460 2527 generic.go:358] "Generic (PLEG): container finished" podID="d05e451a-217c-4be9-80e4-3e2840bc2e55" containerID="6009f35a12a72f6e004272f33943b5478a74867543062c5da9e6ef36a7d09407" exitCode=0 Apr 22 18:01:10.928953 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:10.928509 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6" event={"ID":"d05e451a-217c-4be9-80e4-3e2840bc2e55","Type":"ContainerDied","Data":"6009f35a12a72f6e004272f33943b5478a74867543062c5da9e6ef36a7d09407"} Apr 22 18:01:12.043210 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:12.043187 2527 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6"
Apr 22 18:01:12.082252 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:12.082223 2527 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng8vn\" (UniqueName: \"kubernetes.io/projected/d05e451a-217c-4be9-80e4-3e2840bc2e55-kube-api-access-ng8vn\") pod \"d05e451a-217c-4be9-80e4-3e2840bc2e55\" (UID: \"d05e451a-217c-4be9-80e4-3e2840bc2e55\") "
Apr 22 18:01:12.082416 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:12.082262 2527 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d05e451a-217c-4be9-80e4-3e2840bc2e55-bundle\") pod \"d05e451a-217c-4be9-80e4-3e2840bc2e55\" (UID: \"d05e451a-217c-4be9-80e4-3e2840bc2e55\") "
Apr 22 18:01:12.082884 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:12.082857 2527 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d05e451a-217c-4be9-80e4-3e2840bc2e55-bundle" (OuterVolumeSpecName: "bundle") pod "d05e451a-217c-4be9-80e4-3e2840bc2e55" (UID: "d05e451a-217c-4be9-80e4-3e2840bc2e55"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:01:12.084345 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:12.084306 2527 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d05e451a-217c-4be9-80e4-3e2840bc2e55-kube-api-access-ng8vn" (OuterVolumeSpecName: "kube-api-access-ng8vn") pod "d05e451a-217c-4be9-80e4-3e2840bc2e55" (UID: "d05e451a-217c-4be9-80e4-3e2840bc2e55"). InnerVolumeSpecName "kube-api-access-ng8vn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:01:12.182886 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:12.182797 2527 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d05e451a-217c-4be9-80e4-3e2840bc2e55-util\") pod \"d05e451a-217c-4be9-80e4-3e2840bc2e55\" (UID: \"d05e451a-217c-4be9-80e4-3e2840bc2e55\") "
Apr 22 18:01:12.183059 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:12.182911 2527 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ng8vn\" (UniqueName: \"kubernetes.io/projected/d05e451a-217c-4be9-80e4-3e2840bc2e55-kube-api-access-ng8vn\") on node \"ip-10-0-135-21.ec2.internal\" DevicePath \"\""
Apr 22 18:01:12.183059 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:12.182923 2527 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d05e451a-217c-4be9-80e4-3e2840bc2e55-bundle\") on node \"ip-10-0-135-21.ec2.internal\" DevicePath \"\""
Apr 22 18:01:12.186944 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:12.186910 2527 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d05e451a-217c-4be9-80e4-3e2840bc2e55-util" (OuterVolumeSpecName: "util") pod "d05e451a-217c-4be9-80e4-3e2840bc2e55" (UID: "d05e451a-217c-4be9-80e4-3e2840bc2e55"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:01:12.283899 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:12.283864 2527 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d05e451a-217c-4be9-80e4-3e2840bc2e55-util\") on node \"ip-10-0-135-21.ec2.internal\" DevicePath \"\""
Apr 22 18:01:12.935175 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:12.935138 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6" event={"ID":"d05e451a-217c-4be9-80e4-3e2840bc2e55","Type":"ContainerDied","Data":"0f53e1be9e287f8f80be0acf474705f91d35ceb43dbe6aea83276d32fc53d6ed"}
Apr 22 18:01:12.935175 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:12.935173 2527 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f53e1be9e287f8f80be0acf474705f91d35ceb43dbe6aea83276d32fc53d6ed"
Apr 22 18:01:12.935368 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:12.935192 2527 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmpmk6"
Apr 22 18:01:23.409654 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:23.409618 2527 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-qvmpl"]
Apr 22 18:01:23.410147 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:23.409896 2527 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d05e451a-217c-4be9-80e4-3e2840bc2e55" containerName="util"
Apr 22 18:01:23.410147 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:23.409908 2527 state_mem.go:107] "Deleted CPUSet assignment" podUID="d05e451a-217c-4be9-80e4-3e2840bc2e55" containerName="util"
Apr 22 18:01:23.410147 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:23.409917 2527 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d05e451a-217c-4be9-80e4-3e2840bc2e55" containerName="extract"
Apr 22 18:01:23.410147 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:23.409922 2527 state_mem.go:107] "Deleted CPUSet assignment" podUID="d05e451a-217c-4be9-80e4-3e2840bc2e55" containerName="extract"
Apr 22 18:01:23.410147 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:23.409930 2527 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d05e451a-217c-4be9-80e4-3e2840bc2e55" containerName="pull"
Apr 22 18:01:23.410147 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:23.409935 2527 state_mem.go:107] "Deleted CPUSet assignment" podUID="d05e451a-217c-4be9-80e4-3e2840bc2e55" containerName="pull"
Apr 22 18:01:23.410147 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:23.409973 2527 memory_manager.go:356] "RemoveStaleState removing state" podUID="d05e451a-217c-4be9-80e4-3e2840bc2e55" containerName="extract"
Apr 22 18:01:23.425046 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:23.425019 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-qvmpl"
Apr 22 18:01:23.427430 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:23.427407 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 22 18:01:23.427697 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:23.427679 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 22 18:01:23.428480 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:23.428463 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 22 18:01:23.428745 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:23.428731 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 22 18:01:23.428806 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:23.428730 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-wnbnm\""
Apr 22 18:01:23.435201 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:23.435179 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-qvmpl"]
Apr 22 18:01:23.461166 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:23.461133 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngz8m\" (UniqueName: \"kubernetes.io/projected/2175adf3-3814-465a-a3e3-d5fd894a2ec2-kube-api-access-ngz8m\") pod \"keda-admission-cf49989db-qvmpl\" (UID: \"2175adf3-3814-465a-a3e3-d5fd894a2ec2\") " pod="openshift-keda/keda-admission-cf49989db-qvmpl"
Apr 22 18:01:23.461280 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:23.461168 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2175adf3-3814-465a-a3e3-d5fd894a2ec2-certificates\") pod \"keda-admission-cf49989db-qvmpl\" (UID: \"2175adf3-3814-465a-a3e3-d5fd894a2ec2\") " pod="openshift-keda/keda-admission-cf49989db-qvmpl"
Apr 22 18:01:23.561617 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:23.561585 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ngz8m\" (UniqueName: \"kubernetes.io/projected/2175adf3-3814-465a-a3e3-d5fd894a2ec2-kube-api-access-ngz8m\") pod \"keda-admission-cf49989db-qvmpl\" (UID: \"2175adf3-3814-465a-a3e3-d5fd894a2ec2\") " pod="openshift-keda/keda-admission-cf49989db-qvmpl"
Apr 22 18:01:23.561780 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:23.561627 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2175adf3-3814-465a-a3e3-d5fd894a2ec2-certificates\") pod \"keda-admission-cf49989db-qvmpl\" (UID: \"2175adf3-3814-465a-a3e3-d5fd894a2ec2\") " pod="openshift-keda/keda-admission-cf49989db-qvmpl"
Apr 22 18:01:23.561780 ip-10-0-135-21 kubenswrapper[2527]: E0422 18:01:23.561728 2527 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found
Apr 22 18:01:23.561780 ip-10-0-135-21 kubenswrapper[2527]: E0422 18:01:23.561745 2527 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-qvmpl: secret "keda-admission-webhooks-certs" not found
Apr 22 18:01:23.561906 ip-10-0-135-21 kubenswrapper[2527]: E0422 18:01:23.561804 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2175adf3-3814-465a-a3e3-d5fd894a2ec2-certificates podName:2175adf3-3814-465a-a3e3-d5fd894a2ec2 nodeName:}" failed. No retries permitted until 2026-04-22 18:01:24.061786883 +0000 UTC m=+452.025075931 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2175adf3-3814-465a-a3e3-d5fd894a2ec2-certificates") pod "keda-admission-cf49989db-qvmpl" (UID: "2175adf3-3814-465a-a3e3-d5fd894a2ec2") : secret "keda-admission-webhooks-certs" not found
Apr 22 18:01:23.571257 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:23.571233 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngz8m\" (UniqueName: \"kubernetes.io/projected/2175adf3-3814-465a-a3e3-d5fd894a2ec2-kube-api-access-ngz8m\") pod \"keda-admission-cf49989db-qvmpl\" (UID: \"2175adf3-3814-465a-a3e3-d5fd894a2ec2\") " pod="openshift-keda/keda-admission-cf49989db-qvmpl"
Apr 22 18:01:24.065786 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:24.065745 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2175adf3-3814-465a-a3e3-d5fd894a2ec2-certificates\") pod \"keda-admission-cf49989db-qvmpl\" (UID: \"2175adf3-3814-465a-a3e3-d5fd894a2ec2\") " pod="openshift-keda/keda-admission-cf49989db-qvmpl"
Apr 22 18:01:24.068171 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:24.068142 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2175adf3-3814-465a-a3e3-d5fd894a2ec2-certificates\") pod \"keda-admission-cf49989db-qvmpl\" (UID: \"2175adf3-3814-465a-a3e3-d5fd894a2ec2\") " pod="openshift-keda/keda-admission-cf49989db-qvmpl"
Apr 22 18:01:24.335510 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:24.335412 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-qvmpl"
Apr 22 18:01:24.456884 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:24.456851 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-qvmpl"]
Apr 22 18:01:24.462925 ip-10-0-135-21 kubenswrapper[2527]: W0422 18:01:24.462900 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2175adf3_3814_465a_a3e3_d5fd894a2ec2.slice/crio-6f05160a2f349ca4ca74458427262a8f4f7fbe2ca106ee3c35e2488edb411963 WatchSource:0}: Error finding container 6f05160a2f349ca4ca74458427262a8f4f7fbe2ca106ee3c35e2488edb411963: Status 404 returned error can't find the container with id 6f05160a2f349ca4ca74458427262a8f4f7fbe2ca106ee3c35e2488edb411963
Apr 22 18:01:24.969898 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:24.969862 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-qvmpl" event={"ID":"2175adf3-3814-465a-a3e3-d5fd894a2ec2","Type":"ContainerStarted","Data":"6f05160a2f349ca4ca74458427262a8f4f7fbe2ca106ee3c35e2488edb411963"}
Apr 22 18:01:25.973191 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:25.973099 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-qvmpl" event={"ID":"2175adf3-3814-465a-a3e3-d5fd894a2ec2","Type":"ContainerStarted","Data":"7bceeef322d5f4d1cd078e7134aaa5486f8bce54e420ec73fc904b46c0fd68a3"}
Apr 22 18:01:25.973191 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:25.973182 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-qvmpl"
Apr 22 18:01:25.991336 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:25.991285 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-qvmpl" podStartSLOduration=1.754337604 podStartE2EDuration="2.991270665s" podCreationTimestamp="2026-04-22 18:01:23 +0000 UTC" firstStartedPulling="2026-04-22 18:01:24.463893761 +0000 UTC m=+452.427182812" lastFinishedPulling="2026-04-22 18:01:25.700826825 +0000 UTC m=+453.664115873" observedRunningTime="2026-04-22 18:01:25.989559993 +0000 UTC m=+453.952849066" watchObservedRunningTime="2026-04-22 18:01:25.991270665 +0000 UTC m=+453.954559752"
Apr 22 18:01:46.977804 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:01:46.977768 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-qvmpl"
Apr 22 18:02:28.731927 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:02:28.731889 2527 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-ljzsw"]
Apr 22 18:02:28.734516 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:02:28.734496 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-ljzsw"
Apr 22 18:02:28.736825 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:02:28.736804 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 22 18:02:28.736914 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:02:28.736812 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-7dbdm\""
Apr 22 18:02:28.737868 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:02:28.737848 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 22 18:02:28.737958 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:02:28.737918 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 22 18:02:28.743132 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:02:28.743109 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-ljzsw"]
Apr 22 18:02:28.832531 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:02:28.832490 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ffdd919-820a-411d-9779-6d22ac95b5ad-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-ljzsw\" (UID: \"9ffdd919-820a-411d-9779-6d22ac95b5ad\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-ljzsw"
Apr 22 18:02:28.832722 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:02:28.832553 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npn8m\" (UniqueName: \"kubernetes.io/projected/9ffdd919-820a-411d-9779-6d22ac95b5ad-kube-api-access-npn8m\") pod \"llmisvc-controller-manager-68cc5db7c4-ljzsw\" (UID: \"9ffdd919-820a-411d-9779-6d22ac95b5ad\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-ljzsw"
Apr 22 18:02:28.933446 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:02:28.933363 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ffdd919-820a-411d-9779-6d22ac95b5ad-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-ljzsw\" (UID: \"9ffdd919-820a-411d-9779-6d22ac95b5ad\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-ljzsw"
Apr 22 18:02:28.933446 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:02:28.933408 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-npn8m\" (UniqueName: \"kubernetes.io/projected/9ffdd919-820a-411d-9779-6d22ac95b5ad-kube-api-access-npn8m\") pod \"llmisvc-controller-manager-68cc5db7c4-ljzsw\" (UID: \"9ffdd919-820a-411d-9779-6d22ac95b5ad\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-ljzsw"
Apr 22 18:02:28.933669 ip-10-0-135-21 kubenswrapper[2527]: E0422 18:02:28.933509 2527 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found
Apr 22 18:02:28.933669 ip-10-0-135-21 kubenswrapper[2527]: E0422 18:02:28.933600 2527 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ffdd919-820a-411d-9779-6d22ac95b5ad-cert podName:9ffdd919-820a-411d-9779-6d22ac95b5ad nodeName:}" failed. No retries permitted until 2026-04-22 18:02:29.433584415 +0000 UTC m=+517.396873476 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ffdd919-820a-411d-9779-6d22ac95b5ad-cert") pod "llmisvc-controller-manager-68cc5db7c4-ljzsw" (UID: "9ffdd919-820a-411d-9779-6d22ac95b5ad") : secret "llmisvc-webhook-server-cert" not found
Apr 22 18:02:28.944224 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:02:28.944188 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-npn8m\" (UniqueName: \"kubernetes.io/projected/9ffdd919-820a-411d-9779-6d22ac95b5ad-kube-api-access-npn8m\") pod \"llmisvc-controller-manager-68cc5db7c4-ljzsw\" (UID: \"9ffdd919-820a-411d-9779-6d22ac95b5ad\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-ljzsw"
Apr 22 18:02:29.438197 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:02:29.438167 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ffdd919-820a-411d-9779-6d22ac95b5ad-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-ljzsw\" (UID: \"9ffdd919-820a-411d-9779-6d22ac95b5ad\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-ljzsw"
Apr 22 18:02:29.440517 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:02:29.440494 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ffdd919-820a-411d-9779-6d22ac95b5ad-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-ljzsw\" (UID: \"9ffdd919-820a-411d-9779-6d22ac95b5ad\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-ljzsw"
Apr 22 18:02:29.645234 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:02:29.645191 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-ljzsw"
Apr 22 18:02:29.761727 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:02:29.761688 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-ljzsw"]
Apr 22 18:02:29.764490 ip-10-0-135-21 kubenswrapper[2527]: W0422 18:02:29.764466 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9ffdd919_820a_411d_9779_6d22ac95b5ad.slice/crio-401304cc343e716cf8a302c2f8052bd498033479a5580ccdcdea4c563fdeb97a WatchSource:0}: Error finding container 401304cc343e716cf8a302c2f8052bd498033479a5580ccdcdea4c563fdeb97a: Status 404 returned error can't find the container with id 401304cc343e716cf8a302c2f8052bd498033479a5580ccdcdea4c563fdeb97a
Apr 22 18:02:30.137390 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:02:30.137356 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-ljzsw" event={"ID":"9ffdd919-820a-411d-9779-6d22ac95b5ad","Type":"ContainerStarted","Data":"401304cc343e716cf8a302c2f8052bd498033479a5580ccdcdea4c563fdeb97a"}
Apr 22 18:02:32.144475 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:02:32.144436 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-ljzsw" event={"ID":"9ffdd919-820a-411d-9779-6d22ac95b5ad","Type":"ContainerStarted","Data":"246810dae8659477d5edd7bd27b91334bb48a5b0aa230ec0e326f9bb558233b0"}
Apr 22 18:02:32.144923 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:02:32.144585 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-ljzsw"
Apr 22 18:02:32.161111 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:02:32.161063 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-ljzsw" podStartSLOduration=2.288967358 podStartE2EDuration="4.161049641s" podCreationTimestamp="2026-04-22 18:02:28 +0000 UTC" firstStartedPulling="2026-04-22 18:02:29.766158963 +0000 UTC m=+517.729448014" lastFinishedPulling="2026-04-22 18:02:31.638241246 +0000 UTC m=+519.601530297" observedRunningTime="2026-04-22 18:02:32.159426937 +0000 UTC m=+520.122716007" watchObservedRunningTime="2026-04-22 18:02:32.161049641 +0000 UTC m=+520.124338745"
Apr 22 18:03:03.149765 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:03:03.149678 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-ljzsw"
Apr 22 18:03:37.477893 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:03:37.477847 2527 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-smnz4"]
Apr 22 18:03:37.481008 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:03:37.480988 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-smnz4"
Apr 22 18:03:37.484611 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:03:37.484588 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 22 18:03:37.484611 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:03:37.484607 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-t7zj8\""
Apr 22 18:03:37.492716 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:03:37.492697 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-smnz4"]
Apr 22 18:03:37.537686 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:03:37.537642 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lblc5\" (UniqueName: \"kubernetes.io/projected/b53e2c2a-b246-4e64-b487-8bc267d2f7ad-kube-api-access-lblc5\") pod \"model-serving-api-86f7b4b499-smnz4\" (UID: \"b53e2c2a-b246-4e64-b487-8bc267d2f7ad\") " pod="kserve/model-serving-api-86f7b4b499-smnz4"
Apr 22 18:03:37.537686 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:03:37.537696 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b53e2c2a-b246-4e64-b487-8bc267d2f7ad-tls-certs\") pod \"model-serving-api-86f7b4b499-smnz4\" (UID: \"b53e2c2a-b246-4e64-b487-8bc267d2f7ad\") " pod="kserve/model-serving-api-86f7b4b499-smnz4"
Apr 22 18:03:37.638806 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:03:37.638768 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lblc5\" (UniqueName: \"kubernetes.io/projected/b53e2c2a-b246-4e64-b487-8bc267d2f7ad-kube-api-access-lblc5\") pod \"model-serving-api-86f7b4b499-smnz4\" (UID: \"b53e2c2a-b246-4e64-b487-8bc267d2f7ad\") " pod="kserve/model-serving-api-86f7b4b499-smnz4"
Apr 22 18:03:37.638936 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:03:37.638813 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b53e2c2a-b246-4e64-b487-8bc267d2f7ad-tls-certs\") pod \"model-serving-api-86f7b4b499-smnz4\" (UID: \"b53e2c2a-b246-4e64-b487-8bc267d2f7ad\") " pod="kserve/model-serving-api-86f7b4b499-smnz4"
Apr 22 18:03:37.641120 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:03:37.641091 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b53e2c2a-b246-4e64-b487-8bc267d2f7ad-tls-certs\") pod \"model-serving-api-86f7b4b499-smnz4\" (UID: \"b53e2c2a-b246-4e64-b487-8bc267d2f7ad\") " pod="kserve/model-serving-api-86f7b4b499-smnz4"
Apr 22 18:03:37.648062 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:03:37.648028 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lblc5\" (UniqueName: \"kubernetes.io/projected/b53e2c2a-b246-4e64-b487-8bc267d2f7ad-kube-api-access-lblc5\") pod \"model-serving-api-86f7b4b499-smnz4\" (UID: \"b53e2c2a-b246-4e64-b487-8bc267d2f7ad\") " pod="kserve/model-serving-api-86f7b4b499-smnz4"
Apr 22 18:03:37.790486 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:03:37.790453 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-smnz4"
Apr 22 18:03:37.911607 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:03:37.911562 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-smnz4"]
Apr 22 18:03:37.914405 ip-10-0-135-21 kubenswrapper[2527]: W0422 18:03:37.914373 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb53e2c2a_b246_4e64_b487_8bc267d2f7ad.slice/crio-37a092d1d0f61fe1337bca622ced417f3bc7f06e1890842473023cad17ac3c9f WatchSource:0}: Error finding container 37a092d1d0f61fe1337bca622ced417f3bc7f06e1890842473023cad17ac3c9f: Status 404 returned error can't find the container with id 37a092d1d0f61fe1337bca622ced417f3bc7f06e1890842473023cad17ac3c9f
Apr 22 18:03:38.320812 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:03:38.320775 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-smnz4" event={"ID":"b53e2c2a-b246-4e64-b487-8bc267d2f7ad","Type":"ContainerStarted","Data":"37a092d1d0f61fe1337bca622ced417f3bc7f06e1890842473023cad17ac3c9f"}
Apr 22 18:03:40.136182 ip-10-0-135-21 kubenswrapper[2527]: E0422 18:03:40.136140 2527 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: copying system image from manifest list: reading blob sha256:3eada63909d6f6b6771f9b90a3646d2f225eb56e220dd4614fba7a05a10dfa12: fetching blob: received unexpected HTTP status: 502 Bad Gateway; artifact err: provided artifact is a container image" image="quay.io/opendatahub/odh-model-controller:odh-model-serving-api-fast"
Apr 22 18:03:40.136585 ip-10-0-135-21 kubenswrapper[2527]: E0422 18:03:40.136341 2527 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-controller:odh-model-serving-api-fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{1073741824 0} {} 1Gi BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lblc5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-86f7b4b499-smnz4_kserve(b53e2c2a-b246-4e64-b487-8bc267d2f7ad): ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: reading blob sha256:3eada63909d6f6b6771f9b90a3646d2f225eb56e220dd4614fba7a05a10dfa12: fetching blob: received unexpected HTTP status: 502 Bad Gateway; artifact err: provided artifact is a container image" logger="UnhandledError"
Apr 22 18:03:40.137516 ip-10-0-135-21 kubenswrapper[2527]: E0422 18:03:40.137489 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: copying system image from manifest list: reading blob sha256:3eada63909d6f6b6771f9b90a3646d2f225eb56e220dd4614fba7a05a10dfa12: fetching blob: received unexpected HTTP status: 502 Bad Gateway; artifact err: provided artifact is a container image\"" pod="kserve/model-serving-api-86f7b4b499-smnz4" podUID="b53e2c2a-b246-4e64-b487-8bc267d2f7ad"
Apr 22 18:03:40.328722 ip-10-0-135-21 kubenswrapper[2527]: E0422 18:03:40.328689 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-controller:odh-model-serving-api-fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: reading blob sha256:3eada63909d6f6b6771f9b90a3646d2f225eb56e220dd4614fba7a05a10dfa12: fetching blob: received unexpected HTTP status: 502 Bad Gateway; artifact err: provided artifact is a container image\"" pod="kserve/model-serving-api-86f7b4b499-smnz4" podUID="b53e2c2a-b246-4e64-b487-8bc267d2f7ad"
Apr 22 18:03:52.205544 ip-10-0-135-21 kubenswrapper[2527]: E0422 18:03:52.205493 2527 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/opendatahub/odh-model-controller:odh-model-serving-api-fast: reading manifest sha256:0141deb943a2247c25b2e898e420f7d60323195e9fb799e4db3a115ae873daf8 in quay.io/opendatahub/odh-model-controller: received unexpected HTTP status: 502 Bad Gateway; artifact err: provided artifact is a container image" image="quay.io/opendatahub/odh-model-controller:odh-model-serving-api-fast"
Apr 22 18:03:52.206013 ip-10-0-135-21 kubenswrapper[2527]: E0422 18:03:52.205761 2527 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-controller:odh-model-serving-api-fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{1073741824 0} {} 1Gi BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lblc5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-86f7b4b499-smnz4_kserve(b53e2c2a-b246-4e64-b487-8bc267d2f7ad): ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/opendatahub/odh-model-controller:odh-model-serving-api-fast: reading manifest sha256:0141deb943a2247c25b2e898e420f7d60323195e9fb799e4db3a115ae873daf8 in quay.io/opendatahub/odh-model-controller: received unexpected HTTP status: 502 Bad Gateway; artifact err: provided artifact is a container image" logger="UnhandledError"
Apr 22 18:03:52.206967 ip-10-0-135-21 kubenswrapper[2527]: E0422 18:03:52.206935 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/opendatahub/odh-model-controller:odh-model-serving-api-fast: reading manifest sha256:0141deb943a2247c25b2e898e420f7d60323195e9fb799e4db3a115ae873daf8 in quay.io/opendatahub/odh-model-controller: received unexpected HTTP status: 502 Bad Gateway; artifact err: provided artifact is a container image\"" pod="kserve/model-serving-api-86f7b4b499-smnz4" podUID="b53e2c2a-b246-4e64-b487-8bc267d2f7ad"
Apr 22 18:03:52.589585 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:03:52.589541 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log"
Apr 22 18:03:52.592238 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:03:52.592215 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log"
Apr 22 18:04:02.697854 ip-10-0-135-21 kubenswrapper[2527]: E0422 18:04:02.697824 2527 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-controller:odh-model-serving-api-fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/opendatahub/odh-model-controller:odh-model-serving-api-fast: reading manifest sha256:0141deb943a2247c25b2e898e420f7d60323195e9fb799e4db3a115ae873daf8 in quay.io/opendatahub/odh-model-controller: received unexpected HTTP status: 502 Bad Gateway; artifact err: provided artifact is a container image\"" pod="kserve/model-serving-api-86f7b4b499-smnz4" podUID="b53e2c2a-b246-4e64-b487-8bc267d2f7ad"
Apr 22 18:04:13.090285 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:04:13.090201 2527 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a9320-predictor-86967d664-62wwd"]
Apr 22 18:04:13.093371 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:04:13.093351 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a9320-predictor-86967d664-62wwd"
Apr 22 18:04:13.095757 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:04:13.095738 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-9chzd\""
Apr 22 18:04:13.100611 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:04:13.100590 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a9320-predictor-86967d664-62wwd"]
Apr 22 18:04:13.103800 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:04:13.103781 2527 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a9320-predictor-86967d664-62wwd" Apr 22 18:04:13.220345 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:04:13.220244 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a9320-predictor-86967d664-62wwd"] Apr 22 18:04:13.223003 ip-10-0-135-21 kubenswrapper[2527]: W0422 18:04:13.222975 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fa10e39_c481_48df_8b5d_bacd53bf0bbf.slice/crio-4cf77755a7d17edbfb3a77af21c4d172b8b6c4c4fc30a4b3d432740dd6e8bee7 WatchSource:0}: Error finding container 4cf77755a7d17edbfb3a77af21c4d172b8b6c4c4fc30a4b3d432740dd6e8bee7: Status 404 returned error can't find the container with id 4cf77755a7d17edbfb3a77af21c4d172b8b6c4c4fc30a4b3d432740dd6e8bee7 Apr 22 18:04:13.417921 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:04:13.417829 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a9320-predictor-86967d664-62wwd" event={"ID":"8fa10e39-c481-48df-8b5d-bacd53bf0bbf","Type":"ContainerStarted","Data":"4cf77755a7d17edbfb3a77af21c4d172b8b6c4c4fc30a4b3d432740dd6e8bee7"} Apr 22 18:04:25.463180 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:04:25.463141 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-smnz4" event={"ID":"b53e2c2a-b246-4e64-b487-8bc267d2f7ad","Type":"ContainerStarted","Data":"6e3129fe19e1d803cb17978d95f07e69a7f0bd18c800dd305a793a8894f1583b"} Apr 22 18:04:25.463658 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:04:25.463423 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-smnz4" Apr 22 18:04:25.464451 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:04:25.464424 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a9320-predictor-86967d664-62wwd" 
event={"ID":"8fa10e39-c481-48df-8b5d-bacd53bf0bbf","Type":"ContainerStarted","Data":"8d41368a569e8943ddfe1e4f5f4693d6b3e9164410f60355f3e070fe2b641856"} Apr 22 18:04:25.464616 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:04:25.464600 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-a9320-predictor-86967d664-62wwd" Apr 22 18:04:25.465948 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:04:25.465924 2527 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a9320-predictor-86967d664-62wwd" podUID="8fa10e39-c481-48df-8b5d-bacd53bf0bbf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 22 18:04:25.480074 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:04:25.480033 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-smnz4" podStartSLOduration=1.590131299 podStartE2EDuration="48.480022905s" podCreationTimestamp="2026-04-22 18:03:37 +0000 UTC" firstStartedPulling="2026-04-22 18:03:37.916153983 +0000 UTC m=+585.879443032" lastFinishedPulling="2026-04-22 18:04:24.806045588 +0000 UTC m=+632.769334638" observedRunningTime="2026-04-22 18:04:25.478972293 +0000 UTC m=+633.442261360" watchObservedRunningTime="2026-04-22 18:04:25.480022905 +0000 UTC m=+633.443311974" Apr 22 18:04:25.494952 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:04:25.494913 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-a9320-predictor-86967d664-62wwd" podStartSLOduration=0.87190208 podStartE2EDuration="12.494902052s" podCreationTimestamp="2026-04-22 18:04:13 +0000 UTC" firstStartedPulling="2026-04-22 18:04:13.224847283 +0000 UTC m=+621.188136331" lastFinishedPulling="2026-04-22 18:04:24.84784724 +0000 UTC m=+632.811136303" observedRunningTime="2026-04-22 18:04:25.493343419 +0000 UTC m=+633.456632488" 
watchObservedRunningTime="2026-04-22 18:04:25.494902052 +0000 UTC m=+633.458191122" Apr 22 18:04:26.467990 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:04:26.467954 2527 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a9320-predictor-86967d664-62wwd" podUID="8fa10e39-c481-48df-8b5d-bacd53bf0bbf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 22 18:04:36.468714 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:04:36.468660 2527 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a9320-predictor-86967d664-62wwd" podUID="8fa10e39-c481-48df-8b5d-bacd53bf0bbf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 22 18:04:36.473293 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:04:36.473272 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-smnz4" Apr 22 18:04:46.468946 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:04:46.468893 2527 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a9320-predictor-86967d664-62wwd" podUID="8fa10e39-c481-48df-8b5d-bacd53bf0bbf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 22 18:04:56.468390 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:04:56.468338 2527 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a9320-predictor-86967d664-62wwd" podUID="8fa10e39-c481-48df-8b5d-bacd53bf0bbf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 22 18:05:06.468205 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:05:06.468153 2527 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-a9320-predictor-86967d664-62wwd" podUID="8fa10e39-c481-48df-8b5d-bacd53bf0bbf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 22 18:05:16.469768 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:05:16.469734 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-a9320-predictor-86967d664-62wwd" Apr 22 18:08:52.610331 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:08:52.610250 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log" Apr 22 18:08:52.613780 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:08:52.613760 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log" Apr 22 18:13:52.628793 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:13:52.628761 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log" Apr 22 18:13:52.633237 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:13:52.633218 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log" Apr 22 18:18:52.648554 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:18:52.648426 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log" Apr 22 18:18:52.655431 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:18:52.655408 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log" Apr 22 18:23:52.668148 ip-10-0-135-21 
kubenswrapper[2527]: I0422 18:23:52.668032 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log" Apr 22 18:23:52.675080 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:23:52.675058 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log" Apr 22 18:28:52.686493 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:28:52.686381 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log" Apr 22 18:28:52.697241 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:28:52.697218 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log" Apr 22 18:33:52.709689 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:33:52.709585 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log" Apr 22 18:33:52.717465 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:33:52.717439 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log" Apr 22 18:38:52.728918 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:38:52.728811 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log" Apr 22 18:38:52.736394 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:38:52.736373 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log" Apr 22 18:43:52.747384 ip-10-0-135-21 
kubenswrapper[2527]: I0422 18:43:52.747276 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log" Apr 22 18:43:52.754835 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:43:52.754817 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log" Apr 22 18:48:52.766490 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:48:52.766463 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log" Apr 22 18:48:52.773643 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:48:52.773621 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log" Apr 22 18:53:52.785004 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:53:52.784893 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log" Apr 22 18:53:52.792838 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:53:52.792806 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log" Apr 22 18:58:52.803772 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:58:52.803650 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log" Apr 22 18:58:52.812718 ip-10-0-135-21 kubenswrapper[2527]: I0422 18:58:52.812697 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log" Apr 22 19:03:52.822127 ip-10-0-135-21 
kubenswrapper[2527]: I0422 19:03:52.822025 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log" Apr 22 19:03:52.831503 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:03:52.831478 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log" Apr 22 19:04:32.204772 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:32.204690 2527 ???:1] "http: TLS handshake error from 10.0.143.11:34756: EOF" Apr 22 19:04:32.207317 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:32.207296 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:32.682552 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:32.682518 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:33.151926 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:33.151896 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:33.623551 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:33.623523 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:34.083059 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:34.083028 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 
19:04:34.535724 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:34.535700 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:34.981378 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:34.981293 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:35.435156 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:35.435127 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:35.877946 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:35.877912 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:36.322785 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:36.322756 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:36.765539 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:36.765511 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:37.217745 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:37.217654 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:37.659198 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:37.659174 2527 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:38.132806 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:38.132779 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:38.612207 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:38.612185 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:39.112054 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:39.112025 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:39.725406 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:39.725377 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:40.174238 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:40.174209 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:40.608743 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:40.608716 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:41.056772 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:41.056748 2527 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:41.526751 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:41.526728 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:41.992946 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:41.992869 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:42.445982 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:42.445955 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:42.903860 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:42.903832 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:43.353490 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:43.353463 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:43.816806 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:43.816779 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:44.323422 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:44.323395 2527 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:44.790533 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:44.790504 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:45.255839 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:45.255819 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:45.692236 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:45.692157 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:46.173407 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:46.173377 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:46.608707 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:46.608681 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:47.045444 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:47.045413 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:47.507151 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:47.507126 2527 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:47.943772 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:47.943691 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-a9320-predictor-86967d664-62wwd_8fa10e39-c481-48df-8b5d-bacd53bf0bbf/kserve-container/0.log" Apr 22 19:04:49.825599 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:49.825546 2527 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-862r5/must-gather-szz9f"] Apr 22 19:04:49.828694 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:49.828676 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-862r5/must-gather-szz9f" Apr 22 19:04:49.831199 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:49.831179 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-862r5\"/\"kube-root-ca.crt\"" Apr 22 19:04:49.831314 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:49.831221 2527 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-862r5\"/\"openshift-service-ca.crt\"" Apr 22 19:04:49.832177 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:49.832163 2527 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-862r5\"/\"default-dockercfg-dzdgq\"" Apr 22 19:04:49.835799 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:49.835633 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-862r5/must-gather-szz9f"] Apr 22 19:04:50.006885 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:50.006840 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fbd596ae-e1ee-4db7-9a6d-d22de94cded9-must-gather-output\") pod \"must-gather-szz9f\" 
(UID: \"fbd596ae-e1ee-4db7-9a6d-d22de94cded9\") " pod="openshift-must-gather-862r5/must-gather-szz9f" Apr 22 19:04:50.007069 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:50.006905 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwlb4\" (UniqueName: \"kubernetes.io/projected/fbd596ae-e1ee-4db7-9a6d-d22de94cded9-kube-api-access-bwlb4\") pod \"must-gather-szz9f\" (UID: \"fbd596ae-e1ee-4db7-9a6d-d22de94cded9\") " pod="openshift-must-gather-862r5/must-gather-szz9f" Apr 22 19:04:50.108159 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:50.108068 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bwlb4\" (UniqueName: \"kubernetes.io/projected/fbd596ae-e1ee-4db7-9a6d-d22de94cded9-kube-api-access-bwlb4\") pod \"must-gather-szz9f\" (UID: \"fbd596ae-e1ee-4db7-9a6d-d22de94cded9\") " pod="openshift-must-gather-862r5/must-gather-szz9f" Apr 22 19:04:50.108159 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:50.108145 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fbd596ae-e1ee-4db7-9a6d-d22de94cded9-must-gather-output\") pod \"must-gather-szz9f\" (UID: \"fbd596ae-e1ee-4db7-9a6d-d22de94cded9\") " pod="openshift-must-gather-862r5/must-gather-szz9f" Apr 22 19:04:50.108445 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:50.108431 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fbd596ae-e1ee-4db7-9a6d-d22de94cded9-must-gather-output\") pod \"must-gather-szz9f\" (UID: \"fbd596ae-e1ee-4db7-9a6d-d22de94cded9\") " pod="openshift-must-gather-862r5/must-gather-szz9f" Apr 22 19:04:50.116551 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:50.116530 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwlb4\" (UniqueName: 
\"kubernetes.io/projected/fbd596ae-e1ee-4db7-9a6d-d22de94cded9-kube-api-access-bwlb4\") pod \"must-gather-szz9f\" (UID: \"fbd596ae-e1ee-4db7-9a6d-d22de94cded9\") " pod="openshift-must-gather-862r5/must-gather-szz9f"
Apr 22 19:04:50.138179 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:50.138152 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-862r5/must-gather-szz9f"
Apr 22 19:04:50.257764 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:50.257725 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-862r5/must-gather-szz9f"]
Apr 22 19:04:50.260671 ip-10-0-135-21 kubenswrapper[2527]: W0422 19:04:50.260638 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbd596ae_e1ee_4db7_9a6d_d22de94cded9.slice/crio-46aa7158849f353972cd2165d24d65b70d49e63cf57dad8e040f282e7a8c4a94 WatchSource:0}: Error finding container 46aa7158849f353972cd2165d24d65b70d49e63cf57dad8e040f282e7a8c4a94: Status 404 returned error can't find the container with id 46aa7158849f353972cd2165d24d65b70d49e63cf57dad8e040f282e7a8c4a94
Apr 22 19:04:50.262543 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:50.262529 2527 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:04:50.310913 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:50.310877 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-862r5/must-gather-szz9f" event={"ID":"fbd596ae-e1ee-4db7-9a6d-d22de94cded9","Type":"ContainerStarted","Data":"46aa7158849f353972cd2165d24d65b70d49e63cf57dad8e040f282e7a8c4a94"}
Apr 22 19:04:51.316211 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:51.316153 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-862r5/must-gather-szz9f" event={"ID":"fbd596ae-e1ee-4db7-9a6d-d22de94cded9","Type":"ContainerStarted","Data":"f94d97bdb99784ef5997b3d7dce99d26c0713a9c1cbe643abb4095d7a2aa8ade"}
Apr 22 19:04:52.321896 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:52.321857 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-862r5/must-gather-szz9f" event={"ID":"fbd596ae-e1ee-4db7-9a6d-d22de94cded9","Type":"ContainerStarted","Data":"f86fdc517e740da83cc43ade3dd4587bd7ab657fa4753154272076c8b0fd05f8"}
Apr 22 19:04:52.337864 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:52.337813 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-862r5/must-gather-szz9f" podStartSLOduration=2.467068579 podStartE2EDuration="3.337797171s" podCreationTimestamp="2026-04-22 19:04:49 +0000 UTC" firstStartedPulling="2026-04-22 19:04:50.262680076 +0000 UTC m=+4258.225969124" lastFinishedPulling="2026-04-22 19:04:51.133408667 +0000 UTC m=+4259.096697716" observedRunningTime="2026-04-22 19:04:52.336953155 +0000 UTC m=+4260.300242231" watchObservedRunningTime="2026-04-22 19:04:52.337797171 +0000 UTC m=+4260.301086240"
Apr 22 19:04:52.579770 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:52.579694 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-rh7h6_4424ac43-8016-4c60-8b3b-e1b48a277a63/global-pull-secret-syncer/0.log"
Apr 22 19:04:52.621549 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:52.621519 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-9z422_22e8e337-8adf-4261-832e-41ba662e7747/konnectivity-agent/0.log"
Apr 22 19:04:52.760531 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:52.760504 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-21.ec2.internal_7c86caefdc1dda70cffe5878d33236b6/haproxy/0.log"
Apr 22 19:04:56.195420 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:56.195393 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-9b5c68cb9-8dr7k_7c9a4cc5-cc72-4918-aef8-6fb7a25e791a/metrics-server/0.log"
Apr 22 19:04:56.220850 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:56.220818 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-kn55x_f83efb09-abff-4bea-897d-73a5f78393b6/monitoring-plugin/0.log"
Apr 22 19:04:56.335260 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:56.335220 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-955cw_cca0b693-77d9-4958-aa18-7035cae26b0f/node-exporter/0.log"
Apr 22 19:04:56.357591 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:56.357543 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-955cw_cca0b693-77d9-4958-aa18-7035cae26b0f/kube-rbac-proxy/0.log"
Apr 22 19:04:56.378640 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:56.378607 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-955cw_cca0b693-77d9-4958-aa18-7035cae26b0f/init-textfile/0.log"
Apr 22 19:04:56.491713 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:56.491649 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-b9dt5_e5593f1c-81f8-4b31-99a7-9d09d22babb0/kube-rbac-proxy-main/0.log"
Apr 22 19:04:56.516752 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:56.516722 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-b9dt5_e5593f1c-81f8-4b31-99a7-9d09d22babb0/kube-rbac-proxy-self/0.log"
Apr 22 19:04:56.538506 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:56.538475 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-b9dt5_e5593f1c-81f8-4b31-99a7-9d09d22babb0/openshift-state-metrics/0.log"
Apr 22 19:04:56.794885 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:56.794852 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-7zlkb_359d1847-f848-44fb-bc84-c8d778791fa3/prometheus-operator-admission-webhook/0.log"
Apr 22 19:04:59.721281 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:59.721247 2527 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-862r5/perf-node-gather-daemonset-jz4d4"]
Apr 22 19:04:59.725832 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:59.725810 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-862r5/perf-node-gather-daemonset-jz4d4"
Apr 22 19:04:59.733950 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:59.733919 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-862r5/perf-node-gather-daemonset-jz4d4"]
Apr 22 19:04:59.790699 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:59.790656 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4f9d1cd5-9779-4839-b784-86df7f15fe4e-lib-modules\") pod \"perf-node-gather-daemonset-jz4d4\" (UID: \"4f9d1cd5-9779-4839-b784-86df7f15fe4e\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-jz4d4"
Apr 22 19:04:59.790888 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:59.790809 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4f9d1cd5-9779-4839-b784-86df7f15fe4e-sys\") pod \"perf-node-gather-daemonset-jz4d4\" (UID: \"4f9d1cd5-9779-4839-b784-86df7f15fe4e\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-jz4d4"
Apr 22 19:04:59.790888 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:59.790851 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krb2h\" (UniqueName: \"kubernetes.io/projected/4f9d1cd5-9779-4839-b784-86df7f15fe4e-kube-api-access-krb2h\") pod \"perf-node-gather-daemonset-jz4d4\" (UID: \"4f9d1cd5-9779-4839-b784-86df7f15fe4e\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-jz4d4"
Apr 22 19:04:59.791002 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:59.790893 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4f9d1cd5-9779-4839-b784-86df7f15fe4e-podres\") pod \"perf-node-gather-daemonset-jz4d4\" (UID: \"4f9d1cd5-9779-4839-b784-86df7f15fe4e\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-jz4d4"
Apr 22 19:04:59.791002 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:59.790916 2527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4f9d1cd5-9779-4839-b784-86df7f15fe4e-proc\") pod \"perf-node-gather-daemonset-jz4d4\" (UID: \"4f9d1cd5-9779-4839-b784-86df7f15fe4e\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-jz4d4"
Apr 22 19:04:59.891629 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:59.891592 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4f9d1cd5-9779-4839-b784-86df7f15fe4e-sys\") pod \"perf-node-gather-daemonset-jz4d4\" (UID: \"4f9d1cd5-9779-4839-b784-86df7f15fe4e\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-jz4d4"
Apr 22 19:04:59.891629 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:59.891634 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krb2h\" (UniqueName: \"kubernetes.io/projected/4f9d1cd5-9779-4839-b784-86df7f15fe4e-kube-api-access-krb2h\") pod \"perf-node-gather-daemonset-jz4d4\" (UID: \"4f9d1cd5-9779-4839-b784-86df7f15fe4e\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-jz4d4"
Apr 22 19:04:59.891888 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:59.891666 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4f9d1cd5-9779-4839-b784-86df7f15fe4e-podres\") pod \"perf-node-gather-daemonset-jz4d4\" (UID: \"4f9d1cd5-9779-4839-b784-86df7f15fe4e\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-jz4d4"
Apr 22 19:04:59.891888 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:59.891689 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4f9d1cd5-9779-4839-b784-86df7f15fe4e-proc\") pod \"perf-node-gather-daemonset-jz4d4\" (UID: \"4f9d1cd5-9779-4839-b784-86df7f15fe4e\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-jz4d4"
Apr 22 19:04:59.891888 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:59.891723 2527 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4f9d1cd5-9779-4839-b784-86df7f15fe4e-lib-modules\") pod \"perf-node-gather-daemonset-jz4d4\" (UID: \"4f9d1cd5-9779-4839-b784-86df7f15fe4e\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-jz4d4"
Apr 22 19:04:59.891888 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:59.891866 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4f9d1cd5-9779-4839-b784-86df7f15fe4e-lib-modules\") pod \"perf-node-gather-daemonset-jz4d4\" (UID: \"4f9d1cd5-9779-4839-b784-86df7f15fe4e\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-jz4d4"
Apr 22 19:04:59.892104 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:59.891950 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4f9d1cd5-9779-4839-b784-86df7f15fe4e-podres\") pod \"perf-node-gather-daemonset-jz4d4\" (UID: \"4f9d1cd5-9779-4839-b784-86df7f15fe4e\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-jz4d4"
Apr 22 19:04:59.892104 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:59.892005 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4f9d1cd5-9779-4839-b784-86df7f15fe4e-proc\") pod \"perf-node-gather-daemonset-jz4d4\" (UID: \"4f9d1cd5-9779-4839-b784-86df7f15fe4e\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-jz4d4"
Apr 22 19:04:59.892104 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:59.892018 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4f9d1cd5-9779-4839-b784-86df7f15fe4e-sys\") pod \"perf-node-gather-daemonset-jz4d4\" (UID: \"4f9d1cd5-9779-4839-b784-86df7f15fe4e\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-jz4d4"
Apr 22 19:04:59.900809 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:04:59.900779 2527 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-krb2h\" (UniqueName: \"kubernetes.io/projected/4f9d1cd5-9779-4839-b784-86df7f15fe4e-kube-api-access-krb2h\") pod \"perf-node-gather-daemonset-jz4d4\" (UID: \"4f9d1cd5-9779-4839-b784-86df7f15fe4e\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-jz4d4"
Apr 22 19:05:00.038393 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:00.038344 2527 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-862r5/perf-node-gather-daemonset-jz4d4"
Apr 22 19:05:00.167834 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:00.167794 2527 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-862r5/perf-node-gather-daemonset-jz4d4"]
Apr 22 19:05:00.171110 ip-10-0-135-21 kubenswrapper[2527]: W0422 19:05:00.171077 2527 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4f9d1cd5_9779_4839_b784_86df7f15fe4e.slice/crio-c872b0dc8b467e29511c863b0f6cc208d674e0e8dc1385a6d79e64d9cbb5e72f WatchSource:0}: Error finding container c872b0dc8b467e29511c863b0f6cc208d674e0e8dc1385a6d79e64d9cbb5e72f: Status 404 returned error can't find the container with id c872b0dc8b467e29511c863b0f6cc208d674e0e8dc1385a6d79e64d9cbb5e72f
Apr 22 19:05:00.310879 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:00.310798 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-24hgm_3668ec4c-282c-4be6-a5d4-90efe74d754b/dns/0.log"
Apr 22 19:05:00.337741 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:00.337708 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-24hgm_3668ec4c-282c-4be6-a5d4-90efe74d754b/kube-rbac-proxy/0.log"
Apr 22 19:05:00.349549 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:00.349519 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-862r5/perf-node-gather-daemonset-jz4d4" event={"ID":"4f9d1cd5-9779-4839-b784-86df7f15fe4e","Type":"ContainerStarted","Data":"a7f5ec708b12d0f40c760ddc66784d1804f6e03384993323a8326e8855546c86"}
Apr 22 19:05:00.349712 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:00.349555 2527 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-862r5/perf-node-gather-daemonset-jz4d4" event={"ID":"4f9d1cd5-9779-4839-b784-86df7f15fe4e","Type":"ContainerStarted","Data":"c872b0dc8b467e29511c863b0f6cc208d674e0e8dc1385a6d79e64d9cbb5e72f"}
Apr 22 19:05:00.349712 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:00.349605 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-862r5/perf-node-gather-daemonset-jz4d4"
Apr 22 19:05:00.372223 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:00.372178 2527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-862r5/perf-node-gather-daemonset-jz4d4" podStartSLOduration=1.3721649 podStartE2EDuration="1.3721649s" podCreationTimestamp="2026-04-22 19:04:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:05:00.370010211 +0000 UTC m=+4268.333299282" watchObservedRunningTime="2026-04-22 19:05:00.3721649 +0000 UTC m=+4268.335453969"
Apr 22 19:05:00.531939 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:00.531912 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qqd9n_71ff672a-e8ff-4269-8be1-4e451a7046df/dns-node-resolver/0.log"
Apr 22 19:05:01.121286 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:01.121236 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zgpsv_bea09b4b-5a52-4d36-968d-10c68f944bee/node-ca/0.log"
Apr 22 19:05:02.242064 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:02.242031 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-jlfdb_19df5bb9-b377-4d57-ac3d-bdf63a378385/serve-healthcheck-canary/0.log"
Apr 22 19:05:02.709149 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:02.709118 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-97npl_cbbe9b18-4b0b-48db-aed3-04d33d5da7ba/kube-rbac-proxy/0.log"
Apr 22 19:05:02.731840 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:02.731813 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-97npl_cbbe9b18-4b0b-48db-aed3-04d33d5da7ba/exporter/0.log"
Apr 22 19:05:02.753878 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:02.753845 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-97npl_cbbe9b18-4b0b-48db-aed3-04d33d5da7ba/extractor/0.log"
Apr 22 19:05:04.781680 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:04.781655 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-ljzsw_9ffdd919-820a-411d-9779-6d22ac95b5ad/manager/0.log"
Apr 22 19:05:04.803392 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:04.803369 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-smnz4_b53e2c2a-b246-4e64-b487-8bc267d2f7ad/server/0.log"
Apr 22 19:05:06.362247 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:06.362221 2527 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-862r5/perf-node-gather-daemonset-jz4d4"
Apr 22 19:05:10.723193 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:10.723167 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q7h2l_5e78c417-84fe-4436-91ea-459e917e9154/kube-multus-additional-cni-plugins/0.log"
Apr 22 19:05:10.745449 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:10.745423 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q7h2l_5e78c417-84fe-4436-91ea-459e917e9154/egress-router-binary-copy/0.log"
Apr 22 19:05:10.768994 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:10.768968 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q7h2l_5e78c417-84fe-4436-91ea-459e917e9154/cni-plugins/0.log"
Apr 22 19:05:10.789091 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:10.789065 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q7h2l_5e78c417-84fe-4436-91ea-459e917e9154/bond-cni-plugin/0.log"
Apr 22 19:05:10.816123 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:10.816088 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q7h2l_5e78c417-84fe-4436-91ea-459e917e9154/routeoverride-cni/0.log"
Apr 22 19:05:10.836438 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:10.836413 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q7h2l_5e78c417-84fe-4436-91ea-459e917e9154/whereabouts-cni-bincopy/0.log"
Apr 22 19:05:10.857393 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:10.857370 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q7h2l_5e78c417-84fe-4436-91ea-459e917e9154/whereabouts-cni/0.log"
Apr 22 19:05:11.112390 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:11.112360 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hv7bb_afaf22ec-e8aa-448d-8007-91b393b66c96/kube-multus/0.log"
Apr 22 19:05:11.178328 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:11.178286 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4pcg6_2e2c7888-11c8-4eb9-af23-d748c79bd567/network-metrics-daemon/0.log"
Apr 22 19:05:11.199381 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:11.199356 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4pcg6_2e2c7888-11c8-4eb9-af23-d748c79bd567/kube-rbac-proxy/0.log"
Apr 22 19:05:11.932979 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:11.932953 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-controller/0.log"
Apr 22 19:05:11.952901 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:11.952848 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/0.log"
Apr 22 19:05:11.973647 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:11.973626 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovn-acl-logging/1.log"
Apr 22 19:05:11.992822 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:11.992794 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/kube-rbac-proxy-node/0.log"
Apr 22 19:05:12.015834 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:12.015813 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 19:05:12.032381 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:12.032350 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/northd/0.log"
Apr 22 19:05:12.054824 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:12.054803 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/nbdb/0.log"
Apr 22 19:05:12.074212 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:12.074193 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/sbdb/0.log"
Apr 22 19:05:12.187540 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:12.187462 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2hklb_b7681f6e-b80a-4387-bdf9-03efc8cdbd1c/ovnkube-controller/0.log"
Apr 22 19:05:13.959284 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:13.959252 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-qbfw5_758f2892-7f0d-40ae-a65b-7d7ce661548b/network-check-target-container/0.log"
Apr 22 19:05:14.882425 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:14.882398 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-9sxr8_78f93c71-12e7-43ba-9997-9357c2fcdd1c/iptables-alerter/0.log"
Apr 22 19:05:15.585409 ip-10-0-135-21 kubenswrapper[2527]: I0422 19:05:15.585374 2527 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-k5st4_720f417e-e0c4-40a9-a38a-c8e1907dbe5b/tuned/0.log"