Apr 22 18:33:50.667863 ip-10-0-132-204 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 18:33:50.667874 ip-10-0-132-204 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 18:33:50.667882 ip-10-0-132-204 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 18:33:50.668160 ip-10-0-132-204 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 18:34:00.794690 ip-10-0-132-204 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 18:34:00.794705 ip-10-0-132-204 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot bbee1c90963f4bc5808e6b9690122dd9 --
Apr 22 18:36:13.860965 ip-10-0-132-204 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:36:14.335240 ip-10-0-132-204 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:36:14.335240 ip-10-0-132-204 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:36:14.335240 ip-10-0-132-204 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:36:14.335240 ip-10-0-132-204 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:36:14.335240 ip-10-0-132-204 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:36:14.339080 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.338979    2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 18:36:14.343212 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343193    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:36:14.343212 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343212    2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:36:14.343281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343216    2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:36:14.343281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343220    2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:36:14.343281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343223    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:36:14.343281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343226    2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:36:14.343281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343230    2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:36:14.343281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343233    2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:36:14.343281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343235    2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:36:14.343281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343238    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:36:14.343281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343241    2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:36:14.343281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343244    2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:36:14.343281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343247    2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:36:14.343281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343249    2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:36:14.343281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343252    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:36:14.343281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343254    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:36:14.343281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343257    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:36:14.343281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343260    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:36:14.343281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343262    2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:36:14.343281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343265    2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:36:14.343281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343268    2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:36:14.343281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343276    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:36:14.343767 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343279    2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:36:14.343767 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343282    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:36:14.343767 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343285    2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:36:14.343767 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343288    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:36:14.343767 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343291    2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:36:14.343767 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343294    2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:36:14.343767 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343296    2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:36:14.343767 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343299    2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:36:14.343767 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343302    2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:36:14.343767 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343304    2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:36:14.343767 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343307    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:36:14.343767 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343310    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:36:14.343767 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343312    2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:36:14.343767 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343315    2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:36:14.343767 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343318    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:36:14.343767 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343321    2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:36:14.343767 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343323    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:36:14.343767 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343326    2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:36:14.343767 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343328    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:36:14.344220 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343331    2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:36:14.344220 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343333    2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:36:14.344220 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343336    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:36:14.344220 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343338    2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:36:14.344220 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343341    2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:36:14.344220 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343343    2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:36:14.344220 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343346    2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:36:14.344220 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343349    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:36:14.344220 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343352    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:36:14.344220 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343354    2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:36:14.344220 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343357    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:36:14.344220 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343361    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:36:14.344220 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343364    2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:36:14.344220 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343367    2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:36:14.344220 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343369    2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:36:14.344220 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343372    2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:36:14.344220 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343375    2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:36:14.344220 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343379    2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:36:14.344220 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343381    2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:36:14.344220 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343384    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:36:14.344766 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343386    2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:36:14.344766 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343389    2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:36:14.344766 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343391    2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:36:14.344766 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343396    2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:36:14.344766 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343400    2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:36:14.344766 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343403    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:36:14.344766 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343406    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:36:14.344766 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343409    2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:36:14.344766 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343411    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:36:14.344766 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343414    2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:36:14.344766 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343417    2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:36:14.344766 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343420    2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:36:14.344766 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343423    2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:36:14.344766 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343426    2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:36:14.344766 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343429    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:36:14.344766 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343432    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:36:14.344766 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343434    2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:36:14.344766 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343438    2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:36:14.344766 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343442    2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:36:14.345223 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343446    2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:36:14.345223 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343449    2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:36:14.345223 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343452    2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:36:14.345223 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343456    2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:36:14.345223 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343459    2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:36:14.345223 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343461    2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:36:14.345223 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343915    2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:36:14.345223 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343921    2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:36:14.345223 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343925    2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:36:14.345223 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343928    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:36:14.345223 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343931    2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:36:14.345223 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343934    2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:36:14.345223 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343937    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:36:14.345223 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343940    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:36:14.345223 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343943    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:36:14.345223 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343945    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:36:14.345223 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343948    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:36:14.345223 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343950    2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:36:14.345223 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343953    2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:36:14.345223 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343956    2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:36:14.345723 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343959    2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:36:14.345723 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343961    2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:36:14.345723 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343964    2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:36:14.345723 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343967    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:36:14.345723 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343970    2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:36:14.345723 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343973    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:36:14.345723 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343976    2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:36:14.345723 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343978    2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:36:14.345723 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343981    2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:36:14.345723 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343983    2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:36:14.345723 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343986    2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:36:14.345723 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343989    2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:36:14.345723 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343992    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:36:14.345723 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343994    2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:36:14.345723 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.343997    2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:36:14.345723 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344000    2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:36:14.345723 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344002    2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:36:14.345723 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344005    2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:36:14.345723 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344008    2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:36:14.345723 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344013    2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:36:14.346304 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344015    2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:36:14.346304 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344018    2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:36:14.346304 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344021    2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:36:14.346304 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344023    2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:36:14.346304 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344026    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:36:14.346304 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344028    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:36:14.346304 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344032    2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:36:14.346304 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344034    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:36:14.346304 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344037    2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:36:14.346304 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344039    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:36:14.346304 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344042    2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:36:14.346304 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344044    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:36:14.346304 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344047    2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:36:14.346304 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344050    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:36:14.346304 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344052    2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:36:14.346304 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344057    2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:36:14.346304 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344061    2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:36:14.346304 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344064    2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:36:14.346304 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344067    2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:36:14.346791 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344070    2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:36:14.346791 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344073    2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:36:14.346791 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344077    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:36:14.346791 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344080    2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:36:14.346791 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344082    2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:36:14.346791 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344085    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:36:14.346791 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344088    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:36:14.346791 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344090    2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:36:14.346791 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344093    2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:36:14.346791 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344095    2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:36:14.346791 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344098    2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:36:14.346791 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344101    2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:36:14.346791 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344104    2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:36:14.346791 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344108    2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:36:14.346791 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344111    2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:36:14.346791 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344113    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:36:14.346791 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344116    2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:36:14.346791 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344118    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:36:14.346791 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344121    2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:36:14.346791 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344123    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:36:14.347281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344127    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:36:14.347281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344130    2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:36:14.347281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344133    2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:36:14.347281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344135    2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:36:14.347281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344138    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:36:14.347281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344141    2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:36:14.347281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344143    2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:36:14.347281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344146    2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:36:14.347281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344149    2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:36:14.347281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344152    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:36:14.347281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344154    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:36:14.347281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344157    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:36:14.347281 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344159    2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:36:14.347281 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344239    2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 18:36:14.347281 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344247    2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 18:36:14.347281 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344253    2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 18:36:14.347281 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344258    2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 18:36:14.347281 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344262    2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 18:36:14.347281 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344265    2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 18:36:14.347281 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344270    2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 18:36:14.347281 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344275    2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 18:36:14.347802 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344278    2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 18:36:14.347802 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344281    2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 18:36:14.347802 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344285    2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 18:36:14.347802 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344288    2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 18:36:14.347802 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344292    2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 18:36:14.347802 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344295    2577 flags.go:64] FLAG: --cgroup-root=""
Apr 22 18:36:14.347802 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344299    2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 18:36:14.347802 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344302    2577 flags.go:64] FLAG: --client-ca-file=""
Apr 22 18:36:14.347802 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344304    2577 flags.go:64] FLAG: --cloud-config=""
Apr 22 18:36:14.347802 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344308    2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 18:36:14.347802 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344311    2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 18:36:14.347802 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344316    2577 flags.go:64] FLAG: --cluster-domain=""
Apr 22 18:36:14.347802 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344319    2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 18:36:14.347802 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344322    2577 flags.go:64] FLAG: --config-dir=""
Apr 22 18:36:14.347802 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344325    2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 18:36:14.347802 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344328    2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 18:36:14.347802 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344332    2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 18:36:14.347802 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344336    2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 18:36:14.347802 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344339    2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 18:36:14.347802 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344342    2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 18:36:14.347802 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344346    2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 18:36:14.347802 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344348    2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 18:36:14.347802 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344352    2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 18:36:14.347802 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344355    2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 18:36:14.347802 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344358    2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 18:36:14.348387 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344362    2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 18:36:14.348387 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344365    2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 18:36:14.348387 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344368    2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 18:36:14.348387 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344371    2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 18:36:14.348387 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344374    2577 flags.go:64] FLAG: --enable-server="true"
Apr 22 18:36:14.348387 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344377    2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 18:36:14.348387 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344382    2577 flags.go:64] FLAG: --event-burst="100"
Apr 22 18:36:14.348387 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344385    2577 flags.go:64] FLAG: --event-qps="50"
Apr 22 18:36:14.348387 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344388    2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 18:36:14.348387 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344391    2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 18:36:14.348387 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344394    2577 flags.go:64] FLAG: --eviction-hard=""
Apr 22 18:36:14.348387 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344398    2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 18:36:14.348387 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344401    2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 18:36:14.348387 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344405    2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 18:36:14.348387 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344408    2577 flags.go:64] FLAG: --eviction-soft=""
Apr 22 18:36:14.348387 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344411    2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 18:36:14.348387 
ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344413 2577 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 22 18:36:14.348387 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344416 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 18:36:14.348387 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344419 2577 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 18:36:14.348387 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344422 2577 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 18:36:14.348387 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344425 2577 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 18:36:14.348387 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344428 2577 flags.go:64] FLAG: --feature-gates="" Apr 22 18:36:14.348387 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344432 2577 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 18:36:14.348387 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344435 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 18:36:14.348387 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344439 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 18:36:14.348985 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344442 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 18:36:14.348985 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344445 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 22 18:36:14.348985 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344448 2577 flags.go:64] FLAG: --help="false" Apr 22 18:36:14.348985 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344451 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-132-204.ec2.internal" Apr 22 18:36:14.348985 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344455 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 18:36:14.348985 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344458 2577 flags.go:64] 
FLAG: --http-check-frequency="20s" Apr 22 18:36:14.348985 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344462 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 18:36:14.348985 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344465 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 18:36:14.348985 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344469 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 18:36:14.348985 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344473 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 18:36:14.348985 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344475 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 18:36:14.348985 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344478 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 18:36:14.348985 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344481 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 18:36:14.348985 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344484 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 18:36:14.348985 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344488 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 18:36:14.348985 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344491 2577 flags.go:64] FLAG: --kube-reserved="" Apr 22 18:36:14.348985 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344494 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 18:36:14.348985 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344497 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 18:36:14.348985 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344500 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 18:36:14.348985 ip-10-0-132-204 
kubenswrapper[2577]: I0422 18:36:14.344502 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 18:36:14.348985 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344505 2577 flags.go:64] FLAG: --lock-file="" Apr 22 18:36:14.348985 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344509 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 18:36:14.348985 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344512 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 18:36:14.348985 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344515 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 18:36:14.349613 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344521 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 18:36:14.349613 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344524 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 18:36:14.349613 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344527 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 18:36:14.349613 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344530 2577 flags.go:64] FLAG: --logging-format="text" Apr 22 18:36:14.349613 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344532 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 18:36:14.349613 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344536 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 18:36:14.349613 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344539 2577 flags.go:64] FLAG: --manifest-url="" Apr 22 18:36:14.349613 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344542 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 22 18:36:14.349613 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344546 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 18:36:14.349613 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344550 2577 flags.go:64] FLAG: 
--max-open-files="1000000" Apr 22 18:36:14.349613 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344554 2577 flags.go:64] FLAG: --max-pods="110" Apr 22 18:36:14.349613 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344557 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 18:36:14.349613 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344560 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 18:36:14.349613 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344564 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 18:36:14.349613 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344567 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 18:36:14.349613 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344570 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 18:36:14.349613 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344573 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 18:36:14.349613 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344576 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 18:36:14.349613 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344584 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 18:36:14.349613 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344586 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 18:36:14.349613 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344589 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 18:36:14.349613 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344593 2577 flags.go:64] FLAG: --pod-cidr="" Apr 22 18:36:14.349613 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344596 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 18:36:14.350170 ip-10-0-132-204 
kubenswrapper[2577]: I0422 18:36:14.344602 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 18:36:14.350170 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344605 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 18:36:14.350170 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344608 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 22 18:36:14.350170 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344611 2577 flags.go:64] FLAG: --port="10250" Apr 22 18:36:14.350170 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344614 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 18:36:14.350170 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344617 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e77dd9afd3299e01" Apr 22 18:36:14.350170 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344620 2577 flags.go:64] FLAG: --qos-reserved="" Apr 22 18:36:14.350170 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344624 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 22 18:36:14.350170 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344627 2577 flags.go:64] FLAG: --register-node="true" Apr 22 18:36:14.350170 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344630 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 22 18:36:14.350170 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344633 2577 flags.go:64] FLAG: --register-with-taints="" Apr 22 18:36:14.350170 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344637 2577 flags.go:64] FLAG: --registry-burst="10" Apr 22 18:36:14.350170 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344640 2577 flags.go:64] FLAG: --registry-qps="5" Apr 22 18:36:14.350170 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344643 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 22 18:36:14.350170 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344646 2577 flags.go:64] FLAG: --reserved-memory="" Apr 22 18:36:14.350170 ip-10-0-132-204 kubenswrapper[2577]: I0422 
18:36:14.344662 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 18:36:14.350170 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344666 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 18:36:14.350170 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344669 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 18:36:14.350170 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344672 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 18:36:14.350170 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344675 2577 flags.go:64] FLAG: --runonce="false" Apr 22 18:36:14.350170 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344678 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 18:36:14.350170 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344682 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 18:36:14.350170 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344685 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 22 18:36:14.350170 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344688 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 18:36:14.350170 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344692 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 18:36:14.350170 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344695 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 18:36:14.350800 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344698 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 18:36:14.350800 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344702 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 18:36:14.350800 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344705 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 18:36:14.350800 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344708 2577 flags.go:64] FLAG: 
--storage-driver-table="stats" Apr 22 18:36:14.350800 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344711 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 18:36:14.350800 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344714 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 18:36:14.350800 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344717 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 18:36:14.350800 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344720 2577 flags.go:64] FLAG: --system-cgroups="" Apr 22 18:36:14.350800 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344723 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 18:36:14.350800 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344750 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 18:36:14.350800 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344754 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 22 18:36:14.350800 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344757 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 18:36:14.350800 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344762 2577 flags.go:64] FLAG: --tls-min-version="" Apr 22 18:36:14.350800 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344765 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 18:36:14.350800 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344769 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 18:36:14.350800 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344776 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 18:36:14.350800 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344780 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 18:36:14.350800 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344783 2577 flags.go:64] FLAG: --v="2" Apr 22 18:36:14.350800 ip-10-0-132-204 kubenswrapper[2577]: I0422 
18:36:14.344787 2577 flags.go:64] FLAG: --version="false" Apr 22 18:36:14.350800 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344792 2577 flags.go:64] FLAG: --vmodule="" Apr 22 18:36:14.350800 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344797 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 18:36:14.350800 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.344801 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 18:36:14.350800 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344897 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:36:14.350800 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344900 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:36:14.351381 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344904 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:36:14.351381 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344907 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:36:14.351381 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344910 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:36:14.351381 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344914 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:36:14.351381 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344919 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:36:14.351381 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344922 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:36:14.351381 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344925 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:36:14.351381 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344928 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:36:14.351381 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344930 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:36:14.351381 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344933 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:36:14.351381 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344937 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 18:36:14.351381 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344940 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:36:14.351381 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344944 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:36:14.351381 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344947 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:36:14.351381 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344950 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:36:14.351381 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344952 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:36:14.351381 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344955 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:36:14.351381 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344958 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:36:14.351381 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344962 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:36:14.351863 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344965 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:36:14.351863 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344967 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:36:14.351863 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344970 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:36:14.351863 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344973 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:36:14.351863 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344978 2577 
feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:36:14.351863 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344980 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:36:14.351863 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344983 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:36:14.351863 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344986 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:36:14.351863 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344989 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:36:14.351863 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344992 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:36:14.351863 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344994 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:36:14.351863 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344997 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:36:14.351863 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.344999 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:36:14.351863 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345003 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:36:14.351863 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345006 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:36:14.351863 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345009 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:36:14.351863 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345012 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:36:14.351863 ip-10-0-132-204 kubenswrapper[2577]: 
W0422 18:36:14.345014 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:36:14.351863 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345017 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:36:14.351863 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345019 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:36:14.352356 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345022 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:36:14.352356 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345025 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:36:14.352356 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345027 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:36:14.352356 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345030 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:36:14.352356 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345033 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:36:14.352356 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345035 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:36:14.352356 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345038 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:36:14.352356 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345040 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:36:14.352356 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345043 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:36:14.352356 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345045 2577 feature_gate.go:328] unrecognized feature gate: 
AlibabaPlatform Apr 22 18:36:14.352356 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345048 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:36:14.352356 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345050 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:36:14.352356 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345053 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:36:14.352356 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345055 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:36:14.352356 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345058 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:36:14.352356 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345060 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:36:14.352356 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345064 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:36:14.352356 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345067 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:36:14.352356 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345070 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:36:14.353108 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345072 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:36:14.353108 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345075 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:36:14.353108 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345078 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:36:14.353108 ip-10-0-132-204 
kubenswrapper[2577]: W0422 18:36:14.345080 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:36:14.353108 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345083 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:36:14.353108 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345085 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:36:14.353108 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345089 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:36:14.353108 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345092 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:36:14.353108 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345095 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:36:14.353108 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345097 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:36:14.353108 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345100 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:36:14.353108 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345102 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:36:14.353108 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345105 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:36:14.353108 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345107 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:36:14.353108 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345110 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:36:14.353108 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345112 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:36:14.353108 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345115 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:36:14.353108 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345118 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:36:14.353108 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345120 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:36:14.353108 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345123 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:36:14.353983 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345125 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:36:14.353983 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345128 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:36:14.353983 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345130 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:36:14.353983 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345133 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:36:14.353983 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345136 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:36:14.353983 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.345139 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:36:14.353983 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.346007 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:36:14.355160 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.355138 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 18:36:14.355160 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.355161 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 18:36:14.355249 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355213 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:36:14.355249 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355218 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:36:14.355249 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355222 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:36:14.355249 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355225 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:36:14.355249 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355229 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:36:14.355249 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355232 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:36:14.355249 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355234 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:36:14.355249 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355237 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:36:14.355249 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355240 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:36:14.355249 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355242 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:36:14.355249 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355245 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:36:14.355249 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355248 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:36:14.355249 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355250 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:36:14.355249 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355253 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:36:14.355249 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355256 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:36:14.355611 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355259 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:36:14.355611 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355262 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:36:14.355611 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355265 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:36:14.355611 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355268 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:36:14.355611 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355270 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:36:14.355611 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355273 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:36:14.355611 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355276 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:36:14.355611 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355278 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:36:14.355611 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355281 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:36:14.355611 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355284 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:36:14.355611 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355286 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:36:14.355611 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355289 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:36:14.355611 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355291 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:36:14.355611 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355295 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:36:14.355611 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355297 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:36:14.355611 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355300 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:36:14.355611 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355302 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:36:14.355611 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355306 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:36:14.355611 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355308 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:36:14.355611 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355311 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:36:14.356111 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355314 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:36:14.356111 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355316 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:36:14.356111 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355319 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:36:14.356111 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355321 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:36:14.356111 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355324 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:36:14.356111 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355327 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:36:14.356111 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355329 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:36:14.356111 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355332 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:36:14.356111 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355335 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:36:14.356111 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355338 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:36:14.356111 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355340 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:36:14.356111 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355343 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:36:14.356111 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355345 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:36:14.356111 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355349 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:36:14.356111 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355351 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:36:14.356111 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355354 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:36:14.356111 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355357 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:36:14.356111 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355360 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:36:14.356111 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355362 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:36:14.356111 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355365 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:36:14.356591 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355368 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:36:14.356591 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355370 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:36:14.356591 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355373 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:36:14.356591 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355376 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:36:14.356591 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355378 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:36:14.356591 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355381 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:36:14.356591 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355383 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:36:14.356591 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355386 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:36:14.356591 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355388 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:36:14.356591 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355391 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:36:14.356591 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355394 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:36:14.356591 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355396 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:36:14.356591 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355398 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:36:14.356591 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355401 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:36:14.356591 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355404 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:36:14.356591 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355408 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:36:14.356591 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355412 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:36:14.356591 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355415 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:36:14.356591 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355420 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:36:14.357060 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355424 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:36:14.357060 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355427 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:36:14.357060 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355431 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:36:14.357060 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355434 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:36:14.357060 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355437 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:36:14.357060 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355440 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:36:14.357060 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355443 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:36:14.357060 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355446 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:36:14.357060 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355449 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:36:14.357060 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355451 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:36:14.357060 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355454 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:36:14.357060 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355457 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:36:14.357060 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.355462 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:36:14.357060 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355558 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:36:14.357060 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355563 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:36:14.357060 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355566 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:36:14.357456 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355569 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:36:14.357456 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355573 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:36:14.357456 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355576 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:36:14.357456 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355579 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:36:14.357456 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355582 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:36:14.357456 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355584 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:36:14.357456 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355587 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:36:14.357456 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355590 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:36:14.357456 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355593 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:36:14.357456 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355595 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:36:14.357456 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355598 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:36:14.357456 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355601 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:36:14.357456 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355603 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:36:14.357456 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355606 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:36:14.357456 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355608 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:36:14.357456 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355611 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:36:14.357456 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355614 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:36:14.357456 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355617 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:36:14.357456 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355620 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:36:14.357456 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355624 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:36:14.357948 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355627 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:36:14.357948 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355630 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:36:14.357948 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355632 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:36:14.357948 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355635 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:36:14.357948 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355638 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:36:14.357948 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355641 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:36:14.357948 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355644 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:36:14.357948 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355663 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:36:14.357948 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355667 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:36:14.357948 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355669 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:36:14.357948 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355672 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:36:14.357948 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355674 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:36:14.357948 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355677 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:36:14.357948 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355680 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:36:14.357948 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355683 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:36:14.357948 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355687 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:36:14.357948 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355691 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:36:14.357948 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355694 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:36:14.357948 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355697 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:36:14.358415 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355700 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:36:14.358415 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355703 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:36:14.358415 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355706 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:36:14.358415 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355709 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:36:14.358415 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355711 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:36:14.358415 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355714 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:36:14.358415 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355716 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:36:14.358415 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355719 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:36:14.358415 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355722 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:36:14.358415 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355725 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:36:14.358415 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355727 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:36:14.358415 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355730 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:36:14.358415 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355733 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:36:14.358415 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355736 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:36:14.358415 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355738 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:36:14.358415 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355742 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:36:14.358415 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355744 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:36:14.358415 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355747 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:36:14.358415 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355750 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:36:14.358415 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355752 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:36:14.358994 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355755 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:36:14.358994 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355757 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:36:14.358994 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355760 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:36:14.358994 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355762 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:36:14.358994 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355765 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:36:14.358994 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355767 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:36:14.358994 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355770 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:36:14.358994 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355773 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:36:14.358994 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355777 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:36:14.358994 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355780 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:36:14.358994 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355783 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:36:14.358994 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355786 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:36:14.358994 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355788 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:36:14.358994 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355791 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:36:14.358994 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355794 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:36:14.358994 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355796 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:36:14.358994 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355799 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:36:14.358994 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355801 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:36:14.358994 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355804 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:36:14.358994 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355807 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:36:14.359472 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355809 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:36:14.359472 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355812 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:36:14.359472 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355814 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:36:14.359472 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:14.355817 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:36:14.359472 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.355822 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:36:14.359472 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.356582 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 18:36:14.360115 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.360100 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 18:36:14.361568 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.361555 2577 server.go:1019] "Starting client certificate rotation"
Apr 22 18:36:14.361681 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.361666 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:36:14.361719 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.361706 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:36:14.387345 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.387322 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:36:14.391491 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.391467 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:36:14.408093 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.408075 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 22 18:36:14.414722 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.414706 2577 log.go:25] "Validated CRI v1 image API"
Apr 22 18:36:14.416101 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.416084 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 18:36:14.423207 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.423186 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:36:14.424556 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.424531 2577 fs.go:135] Filesystem UUIDs: map[23708a5f-17dd-4c11-9481-fb072ff55161:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 a6aab00c-45c8-45bd-8871-b7a4a3450d16:/dev/nvme0n1p4]
Apr 22 18:36:14.424665 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.424554 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 18:36:14.430891 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.430759 2577 manager.go:217] Machine: {Timestamp:2026-04-22 18:36:14.428543617 +0000 UTC m=+0.454157275 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099743 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec29f4f7b45d1d1a02b6be76d283a994 SystemUUID:ec29f4f7-b45d-1d1a-02b6-be76d283a994 BootID:bbee1c90-963f-4bc5-808e-6b9690122dd9 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:3a:c1:e2:6e:53 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:3a:c1:e2:6e:53 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ca:64:9f:a5:5f:f9 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 18:36:14.430891 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.430879 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 18:36:14.431071 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.431016 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 18:36:14.433839 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.433802 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 18:36:14.434017 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.433840 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config"
nodeConfig={"NodeName":"ip-10-0-132-204.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 18:36:14.434102 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.434033 2577 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 18:36:14.434102 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.434046 2577 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 18:36:14.434102 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.434065 
2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:36:14.434980 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.434967 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:36:14.436887 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.436873 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:36:14.437021 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.437010 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 18:36:14.439820 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.439808 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 22 18:36:14.439879 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.439828 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 18:36:14.439879 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.439845 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 18:36:14.439879 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.439859 2577 kubelet.go:397] "Adding apiserver pod source" Apr 22 18:36:14.439879 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.439872 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 18:36:14.441098 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.441083 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:36:14.441175 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.441106 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:36:14.444825 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.444805 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 18:36:14.447294 ip-10-0-132-204 
kubenswrapper[2577]: I0422 18:36:14.447280 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 18:36:14.448291 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.448273 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hb8gx" Apr 22 18:36:14.448851 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.448836 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 18:36:14.448851 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.448854 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 18:36:14.448986 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.448860 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 18:36:14.448986 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.448866 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 18:36:14.448986 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.448872 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 18:36:14.448986 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.448878 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 18:36:14.448986 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.448884 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 18:36:14.448986 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.448902 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 18:36:14.448986 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.448909 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 18:36:14.448986 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.448916 2577 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 18:36:14.448986 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.448931 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 18:36:14.448986 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.448941 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 18:36:14.450876 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.450865 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 18:36:14.450876 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.450875 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 18:36:14.454585 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.454571 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 18:36:14.454665 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.454607 2577 server.go:1295] "Started kubelet" Apr 22 18:36:14.454792 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.454741 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 18:36:14.454825 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.454817 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 18:36:14.455057 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.455027 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 18:36:14.455640 ip-10-0-132-204 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 18:36:14.456875 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.456855 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 18:36:14.457236 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.457222 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 22 18:36:14.457594 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.457572 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hb8gx" Apr 22 18:36:14.457861 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.457842 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-132-204.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 18:36:14.457949 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:14.457868 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 18:36:14.458007 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:14.457955 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-204.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 18:36:14.464126 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:14.462687 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{ip-10-0-132-204.ec2.internal.18a8c1a6a3ce0f2b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-204.ec2.internal,UID:ip-10-0-132-204.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-132-204.ec2.internal,},FirstTimestamp:2026-04-22 18:36:14.454583083 +0000 UTC m=+0.480196742,LastTimestamp:2026-04-22 18:36:14.454583083 +0000 UTC m=+0.480196742,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-204.ec2.internal,}" Apr 22 18:36:14.465821 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.465797 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 18:36:14.466441 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.466418 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 18:36:14.467811 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.467789 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 18:36:14.468281 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.468119 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 18:36:14.468369 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.468292 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 18:36:14.468552 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.468529 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 22 18:36:14.468552 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.468551 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 22 18:36:14.468693 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.468642 2577 factory.go:55] Registering systemd factory Apr 22 18:36:14.468693 ip-10-0-132-204 kubenswrapper[2577]: I0422 
18:36:14.468680 2577 factory.go:223] Registration of the systemd container factory successfully Apr 22 18:36:14.468870 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:14.468848 2577 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 18:36:14.469129 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.469033 2577 factory.go:153] Registering CRI-O factory Apr 22 18:36:14.469129 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.469049 2577 factory.go:223] Registration of the crio container factory successfully Apr 22 18:36:14.469129 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.469111 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 18:36:14.469305 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.469144 2577 factory.go:103] Registering Raw factory Apr 22 18:36:14.469305 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.469162 2577 manager.go:1196] Started watching for new ooms in manager Apr 22 18:36:14.469477 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:14.469458 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-204.ec2.internal\" not found" Apr 22 18:36:14.469599 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.469585 2577 manager.go:319] Starting recovery of all containers Apr 22 18:36:14.479664 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.479473 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:36:14.479771 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.479715 2577 manager.go:324] Recovery completed Apr 22 18:36:14.484371 ip-10-0-132-204 kubenswrapper[2577]: I0422 
18:36:14.484354 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:36:14.486777 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:14.486756 2577 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-132-204.ec2.internal\" not found" node="ip-10-0-132-204.ec2.internal" Apr 22 18:36:14.487017 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.487001 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-204.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:36:14.487095 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.487036 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-204.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:14.487095 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.487051 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-204.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:36:14.487611 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.487598 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 18:36:14.487680 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.487612 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 18:36:14.487680 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.487634 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:36:14.490485 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.490469 2577 policy_none.go:49] "None policy: Start" Apr 22 18:36:14.490546 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.490493 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 18:36:14.490546 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.490507 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 22 18:36:14.527332 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.527314 2577 manager.go:341] "Starting 
Device Plugin manager" Apr 22 18:36:14.540451 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:14.527361 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 18:36:14.540451 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.527375 2577 server.go:85] "Starting device plugin registration server" Apr 22 18:36:14.540451 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.527621 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 18:36:14.540451 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.527634 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 18:36:14.540451 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.527774 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 18:36:14.540451 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.527851 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 18:36:14.540451 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.527860 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 18:36:14.540451 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:14.528397 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 18:36:14.540451 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:14.528441 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-204.ec2.internal\" not found" Apr 22 18:36:14.597069 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.596996 2577 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 22 18:36:14.598245 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.598218 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 18:36:14.598245 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.598248 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 18:36:14.598422 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.598268 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 18:36:14.598422 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.598275 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 18:36:14.598422 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:14.598306 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 18:36:14.601036 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.601018 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:36:14.628569 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.628542 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:36:14.629724 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.629704 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-204.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:36:14.629818 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.629736 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-204.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:14.629818 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.629748 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-204.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:36:14.629818 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.629772 2577 
kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-204.ec2.internal" Apr 22 18:36:14.640660 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.640633 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-204.ec2.internal" Apr 22 18:36:14.640733 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:14.640669 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-204.ec2.internal\": node \"ip-10-0-132-204.ec2.internal\" not found" Apr 22 18:36:14.660556 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:14.660534 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-204.ec2.internal\" not found" Apr 22 18:36:14.698700 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.698663 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-204.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-204.ec2.internal"] Apr 22 18:36:14.698776 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.698757 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:36:14.699818 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.699792 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-204.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:36:14.699932 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.699824 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-204.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:14.699932 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.699834 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-204.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:36:14.701011 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.700997 2577 kubelet_node_status.go:413] "Setting 
node annotation to enable volume controller attach/detach" Apr 22 18:36:14.701207 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.701191 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-204.ec2.internal" Apr 22 18:36:14.701285 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.701228 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:36:14.701755 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.701741 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-204.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:36:14.701838 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.701769 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-204.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:14.701838 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.701784 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-204.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:36:14.701838 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.701787 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-204.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:36:14.701838 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.701805 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-204.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:14.701838 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.701825 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-204.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:36:14.702983 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.702966 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-204.ec2.internal" Apr 22 18:36:14.703034 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.703002 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:36:14.703972 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.703957 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-204.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:36:14.704042 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.703977 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-204.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:14.704042 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.703987 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-204.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:36:14.730478 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:14.730454 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-204.ec2.internal\" not found" node="ip-10-0-132-204.ec2.internal" Apr 22 18:36:14.734810 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:14.734794 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-204.ec2.internal\" not found" node="ip-10-0-132-204.ec2.internal" Apr 22 18:36:14.761211 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:14.761185 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-204.ec2.internal\" not found" Apr 22 18:36:14.770063 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.770033 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ffa9aef1222b3a9934de39cb15b8e512-config\") pod 
\"kube-apiserver-proxy-ip-10-0-132-204.ec2.internal\" (UID: \"ffa9aef1222b3a9934de39cb15b8e512\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-204.ec2.internal"
Apr 22 18:36:14.770166 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.770067 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/24bede82695569fa794abd13553ca4ea-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-204.ec2.internal\" (UID: \"24bede82695569fa794abd13553ca4ea\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-204.ec2.internal"
Apr 22 18:36:14.770166 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.770086 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/24bede82695569fa794abd13553ca4ea-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-204.ec2.internal\" (UID: \"24bede82695569fa794abd13553ca4ea\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-204.ec2.internal"
Apr 22 18:36:14.861494 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:14.861420 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-204.ec2.internal\" not found"
Apr 22 18:36:14.870759 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.870740 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/24bede82695569fa794abd13553ca4ea-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-204.ec2.internal\" (UID: \"24bede82695569fa794abd13553ca4ea\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-204.ec2.internal"
Apr 22 18:36:14.870835 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.870763 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/24bede82695569fa794abd13553ca4ea-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-204.ec2.internal\" (UID: \"24bede82695569fa794abd13553ca4ea\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-204.ec2.internal"
Apr 22 18:36:14.870835 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.870784 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ffa9aef1222b3a9934de39cb15b8e512-config\") pod \"kube-apiserver-proxy-ip-10-0-132-204.ec2.internal\" (UID: \"ffa9aef1222b3a9934de39cb15b8e512\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-204.ec2.internal"
Apr 22 18:36:14.870835 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.870822 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ffa9aef1222b3a9934de39cb15b8e512-config\") pod \"kube-apiserver-proxy-ip-10-0-132-204.ec2.internal\" (UID: \"ffa9aef1222b3a9934de39cb15b8e512\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-204.ec2.internal"
Apr 22 18:36:14.871331 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.871124 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/24bede82695569fa794abd13553ca4ea-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-204.ec2.internal\" (UID: \"24bede82695569fa794abd13553ca4ea\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-204.ec2.internal"
Apr 22 18:36:14.871331 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:14.871130 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/24bede82695569fa794abd13553ca4ea-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-204.ec2.internal\" (UID: \"24bede82695569fa794abd13553ca4ea\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-204.ec2.internal"
Apr 22 18:36:14.961581 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:14.961545 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-204.ec2.internal\" not found"
Apr 22 18:36:15.032006 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.031962 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-204.ec2.internal"
Apr 22 18:36:15.036513 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.036489 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-204.ec2.internal"
Apr 22 18:36:15.062491 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:15.062451 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-204.ec2.internal\" not found"
Apr 22 18:36:15.163062 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:15.162971 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-204.ec2.internal\" not found"
Apr 22 18:36:15.263570 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:15.263539 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-204.ec2.internal\" not found"
Apr 22 18:36:15.280891 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.280867 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:36:15.313436 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.313406 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:36:15.361012 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.360984 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 18:36:15.361575 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.361128 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 18:36:15.361575 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.361159 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 18:36:15.361575 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.361162 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 18:36:15.361575 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.361164 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 18:36:15.369408 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.369383 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-204.ec2.internal"
Apr 22 18:36:15.393105 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.393084 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 18:36:15.394125 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.394111 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-204.ec2.internal"
Apr 22 18:36:15.406903 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.406870 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 18:36:15.441090 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.440998 2577 apiserver.go:52] "Watching apiserver"
Apr 22 18:36:15.449097 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.449071 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 18:36:15.450159 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.450136 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-w5xvx","openshift-cluster-node-tuning-operator/tuned-l256r","openshift-multus/network-metrics-daemon-9t8m2","openshift-network-diagnostics/network-check-target-dt6rs","kube-system/kube-apiserver-proxy-ip-10-0-132-204.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xr2dj","openshift-image-registry/node-ca-fgx5l","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-204.ec2.internal","openshift-multus/multus-2qq9r","openshift-multus/multus-additional-cni-plugins-knnz5","openshift-network-operator/iptables-alerter-xhp7q","openshift-ovn-kubernetes/ovnkube-node-r65js"]
Apr 22 18:36:15.452737 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.452718 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-w5xvx"
Apr 22 18:36:15.454205 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.454186 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-l256r"
Apr 22 18:36:15.454317 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.454299 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9t8m2"
Apr 22 18:36:15.454426 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:15.454403 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9t8m2" podUID="617c16ac-507f-45a2-ab75-d583c7798ca1"
Apr 22 18:36:15.455533 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.455517 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dt6rs"
Apr 22 18:36:15.455593 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:15.455574 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dt6rs" podUID="a97e9a8d-908d-409a-b079-4fee4c52cdcd"
Apr 22 18:36:15.455884 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.455870 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 18:36:15.455935 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.455883 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-fprjx\""
Apr 22 18:36:15.455966 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.455884 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 18:36:15.456896 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.456877 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xr2dj"
Apr 22 18:36:15.457181 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.457017 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 18:36:15.457181 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.457053 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:36:15.457311 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.457256 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-lqdrk\""
Apr 22 18:36:15.458237 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.458210 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fgx5l"
Apr 22 18:36:15.459146 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.459127 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 18:36:15.459419 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.459403 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2qq9r"
Apr 22 18:36:15.460781 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.460763 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-knnz5"
Apr 22 18:36:15.461894 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.461867 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:31:14 +0000 UTC" deadline="2027-12-01 21:14:43.898763946 +0000 UTC"
Apr 22 18:36:15.461957 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.461895 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14114h38m28.436872251s"
Apr 22 18:36:15.462364 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.462352 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xhp7q"
Apr 22 18:36:15.463640 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.463615 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 18:36:15.466301 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.463787 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-mgxxx\""
Apr 22 18:36:15.466301 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.463856 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r65js"
Apr 22 18:36:15.466301 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.463886 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 18:36:15.466301 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.464218 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 18:36:15.466301 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.464742 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 18:36:15.466301 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.465086 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 18:36:15.466301 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.465346 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 18:36:15.466301 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.465403 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 18:36:15.466301 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.465430 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-6r4sp\""
Apr 22 18:36:15.466301 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.465705 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 18:36:15.466301 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.465754 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 18:36:15.466301 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.465762 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-dht7c\""
Apr 22 18:36:15.466301 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.465888 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 18:36:15.466301 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.466120 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-6fjr7\""
Apr 22 18:36:15.466301 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.466129 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 18:36:15.466301 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.466122 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:36:15.467392 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.466758 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 18:36:15.467392 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.466935 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 18:36:15.467392 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.467101 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 18:36:15.467392 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.467285 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-m6fn9\""
Apr 22 18:36:15.467932 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.467769 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 18:36:15.468746 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.468726 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 18:36:15.468844 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.468733 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 18:36:15.468844 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.468751 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 18:36:15.468951 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.468918 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-wcfs2\""
Apr 22 18:36:15.470174 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.470155 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 18:36:15.470174 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.470172 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 18:36:15.471394 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.471376 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 18:36:15.475228 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475204 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-etc-sysctl-conf\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r"
Apr 22 18:36:15.475329 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475245 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5pgq\" (UniqueName: \"kubernetes.io/projected/66e01385-0150-4908-a5b3-deafda8e4e26-kube-api-access-t5pgq\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r"
Apr 22 18:36:15.475329 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475273 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwb9r\" (UniqueName: \"kubernetes.io/projected/e922d609-5d83-4221-8067-2166cabc52db-kube-api-access-fwb9r\") pod \"aws-ebs-csi-driver-node-xr2dj\" (UID: \"e922d609-5d83-4221-8067-2166cabc52db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xr2dj"
Apr 22 18:36:15.475329 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475309 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-cnibin\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r"
Apr 22 18:36:15.475448 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475338 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-var-lib-openvswitch\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js"
Apr 22 18:36:15.475448 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475363 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-host-cni-bin\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js"
Apr 22 18:36:15.475448 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475409 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-host\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r"
Apr 22 18:36:15.475448 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475436 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5fcd2fa1-b6e0-4480-a123-665c7fabd2bf-host\") pod \"node-ca-fgx5l\" (UID: \"5fcd2fa1-b6e0-4480-a123-665c7fabd2bf\") " pod="openshift-image-registry/node-ca-fgx5l"
Apr 22 18:36:15.475587 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475456 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1ca4dacd-dae4-4a9c-a9f6-96d152edbeac-os-release\") pod \"multus-additional-cni-plugins-knnz5\" (UID: \"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac\") " pod="openshift-multus/multus-additional-cni-plugins-knnz5"
Apr 22 18:36:15.475587 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475494 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px4qk\" (UniqueName: \"kubernetes.io/projected/1ca4dacd-dae4-4a9c-a9f6-96d152edbeac-kube-api-access-px4qk\") pod \"multus-additional-cni-plugins-knnz5\" (UID: \"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac\") " pod="openshift-multus/multus-additional-cni-plugins-knnz5"
Apr 22 18:36:15.475587 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475511 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-etc-kubernetes\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r"
Apr 22 18:36:15.475587 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475525 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-var-lib-kubelet\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r"
Apr 22 18:36:15.475587 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475542 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnbzs\" (UniqueName: \"kubernetes.io/projected/a97e9a8d-908d-409a-b079-4fee4c52cdcd-kube-api-access-dnbzs\") pod \"network-check-target-dt6rs\" (UID: \"a97e9a8d-908d-409a-b079-4fee4c52cdcd\") " pod="openshift-network-diagnostics/network-check-target-dt6rs"
Apr 22 18:36:15.475587 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475558 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/97302881-ab30-4630-9df7-e7796d6aaedf-ovnkube-config\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js"
Apr 22 18:36:15.475587 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475575 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e922d609-5d83-4221-8067-2166cabc52db-registration-dir\") pod \"aws-ebs-csi-driver-node-xr2dj\" (UID: \"e922d609-5d83-4221-8067-2166cabc52db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xr2dj"
Apr 22 18:36:15.475860 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475589 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-host-run-ovn-kubernetes\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js"
Apr 22 18:36:15.475860 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475605 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js"
Apr 22 18:36:15.475860 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475622 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/97302881-ab30-4630-9df7-e7796d6aaedf-ovn-node-metrics-cert\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js"
Apr 22 18:36:15.475860 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475666 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-etc-sysconfig\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r"
Apr 22 18:36:15.475860 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475692 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/66e01385-0150-4908-a5b3-deafda8e4e26-tmp\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r"
Apr 22 18:36:15.475860 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475706 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-multus-cni-dir\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r"
Apr 22 18:36:15.475860 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475721 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1ca4dacd-dae4-4a9c-a9f6-96d152edbeac-tuning-conf-dir\") pod \"multus-additional-cni-plugins-knnz5\" (UID: \"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac\") " pod="openshift-multus/multus-additional-cni-plugins-knnz5"
Apr 22 18:36:15.475860 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475736 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-log-socket\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js"
Apr 22 18:36:15.475860 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475750 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/60e1bb1d-071b-40c6-aeca-a00ab7fbb348-konnectivity-ca\") pod \"konnectivity-agent-w5xvx\" (UID: \"60e1bb1d-071b-40c6-aeca-a00ab7fbb348\") " pod="kube-system/konnectivity-agent-w5xvx"
Apr 22 18:36:15.475860 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475764 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-sys\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r"
Apr 22 18:36:15.475860 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475817 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-cni-binary-copy\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r"
Apr 22 18:36:15.475860 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475852 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-host-run-k8s-cni-cncf-io\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r"
Apr 22 18:36:15.476402 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475884 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1ca4dacd-dae4-4a9c-a9f6-96d152edbeac-cni-binary-copy\") pod \"multus-additional-cni-plugins-knnz5\" (UID: \"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac\") " pod="openshift-multus/multus-additional-cni-plugins-knnz5"
Apr 22 18:36:15.476402 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475911 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-host-run-netns\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js"
Apr 22 18:36:15.476402 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475954 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/617c16ac-507f-45a2-ab75-d583c7798ca1-metrics-certs\") pod \"network-metrics-daemon-9t8m2\" (UID: \"617c16ac-507f-45a2-ab75-d583c7798ca1\") " pod="openshift-multus/network-metrics-daemon-9t8m2"
Apr 22 18:36:15.476402 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.475979 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-run-openvswitch\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js"
Apr 22 18:36:15.476402 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476007 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-host-slash\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js"
Apr 22 18:36:15.476402 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476031 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/66e01385-0150-4908-a5b3-deafda8e4e26-etc-tuned\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r"
Apr 22 18:36:15.476402 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476056 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b6d9430f-dd4a-4cf0-9c59-1a20bf462166-host-slash\") pod \"iptables-alerter-xhp7q\" (UID: \"b6d9430f-dd4a-4cf0-9c59-1a20bf462166\") " pod="openshift-network-operator/iptables-alerter-xhp7q"
Apr 22 18:36:15.476402 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476096 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e922d609-5d83-4221-8067-2166cabc52db-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xr2dj\" (UID: \"e922d609-5d83-4221-8067-2166cabc52db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xr2dj"
Apr 22 18:36:15.476402 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476120 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-multus-socket-dir-parent\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r"
Apr 22 18:36:15.476402 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476144 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-host-run-netns\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r"
Apr 22 18:36:15.476402 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476168 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-multus-conf-dir\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r"
Apr 22 18:36:15.476402 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476191 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/97302881-ab30-4630-9df7-e7796d6aaedf-ovnkube-script-lib\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js"
Apr 22 18:36:15.476402 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476215 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/60e1bb1d-071b-40c6-aeca-a00ab7fbb348-agent-certs\") pod \"konnectivity-agent-w5xvx\" (UID: \"60e1bb1d-071b-40c6-aeca-a00ab7fbb348\") " pod="kube-system/konnectivity-agent-w5xvx"
Apr 22 18:36:15.476402 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476247 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b6d9430f-dd4a-4cf0-9c59-1a20bf462166-iptables-alerter-script\") pod \"iptables-alerter-xhp7q\" (UID: \"b6d9430f-dd4a-4cf0-9c59-1a20bf462166\") " pod="openshift-network-operator/iptables-alerter-xhp7q"
Apr 22 18:36:15.476402 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476286 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e922d609-5d83-4221-8067-2166cabc52db-device-dir\") pod \"aws-ebs-csi-driver-node-xr2dj\" (UID: \"e922d609-5d83-4221-8067-2166cabc52db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xr2dj"
Apr 22 18:36:15.476402 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476311 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46mzg\" (UniqueName: \"kubernetes.io/projected/5fcd2fa1-b6e0-4480-a123-665c7fabd2bf-kube-api-access-46mzg\") pod \"node-ca-fgx5l\" (UID: \"5fcd2fa1-b6e0-4480-a123-665c7fabd2bf\") " pod="openshift-image-registry/node-ca-fgx5l"
Apr 22 18:36:15.477068 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476335 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-os-release\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r"
Apr 22 18:36:15.477068 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476373 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-multus-daemon-config\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r"
Apr 22 18:36:15.477068 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476416 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ca4dacd-dae4-4a9c-a9f6-96d152edbeac-system-cni-dir\") pod \"multus-additional-cni-plugins-knnz5\" (UID: \"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac\") "
pod="openshift-multus/multus-additional-cni-plugins-knnz5" Apr 22 18:36:15.477068 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476437 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-run-ovn\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.477068 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476457 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-node-log\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.477068 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476478 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-etc-openvswitch\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.477068 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476507 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-run\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.477068 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476542 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvhjb\" (UniqueName: 
\"kubernetes.io/projected/b6d9430f-dd4a-4cf0-9c59-1a20bf462166-kube-api-access-dvhjb\") pod \"iptables-alerter-xhp7q\" (UID: \"b6d9430f-dd4a-4cf0-9c59-1a20bf462166\") " pod="openshift-network-operator/iptables-alerter-xhp7q" Apr 22 18:36:15.477068 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476599 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e922d609-5d83-4221-8067-2166cabc52db-sys-fs\") pod \"aws-ebs-csi-driver-node-xr2dj\" (UID: \"e922d609-5d83-4221-8067-2166cabc52db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xr2dj" Apr 22 18:36:15.477068 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476626 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-host-var-lib-cni-multus\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.477068 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476720 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1ca4dacd-dae4-4a9c-a9f6-96d152edbeac-cnibin\") pod \"multus-additional-cni-plugins-knnz5\" (UID: \"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac\") " pod="openshift-multus/multus-additional-cni-plugins-knnz5" Apr 22 18:36:15.477068 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476748 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-run-systemd\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.477068 ip-10-0-132-204 kubenswrapper[2577]: I0422 
18:36:15.476779 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-hostroot\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.477068 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476808 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-systemd-units\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.477068 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476846 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-lib-modules\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.477068 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476896 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-host-var-lib-kubelet\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.477603 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476914 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1ca4dacd-dae4-4a9c-a9f6-96d152edbeac-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-knnz5\" (UID: \"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac\") " 
pod="openshift-multus/multus-additional-cni-plugins-knnz5" Apr 22 18:36:15.477603 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476932 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97sz2\" (UniqueName: \"kubernetes.io/projected/97302881-ab30-4630-9df7-e7796d6aaedf-kube-api-access-97sz2\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.477603 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.476966 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-etc-modprobe-d\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.477603 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.477002 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-etc-systemd\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.477603 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.477029 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-host-kubelet\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.477603 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.477056 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-etc-sysctl-d\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.477603 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.477100 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e922d609-5d83-4221-8067-2166cabc52db-socket-dir\") pod \"aws-ebs-csi-driver-node-xr2dj\" (UID: \"e922d609-5d83-4221-8067-2166cabc52db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xr2dj" Apr 22 18:36:15.477603 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.477128 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-host-var-lib-cni-bin\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.477603 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.477162 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-host-cni-netd\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.477603 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.477181 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/97302881-ab30-4630-9df7-e7796d6aaedf-env-overrides\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.477603 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.477202 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv5t8\" (UniqueName: \"kubernetes.io/projected/617c16ac-507f-45a2-ab75-d583c7798ca1-kube-api-access-zv5t8\") pod \"network-metrics-daemon-9t8m2\" (UID: \"617c16ac-507f-45a2-ab75-d583c7798ca1\") " pod="openshift-multus/network-metrics-daemon-9t8m2" Apr 22 18:36:15.477603 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.477226 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-system-cni-dir\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.477603 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.477250 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1ca4dacd-dae4-4a9c-a9f6-96d152edbeac-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-knnz5\" (UID: \"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac\") " pod="openshift-multus/multus-additional-cni-plugins-knnz5" Apr 22 18:36:15.477603 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.477301 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e922d609-5d83-4221-8067-2166cabc52db-etc-selinux\") pod \"aws-ebs-csi-driver-node-xr2dj\" (UID: \"e922d609-5d83-4221-8067-2166cabc52db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xr2dj" Apr 22 18:36:15.477603 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.477317 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5fcd2fa1-b6e0-4480-a123-665c7fabd2bf-serviceca\") pod \"node-ca-fgx5l\" (UID: 
\"5fcd2fa1-b6e0-4480-a123-665c7fabd2bf\") " pod="openshift-image-registry/node-ca-fgx5l" Apr 22 18:36:15.477603 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.477343 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-host-run-multus-certs\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.478285 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.477369 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-etc-kubernetes\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.478285 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.477388 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwcdf\" (UniqueName: \"kubernetes.io/projected/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-kube-api-access-lwcdf\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.483711 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.483688 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:36:15.507712 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.507682 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-h6j7r" Apr 22 18:36:15.515967 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.515927 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-h6j7r" 
Apr 22 18:36:15.568398 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:15.568357 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffa9aef1222b3a9934de39cb15b8e512.slice/crio-d40d553c42775d6890d6c5d27d896e98d1dd11bb6ac8403c2e8b7664e9037590 WatchSource:0}: Error finding container d40d553c42775d6890d6c5d27d896e98d1dd11bb6ac8403c2e8b7664e9037590: Status 404 returned error can't find the container with id d40d553c42775d6890d6c5d27d896e98d1dd11bb6ac8403c2e8b7664e9037590 Apr 22 18:36:15.569195 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:15.569151 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24bede82695569fa794abd13553ca4ea.slice/crio-44c47a8ba506e40f3f2fad478ac1368a314bbd10c538249e59232a2e402b81f3 WatchSource:0}: Error finding container 44c47a8ba506e40f3f2fad478ac1368a314bbd10c538249e59232a2e402b81f3: Status 404 returned error can't find the container with id 44c47a8ba506e40f3f2fad478ac1368a314bbd10c538249e59232a2e402b81f3 Apr 22 18:36:15.572960 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.572942 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:36:15.577861 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.577833 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-multus-daemon-config\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.577994 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.577877 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ca4dacd-dae4-4a9c-a9f6-96d152edbeac-system-cni-dir\") pod 
\"multus-additional-cni-plugins-knnz5\" (UID: \"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac\") " pod="openshift-multus/multus-additional-cni-plugins-knnz5" Apr 22 18:36:15.577994 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.577902 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-run-ovn\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.577994 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.577954 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-node-log\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.577994 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.577968 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ca4dacd-dae4-4a9c-a9f6-96d152edbeac-system-cni-dir\") pod \"multus-additional-cni-plugins-knnz5\" (UID: \"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac\") " pod="openshift-multus/multus-additional-cni-plugins-knnz5" Apr 22 18:36:15.578196 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.577978 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-etc-openvswitch\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.578196 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578031 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-etc-openvswitch\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.578196 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578039 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-run\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.578196 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578050 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-run-ovn\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.578196 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578070 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-node-log\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.578196 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578076 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dvhjb\" (UniqueName: \"kubernetes.io/projected/b6d9430f-dd4a-4cf0-9c59-1a20bf462166-kube-api-access-dvhjb\") pod \"iptables-alerter-xhp7q\" (UID: \"b6d9430f-dd4a-4cf0-9c59-1a20bf462166\") " pod="openshift-network-operator/iptables-alerter-xhp7q" Apr 22 18:36:15.578196 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578104 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/e922d609-5d83-4221-8067-2166cabc52db-sys-fs\") pod \"aws-ebs-csi-driver-node-xr2dj\" (UID: \"e922d609-5d83-4221-8067-2166cabc52db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xr2dj" Apr 22 18:36:15.578196 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578112 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-run\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.578196 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578128 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-host-var-lib-cni-multus\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.578196 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578147 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e922d609-5d83-4221-8067-2166cabc52db-sys-fs\") pod \"aws-ebs-csi-driver-node-xr2dj\" (UID: \"e922d609-5d83-4221-8067-2166cabc52db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xr2dj" Apr 22 18:36:15.578196 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578154 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1ca4dacd-dae4-4a9c-a9f6-96d152edbeac-cnibin\") pod \"multus-additional-cni-plugins-knnz5\" (UID: \"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac\") " pod="openshift-multus/multus-additional-cni-plugins-knnz5" Apr 22 18:36:15.578196 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578160 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-host-var-lib-cni-multus\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.578196 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578184 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-run-systemd\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.578196 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578198 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1ca4dacd-dae4-4a9c-a9f6-96d152edbeac-cnibin\") pod \"multus-additional-cni-plugins-knnz5\" (UID: \"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac\") " pod="openshift-multus/multus-additional-cni-plugins-knnz5" Apr 22 18:36:15.578995 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578213 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-hostroot\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.578995 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578236 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-run-systemd\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.578995 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578260 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-systemd-units\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.578995 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578276 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-lib-modules\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.578995 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578291 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-host-var-lib-kubelet\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.578995 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578291 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-hostroot\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.578995 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578308 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1ca4dacd-dae4-4a9c-a9f6-96d152edbeac-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-knnz5\" (UID: \"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac\") " pod="openshift-multus/multus-additional-cni-plugins-knnz5" Apr 22 18:36:15.578995 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578323 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" 
(UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-systemd-units\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.578995 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578331 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97sz2\" (UniqueName: \"kubernetes.io/projected/97302881-ab30-4630-9df7-e7796d6aaedf-kube-api-access-97sz2\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.578995 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578351 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-host-var-lib-kubelet\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.578995 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578361 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-etc-modprobe-d\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.578995 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578429 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-lib-modules\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.578995 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578439 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" 
(UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-etc-modprobe-d\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.578995 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578435 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-etc-systemd\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.578995 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578472 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-host-kubelet\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.578995 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578485 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-etc-systemd\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.578995 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578489 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-etc-sysctl-d\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.578995 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578528 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/e922d609-5d83-4221-8067-2166cabc52db-socket-dir\") pod \"aws-ebs-csi-driver-node-xr2dj\" (UID: \"e922d609-5d83-4221-8067-2166cabc52db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xr2dj" Apr 22 18:36:15.579961 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578554 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-host-kubelet\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.579961 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578562 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-etc-sysctl-d\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.579961 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578555 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-host-var-lib-cni-bin\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.579961 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578578 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-multus-daemon-config\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.579961 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578599 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-host-cni-netd\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.579961 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578629 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/97302881-ab30-4630-9df7-e7796d6aaedf-env-overrides\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.579961 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578600 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-host-var-lib-cni-bin\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.579961 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578669 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e922d609-5d83-4221-8067-2166cabc52db-socket-dir\") pod \"aws-ebs-csi-driver-node-xr2dj\" (UID: \"e922d609-5d83-4221-8067-2166cabc52db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xr2dj" Apr 22 18:36:15.579961 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578673 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zv5t8\" (UniqueName: \"kubernetes.io/projected/617c16ac-507f-45a2-ab75-d583c7798ca1-kube-api-access-zv5t8\") pod \"network-metrics-daemon-9t8m2\" (UID: \"617c16ac-507f-45a2-ab75-d583c7798ca1\") " pod="openshift-multus/network-metrics-daemon-9t8m2" Apr 22 18:36:15.579961 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578643 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-host-cni-netd\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.579961 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578697 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-system-cni-dir\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.579961 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578715 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1ca4dacd-dae4-4a9c-a9f6-96d152edbeac-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-knnz5\" (UID: \"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac\") " pod="openshift-multus/multus-additional-cni-plugins-knnz5" Apr 22 18:36:15.579961 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578736 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e922d609-5d83-4221-8067-2166cabc52db-etc-selinux\") pod \"aws-ebs-csi-driver-node-xr2dj\" (UID: \"e922d609-5d83-4221-8067-2166cabc52db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xr2dj" Apr 22 18:36:15.579961 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578766 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5fcd2fa1-b6e0-4480-a123-665c7fabd2bf-serviceca\") pod \"node-ca-fgx5l\" (UID: \"5fcd2fa1-b6e0-4480-a123-665c7fabd2bf\") " pod="openshift-image-registry/node-ca-fgx5l" Apr 22 18:36:15.579961 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578780 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-system-cni-dir\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.579961 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578793 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-host-run-multus-certs\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.579961 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578856 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-etc-kubernetes\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.579961 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578864 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e922d609-5d83-4221-8067-2166cabc52db-etc-selinux\") pod \"aws-ebs-csi-driver-node-xr2dj\" (UID: \"e922d609-5d83-4221-8067-2166cabc52db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xr2dj" Apr 22 18:36:15.580848 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578884 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwcdf\" (UniqueName: \"kubernetes.io/projected/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-kube-api-access-lwcdf\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.580848 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578909 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-etc-sysctl-conf\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.580848 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578932 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5pgq\" (UniqueName: \"kubernetes.io/projected/66e01385-0150-4908-a5b3-deafda8e4e26-kube-api-access-t5pgq\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.580848 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578948 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1ca4dacd-dae4-4a9c-a9f6-96d152edbeac-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-knnz5\" (UID: \"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac\") " pod="openshift-multus/multus-additional-cni-plugins-knnz5" Apr 22 18:36:15.580848 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578958 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fwb9r\" (UniqueName: \"kubernetes.io/projected/e922d609-5d83-4221-8067-2166cabc52db-kube-api-access-fwb9r\") pod \"aws-ebs-csi-driver-node-xr2dj\" (UID: \"e922d609-5d83-4221-8067-2166cabc52db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xr2dj" Apr 22 18:36:15.580848 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.578986 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-cnibin\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.580848 ip-10-0-132-204 
kubenswrapper[2577]: I0422 18:36:15.578987 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-etc-kubernetes\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.580848 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579013 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-host-run-multus-certs\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.580848 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579015 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-var-lib-openvswitch\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.580848 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579050 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-host-cni-bin\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.580848 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579052 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-var-lib-openvswitch\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.580848 ip-10-0-132-204 
kubenswrapper[2577]: I0422 18:36:15.579083 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-host-cni-bin\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.580848 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579104 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-host\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.580848 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579129 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5fcd2fa1-b6e0-4480-a123-665c7fabd2bf-host\") pod \"node-ca-fgx5l\" (UID: \"5fcd2fa1-b6e0-4480-a123-665c7fabd2bf\") " pod="openshift-image-registry/node-ca-fgx5l" Apr 22 18:36:15.580848 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579135 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/97302881-ab30-4630-9df7-e7796d6aaedf-env-overrides\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.580848 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579155 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1ca4dacd-dae4-4a9c-a9f6-96d152edbeac-os-release\") pod \"multus-additional-cni-plugins-knnz5\" (UID: \"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac\") " pod="openshift-multus/multus-additional-cni-plugins-knnz5" Apr 22 18:36:15.580848 ip-10-0-132-204 kubenswrapper[2577]: I0422 
18:36:15.579129 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-etc-sysctl-conf\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.580848 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579167 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-host\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.581752 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579205 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-px4qk\" (UniqueName: \"kubernetes.io/projected/1ca4dacd-dae4-4a9c-a9f6-96d152edbeac-kube-api-access-px4qk\") pod \"multus-additional-cni-plugins-knnz5\" (UID: \"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac\") " pod="openshift-multus/multus-additional-cni-plugins-knnz5" Apr 22 18:36:15.581752 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579229 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-etc-kubernetes\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.581752 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579252 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-var-lib-kubelet\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.581752 ip-10-0-132-204 
kubenswrapper[2577]: I0422 18:36:15.579283 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnbzs\" (UniqueName: \"kubernetes.io/projected/a97e9a8d-908d-409a-b079-4fee4c52cdcd-kube-api-access-dnbzs\") pod \"network-check-target-dt6rs\" (UID: \"a97e9a8d-908d-409a-b079-4fee4c52cdcd\") " pod="openshift-network-diagnostics/network-check-target-dt6rs" Apr 22 18:36:15.581752 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579283 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5fcd2fa1-b6e0-4480-a123-665c7fabd2bf-serviceca\") pod \"node-ca-fgx5l\" (UID: \"5fcd2fa1-b6e0-4480-a123-665c7fabd2bf\") " pod="openshift-image-registry/node-ca-fgx5l" Apr 22 18:36:15.581752 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579304 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-cnibin\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.581752 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579319 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5fcd2fa1-b6e0-4480-a123-665c7fabd2bf-host\") pod \"node-ca-fgx5l\" (UID: \"5fcd2fa1-b6e0-4480-a123-665c7fabd2bf\") " pod="openshift-image-registry/node-ca-fgx5l" Apr 22 18:36:15.581752 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579347 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/97302881-ab30-4630-9df7-e7796d6aaedf-ovnkube-config\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.581752 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579361 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1ca4dacd-dae4-4a9c-a9f6-96d152edbeac-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-knnz5\" (UID: \"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac\") " pod="openshift-multus/multus-additional-cni-plugins-knnz5" Apr 22 18:36:15.581752 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579367 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e922d609-5d83-4221-8067-2166cabc52db-registration-dir\") pod \"aws-ebs-csi-driver-node-xr2dj\" (UID: \"e922d609-5d83-4221-8067-2166cabc52db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xr2dj" Apr 22 18:36:15.581752 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579412 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-etc-kubernetes\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.581752 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579411 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1ca4dacd-dae4-4a9c-a9f6-96d152edbeac-os-release\") pod \"multus-additional-cni-plugins-knnz5\" (UID: \"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac\") " pod="openshift-multus/multus-additional-cni-plugins-knnz5" Apr 22 18:36:15.581752 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579471 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-host-run-ovn-kubernetes\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.581752 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579433 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-host-run-ovn-kubernetes\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.581752 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579541 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e922d609-5d83-4221-8067-2166cabc52db-registration-dir\") pod \"aws-ebs-csi-driver-node-xr2dj\" (UID: \"e922d609-5d83-4221-8067-2166cabc52db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xr2dj" Apr 22 18:36:15.581752 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579579 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-var-lib-kubelet\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.581752 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579581 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.582553 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579627 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/97302881-ab30-4630-9df7-e7796d6aaedf-ovn-node-metrics-cert\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.582553 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579628 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.582553 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579676 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-etc-sysconfig\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.582553 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579704 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/66e01385-0150-4908-a5b3-deafda8e4e26-tmp\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.582553 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579731 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-multus-cni-dir\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.582553 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579782 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" 
(UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-etc-sysconfig\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.582553 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579792 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1ca4dacd-dae4-4a9c-a9f6-96d152edbeac-tuning-conf-dir\") pod \"multus-additional-cni-plugins-knnz5\" (UID: \"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac\") " pod="openshift-multus/multus-additional-cni-plugins-knnz5" Apr 22 18:36:15.582553 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579822 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-log-socket\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.582553 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579837 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-multus-cni-dir\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.582553 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579849 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/60e1bb1d-071b-40c6-aeca-a00ab7fbb348-konnectivity-ca\") pod \"konnectivity-agent-w5xvx\" (UID: \"60e1bb1d-071b-40c6-aeca-a00ab7fbb348\") " pod="kube-system/konnectivity-agent-w5xvx" Apr 22 18:36:15.582553 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579875 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-sys\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.582553 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579899 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-cni-binary-copy\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.582553 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579901 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/97302881-ab30-4630-9df7-e7796d6aaedf-ovnkube-config\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.582553 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579923 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-host-run-k8s-cni-cncf-io\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.582553 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579950 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1ca4dacd-dae4-4a9c-a9f6-96d152edbeac-cni-binary-copy\") pod \"multus-additional-cni-plugins-knnz5\" (UID: \"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac\") " pod="openshift-multus/multus-additional-cni-plugins-knnz5" Apr 22 18:36:15.582553 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579959 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/1ca4dacd-dae4-4a9c-a9f6-96d152edbeac-tuning-conf-dir\") pod \"multus-additional-cni-plugins-knnz5\" (UID: \"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac\") " pod="openshift-multus/multus-additional-cni-plugins-knnz5" Apr 22 18:36:15.582553 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579956 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 18:36:15.582553 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579983 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-host-run-netns\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.583343 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580000 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-host-run-k8s-cni-cncf-io\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.583343 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580010 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/617c16ac-507f-45a2-ab75-d583c7798ca1-metrics-certs\") pod \"network-metrics-daemon-9t8m2\" (UID: \"617c16ac-507f-45a2-ab75-d583c7798ca1\") " pod="openshift-multus/network-metrics-daemon-9t8m2" Apr 22 18:36:15.583343 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580035 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-run-openvswitch\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.583343 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580047 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-host-run-netns\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.583343 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.579954 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-log-socket\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.583343 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580078 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-host-slash\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.583343 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580103 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/66e01385-0150-4908-a5b3-deafda8e4e26-etc-tuned\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.583343 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580127 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/b6d9430f-dd4a-4cf0-9c59-1a20bf462166-host-slash\") pod \"iptables-alerter-xhp7q\" (UID: \"b6d9430f-dd4a-4cf0-9c59-1a20bf462166\") " pod="openshift-network-operator/iptables-alerter-xhp7q" Apr 22 18:36:15.583343 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:15.580146 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:15.583343 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580152 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e922d609-5d83-4221-8067-2166cabc52db-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xr2dj\" (UID: \"e922d609-5d83-4221-8067-2166cabc52db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xr2dj" Apr 22 18:36:15.583343 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580177 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-multus-socket-dir-parent\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.583343 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:15.580218 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/617c16ac-507f-45a2-ab75-d583c7798ca1-metrics-certs podName:617c16ac-507f-45a2-ab75-d583c7798ca1 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:16.08018504 +0000 UTC m=+2.105798687 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/617c16ac-507f-45a2-ab75-d583c7798ca1-metrics-certs") pod "network-metrics-daemon-9t8m2" (UID: "617c16ac-507f-45a2-ab75-d583c7798ca1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:15.583343 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580235 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-multus-socket-dir-parent\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.583343 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580244 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-host-run-netns\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.583343 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580281 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-run-openvswitch\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.583343 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580293 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-host-run-netns\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.583343 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580308 2577 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-multus-conf-dir\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.583833 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580336 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/97302881-ab30-4630-9df7-e7796d6aaedf-ovnkube-script-lib\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.583833 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580363 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/60e1bb1d-071b-40c6-aeca-a00ab7fbb348-agent-certs\") pod \"konnectivity-agent-w5xvx\" (UID: \"60e1bb1d-071b-40c6-aeca-a00ab7fbb348\") " pod="kube-system/konnectivity-agent-w5xvx" Apr 22 18:36:15.583833 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580376 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-cni-binary-copy\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.583833 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580334 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1ca4dacd-dae4-4a9c-a9f6-96d152edbeac-cni-binary-copy\") pod \"multus-additional-cni-plugins-knnz5\" (UID: \"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac\") " pod="openshift-multus/multus-additional-cni-plugins-knnz5" Apr 22 18:36:15.583833 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580381 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/b6d9430f-dd4a-4cf0-9c59-1a20bf462166-host-slash\") pod \"iptables-alerter-xhp7q\" (UID: \"b6d9430f-dd4a-4cf0-9c59-1a20bf462166\") " pod="openshift-network-operator/iptables-alerter-xhp7q" Apr 22 18:36:15.583833 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580388 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b6d9430f-dd4a-4cf0-9c59-1a20bf462166-iptables-alerter-script\") pod \"iptables-alerter-xhp7q\" (UID: \"b6d9430f-dd4a-4cf0-9c59-1a20bf462166\") " pod="openshift-network-operator/iptables-alerter-xhp7q" Apr 22 18:36:15.583833 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580437 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e922d609-5d83-4221-8067-2166cabc52db-device-dir\") pod \"aws-ebs-csi-driver-node-xr2dj\" (UID: \"e922d609-5d83-4221-8067-2166cabc52db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xr2dj" Apr 22 18:36:15.583833 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580441 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/60e1bb1d-071b-40c6-aeca-a00ab7fbb348-konnectivity-ca\") pod \"konnectivity-agent-w5xvx\" (UID: \"60e1bb1d-071b-40c6-aeca-a00ab7fbb348\") " pod="kube-system/konnectivity-agent-w5xvx" Apr 22 18:36:15.583833 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580468 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46mzg\" (UniqueName: \"kubernetes.io/projected/5fcd2fa1-b6e0-4480-a123-665c7fabd2bf-kube-api-access-46mzg\") pod \"node-ca-fgx5l\" (UID: \"5fcd2fa1-b6e0-4480-a123-665c7fabd2bf\") " pod="openshift-image-registry/node-ca-fgx5l" Apr 22 18:36:15.583833 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580446 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-multus-conf-dir\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.583833 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580499 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-os-release\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.583833 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580504 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/66e01385-0150-4908-a5b3-deafda8e4e26-sys\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.583833 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580597 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e922d609-5d83-4221-8067-2166cabc52db-device-dir\") pod \"aws-ebs-csi-driver-node-xr2dj\" (UID: \"e922d609-5d83-4221-8067-2166cabc52db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xr2dj" Apr 22 18:36:15.583833 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580603 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-os-release\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.583833 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580635 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/97302881-ab30-4630-9df7-e7796d6aaedf-host-slash\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.583833 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580673 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e922d609-5d83-4221-8067-2166cabc52db-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xr2dj\" (UID: \"e922d609-5d83-4221-8067-2166cabc52db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xr2dj" Apr 22 18:36:15.583833 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580882 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/97302881-ab30-4630-9df7-e7796d6aaedf-ovnkube-script-lib\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.583833 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.580969 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b6d9430f-dd4a-4cf0-9c59-1a20bf462166-iptables-alerter-script\") pod \"iptables-alerter-xhp7q\" (UID: \"b6d9430f-dd4a-4cf0-9c59-1a20bf462166\") " pod="openshift-network-operator/iptables-alerter-xhp7q" Apr 22 18:36:15.584295 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.583044 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/66e01385-0150-4908-a5b3-deafda8e4e26-etc-tuned\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.584295 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.583091 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tmp\" (UniqueName: \"kubernetes.io/empty-dir/66e01385-0150-4908-a5b3-deafda8e4e26-tmp\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.584295 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.583163 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/97302881-ab30-4630-9df7-e7796d6aaedf-ovn-node-metrics-cert\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.584295 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.583349 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/60e1bb1d-071b-40c6-aeca-a00ab7fbb348-agent-certs\") pod \"konnectivity-agent-w5xvx\" (UID: \"60e1bb1d-071b-40c6-aeca-a00ab7fbb348\") " pod="kube-system/konnectivity-agent-w5xvx" Apr 22 18:36:15.591443 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.591410 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvhjb\" (UniqueName: \"kubernetes.io/projected/b6d9430f-dd4a-4cf0-9c59-1a20bf462166-kube-api-access-dvhjb\") pod \"iptables-alerter-xhp7q\" (UID: \"b6d9430f-dd4a-4cf0-9c59-1a20bf462166\") " pod="openshift-network-operator/iptables-alerter-xhp7q" Apr 22 18:36:15.592545 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:15.592522 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:36:15.592639 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:15.592551 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:36:15.592639 ip-10-0-132-204 
kubenswrapper[2577]: E0422 18:36:15.592578 2577 projected.go:194] Error preparing data for projected volume kube-api-access-dnbzs for pod openshift-network-diagnostics/network-check-target-dt6rs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:15.592779 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:15.592676 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a97e9a8d-908d-409a-b079-4fee4c52cdcd-kube-api-access-dnbzs podName:a97e9a8d-908d-409a-b079-4fee4c52cdcd nodeName:}" failed. No retries permitted until 2026-04-22 18:36:16.092643468 +0000 UTC m=+2.118257146 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-dnbzs" (UniqueName: "kubernetes.io/projected/a97e9a8d-908d-409a-b079-4fee4c52cdcd-kube-api-access-dnbzs") pod "network-check-target-dt6rs" (UID: "a97e9a8d-908d-409a-b079-4fee4c52cdcd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:15.593939 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.593912 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97sz2\" (UniqueName: \"kubernetes.io/projected/97302881-ab30-4630-9df7-e7796d6aaedf-kube-api-access-97sz2\") pod \"ovnkube-node-r65js\" (UID: \"97302881-ab30-4630-9df7-e7796d6aaedf\") " pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.594251 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.594206 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5pgq\" (UniqueName: \"kubernetes.io/projected/66e01385-0150-4908-a5b3-deafda8e4e26-kube-api-access-t5pgq\") pod \"tuned-l256r\" (UID: \"66e01385-0150-4908-a5b3-deafda8e4e26\") " pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 
18:36:15.594396 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.594378 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv5t8\" (UniqueName: \"kubernetes.io/projected/617c16ac-507f-45a2-ab75-d583c7798ca1-kube-api-access-zv5t8\") pod \"network-metrics-daemon-9t8m2\" (UID: \"617c16ac-507f-45a2-ab75-d583c7798ca1\") " pod="openshift-multus/network-metrics-daemon-9t8m2" Apr 22 18:36:15.594855 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.594837 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwcdf\" (UniqueName: \"kubernetes.io/projected/8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2-kube-api-access-lwcdf\") pod \"multus-2qq9r\" (UID: \"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2\") " pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.595125 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.595109 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46mzg\" (UniqueName: \"kubernetes.io/projected/5fcd2fa1-b6e0-4480-a123-665c7fabd2bf-kube-api-access-46mzg\") pod \"node-ca-fgx5l\" (UID: \"5fcd2fa1-b6e0-4480-a123-665c7fabd2bf\") " pod="openshift-image-registry/node-ca-fgx5l" Apr 22 18:36:15.595270 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.595254 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwb9r\" (UniqueName: \"kubernetes.io/projected/e922d609-5d83-4221-8067-2166cabc52db-kube-api-access-fwb9r\") pod \"aws-ebs-csi-driver-node-xr2dj\" (UID: \"e922d609-5d83-4221-8067-2166cabc52db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xr2dj" Apr 22 18:36:15.595514 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.595499 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-px4qk\" (UniqueName: \"kubernetes.io/projected/1ca4dacd-dae4-4a9c-a9f6-96d152edbeac-kube-api-access-px4qk\") pod \"multus-additional-cni-plugins-knnz5\" (UID: 
\"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac\") " pod="openshift-multus/multus-additional-cni-plugins-knnz5" Apr 22 18:36:15.601545 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.601501 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-204.ec2.internal" event={"ID":"24bede82695569fa794abd13553ca4ea","Type":"ContainerStarted","Data":"44c47a8ba506e40f3f2fad478ac1368a314bbd10c538249e59232a2e402b81f3"} Apr 22 18:36:15.602335 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.602314 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-204.ec2.internal" event={"ID":"ffa9aef1222b3a9934de39cb15b8e512","Type":"ContainerStarted","Data":"d40d553c42775d6890d6c5d27d896e98d1dd11bb6ac8403c2e8b7664e9037590"} Apr 22 18:36:15.781461 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.781426 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-w5xvx" Apr 22 18:36:15.788364 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:15.788335 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60e1bb1d_071b_40c6_aeca_a00ab7fbb348.slice/crio-a31f2dc385843ee4fc274c2967441517072e468f841f9a1d92ee32e950ed8616 WatchSource:0}: Error finding container a31f2dc385843ee4fc274c2967441517072e468f841f9a1d92ee32e950ed8616: Status 404 returned error can't find the container with id a31f2dc385843ee4fc274c2967441517072e468f841f9a1d92ee32e950ed8616 Apr 22 18:36:15.798276 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.798252 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-l256r" Apr 22 18:36:15.804399 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:15.804363 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66e01385_0150_4908_a5b3_deafda8e4e26.slice/crio-2ab80ea7c23323b2586015b0ca56023212467da5ed6264914a2a8fd5fca0c7b9 WatchSource:0}: Error finding container 2ab80ea7c23323b2586015b0ca56023212467da5ed6264914a2a8fd5fca0c7b9: Status 404 returned error can't find the container with id 2ab80ea7c23323b2586015b0ca56023212467da5ed6264914a2a8fd5fca0c7b9 Apr 22 18:36:15.811300 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.811279 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xr2dj" Apr 22 18:36:15.817421 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:15.817393 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode922d609_5d83_4221_8067_2166cabc52db.slice/crio-61fa59ef7ea4367e31dde9fbfca7c4240d8f22122bfd74390fdba0aee307878f WatchSource:0}: Error finding container 61fa59ef7ea4367e31dde9fbfca7c4240d8f22122bfd74390fdba0aee307878f: Status 404 returned error can't find the container with id 61fa59ef7ea4367e31dde9fbfca7c4240d8f22122bfd74390fdba0aee307878f Apr 22 18:36:15.823713 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.823691 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-fgx5l" Apr 22 18:36:15.830742 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:15.830709 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fcd2fa1_b6e0_4480_a123_665c7fabd2bf.slice/crio-9ac921e7c198a7ab71f67700d5e342ff8ec07328368dfeaa31a04f75928f0bc7 WatchSource:0}: Error finding container 9ac921e7c198a7ab71f67700d5e342ff8ec07328368dfeaa31a04f75928f0bc7: Status 404 returned error can't find the container with id 9ac921e7c198a7ab71f67700d5e342ff8ec07328368dfeaa31a04f75928f0bc7 Apr 22 18:36:15.842434 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.842409 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2qq9r" Apr 22 18:36:15.849805 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:15.849774 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f88ea3a_0984_4a92_a346_4c9f9b5cbdc2.slice/crio-768b3202aa249f6268aa08fb3fe18c78b93fb43da0acca887f56b47be98d8789 WatchSource:0}: Error finding container 768b3202aa249f6268aa08fb3fe18c78b93fb43da0acca887f56b47be98d8789: Status 404 returned error can't find the container with id 768b3202aa249f6268aa08fb3fe18c78b93fb43da0acca887f56b47be98d8789 Apr 22 18:36:15.854164 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.854142 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-knnz5" Apr 22 18:36:15.859758 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:15.859726 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ca4dacd_dae4_4a9c_a9f6_96d152edbeac.slice/crio-74d8e68c376bba9b58736ae3ce64d9937e4e98a24b84d1c7441f3897f30674b3 WatchSource:0}: Error finding container 74d8e68c376bba9b58736ae3ce64d9937e4e98a24b84d1c7441f3897f30674b3: Status 404 returned error can't find the container with id 74d8e68c376bba9b58736ae3ce64d9937e4e98a24b84d1c7441f3897f30674b3 Apr 22 18:36:15.865568 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.865549 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xhp7q" Apr 22 18:36:15.872496 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:15.872458 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6d9430f_dd4a_4cf0_9c59_1a20bf462166.slice/crio-82cb86ca3eb9199db44515292f7e5c4ce0e794c940ad0a28f5fd27c2fa3de1e5 WatchSource:0}: Error finding container 82cb86ca3eb9199db44515292f7e5c4ce0e794c940ad0a28f5fd27c2fa3de1e5: Status 404 returned error can't find the container with id 82cb86ca3eb9199db44515292f7e5c4ce0e794c940ad0a28f5fd27c2fa3de1e5 Apr 22 18:36:15.878705 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:15.878686 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:36:15.884731 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:15.884706 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97302881_ab30_4630_9df7_e7796d6aaedf.slice/crio-1357f8ce489ab8739221afb6e48308a8de6af26b39dc55da222048e8ae02cd1a WatchSource:0}: Error finding container 1357f8ce489ab8739221afb6e48308a8de6af26b39dc55da222048e8ae02cd1a: Status 404 returned error can't find the container with id 1357f8ce489ab8739221afb6e48308a8de6af26b39dc55da222048e8ae02cd1a Apr 22 18:36:16.084254 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:16.084156 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/617c16ac-507f-45a2-ab75-d583c7798ca1-metrics-certs\") pod \"network-metrics-daemon-9t8m2\" (UID: \"617c16ac-507f-45a2-ab75-d583c7798ca1\") " pod="openshift-multus/network-metrics-daemon-9t8m2" Apr 22 18:36:16.084406 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:16.084344 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:16.084468 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:16.084416 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/617c16ac-507f-45a2-ab75-d583c7798ca1-metrics-certs podName:617c16ac-507f-45a2-ab75-d583c7798ca1 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:17.084394692 +0000 UTC m=+3.110008352 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/617c16ac-507f-45a2-ab75-d583c7798ca1-metrics-certs") pod "network-metrics-daemon-9t8m2" (UID: "617c16ac-507f-45a2-ab75-d583c7798ca1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:16.184748 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:16.184704 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnbzs\" (UniqueName: \"kubernetes.io/projected/a97e9a8d-908d-409a-b079-4fee4c52cdcd-kube-api-access-dnbzs\") pod \"network-check-target-dt6rs\" (UID: \"a97e9a8d-908d-409a-b079-4fee4c52cdcd\") " pod="openshift-network-diagnostics/network-check-target-dt6rs" Apr 22 18:36:16.184939 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:16.184882 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:36:16.184939 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:16.184907 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:36:16.184939 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:16.184920 2577 projected.go:194] Error preparing data for projected volume kube-api-access-dnbzs for pod openshift-network-diagnostics/network-check-target-dt6rs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:16.185106 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:16.184976 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a97e9a8d-908d-409a-b079-4fee4c52cdcd-kube-api-access-dnbzs podName:a97e9a8d-908d-409a-b079-4fee4c52cdcd nodeName:}" failed. 
No retries permitted until 2026-04-22 18:36:17.184957003 +0000 UTC m=+3.210570662 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-dnbzs" (UniqueName: "kubernetes.io/projected/a97e9a8d-908d-409a-b079-4fee4c52cdcd-kube-api-access-dnbzs") pod "network-check-target-dt6rs" (UID: "a97e9a8d-908d-409a-b079-4fee4c52cdcd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:16.196661 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:16.196615 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:36:16.345763 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:16.345680 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:36:16.516742 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:16.516699 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:31:15 +0000 UTC" deadline="2027-12-27 16:15:15.631848449 +0000 UTC" Apr 22 18:36:16.516742 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:16.516738 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14733h38m59.115114333s" Apr 22 18:36:16.623158 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:16.623048 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2qq9r" event={"ID":"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2","Type":"ContainerStarted","Data":"768b3202aa249f6268aa08fb3fe18c78b93fb43da0acca887f56b47be98d8789"} Apr 22 18:36:16.639551 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:16.639512 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fgx5l" 
event={"ID":"5fcd2fa1-b6e0-4480-a123-665c7fabd2bf","Type":"ContainerStarted","Data":"9ac921e7c198a7ab71f67700d5e342ff8ec07328368dfeaa31a04f75928f0bc7"} Apr 22 18:36:16.662987 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:16.662945 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-l256r" event={"ID":"66e01385-0150-4908-a5b3-deafda8e4e26","Type":"ContainerStarted","Data":"2ab80ea7c23323b2586015b0ca56023212467da5ed6264914a2a8fd5fca0c7b9"} Apr 22 18:36:16.673086 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:16.673013 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-knnz5" event={"ID":"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac","Type":"ContainerStarted","Data":"74d8e68c376bba9b58736ae3ce64d9937e4e98a24b84d1c7441f3897f30674b3"} Apr 22 18:36:16.677179 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:16.677118 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xr2dj" event={"ID":"e922d609-5d83-4221-8067-2166cabc52db","Type":"ContainerStarted","Data":"61fa59ef7ea4367e31dde9fbfca7c4240d8f22122bfd74390fdba0aee307878f"} Apr 22 18:36:16.689787 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:16.689755 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-w5xvx" event={"ID":"60e1bb1d-071b-40c6-aeca-a00ab7fbb348","Type":"ContainerStarted","Data":"a31f2dc385843ee4fc274c2967441517072e468f841f9a1d92ee32e950ed8616"} Apr 22 18:36:16.694350 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:16.694293 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r65js" event={"ID":"97302881-ab30-4630-9df7-e7796d6aaedf","Type":"ContainerStarted","Data":"1357f8ce489ab8739221afb6e48308a8de6af26b39dc55da222048e8ae02cd1a"} Apr 22 18:36:16.706109 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:16.706070 2577 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-network-operator/iptables-alerter-xhp7q" event={"ID":"b6d9430f-dd4a-4cf0-9c59-1a20bf462166","Type":"ContainerStarted","Data":"82cb86ca3eb9199db44515292f7e5c4ce0e794c940ad0a28f5fd27c2fa3de1e5"} Apr 22 18:36:17.092801 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:17.092762 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/617c16ac-507f-45a2-ab75-d583c7798ca1-metrics-certs\") pod \"network-metrics-daemon-9t8m2\" (UID: \"617c16ac-507f-45a2-ab75-d583c7798ca1\") " pod="openshift-multus/network-metrics-daemon-9t8m2" Apr 22 18:36:17.092993 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:17.092915 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:17.092993 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:17.092987 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/617c16ac-507f-45a2-ab75-d583c7798ca1-metrics-certs podName:617c16ac-507f-45a2-ab75-d583c7798ca1 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:19.092968202 +0000 UTC m=+5.118581864 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/617c16ac-507f-45a2-ab75-d583c7798ca1-metrics-certs") pod "network-metrics-daemon-9t8m2" (UID: "617c16ac-507f-45a2-ab75-d583c7798ca1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:17.193277 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:17.193233 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnbzs\" (UniqueName: \"kubernetes.io/projected/a97e9a8d-908d-409a-b079-4fee4c52cdcd-kube-api-access-dnbzs\") pod \"network-check-target-dt6rs\" (UID: \"a97e9a8d-908d-409a-b079-4fee4c52cdcd\") " pod="openshift-network-diagnostics/network-check-target-dt6rs" Apr 22 18:36:17.193510 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:17.193491 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:36:17.193596 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:17.193515 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:36:17.193596 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:17.193529 2577 projected.go:194] Error preparing data for projected volume kube-api-access-dnbzs for pod openshift-network-diagnostics/network-check-target-dt6rs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:17.193596 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:17.193587 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a97e9a8d-908d-409a-b079-4fee4c52cdcd-kube-api-access-dnbzs podName:a97e9a8d-908d-409a-b079-4fee4c52cdcd nodeName:}" failed. 
No retries permitted until 2026-04-22 18:36:19.193568961 +0000 UTC m=+5.219182625 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-dnbzs" (UniqueName: "kubernetes.io/projected/a97e9a8d-908d-409a-b079-4fee4c52cdcd-kube-api-access-dnbzs") pod "network-check-target-dt6rs" (UID: "a97e9a8d-908d-409a-b079-4fee4c52cdcd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:17.517286 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:17.517210 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:31:15 +0000 UTC" deadline="2027-12-05 12:05:17.375380725 +0000 UTC" Apr 22 18:36:17.517286 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:17.517257 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14201h28m59.858129251s" Apr 22 18:36:17.599499 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:17.599376 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dt6rs" Apr 22 18:36:17.599715 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:17.599501 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dt6rs" podUID="a97e9a8d-908d-409a-b079-4fee4c52cdcd" Apr 22 18:36:17.599715 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:17.599543 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9t8m2" Apr 22 18:36:17.599715 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:17.599683 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9t8m2" podUID="617c16ac-507f-45a2-ab75-d583c7798ca1" Apr 22 18:36:18.126739 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:18.126703 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:36:18.500766 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:18.500513 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:36:19.111623 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:19.111573 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/617c16ac-507f-45a2-ab75-d583c7798ca1-metrics-certs\") pod \"network-metrics-daemon-9t8m2\" (UID: \"617c16ac-507f-45a2-ab75-d583c7798ca1\") " pod="openshift-multus/network-metrics-daemon-9t8m2" Apr 22 18:36:19.112049 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:19.111794 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:19.112049 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:19.111865 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/617c16ac-507f-45a2-ab75-d583c7798ca1-metrics-certs podName:617c16ac-507f-45a2-ab75-d583c7798ca1 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:23.111845598 +0000 UTC m=+9.137459249 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/617c16ac-507f-45a2-ab75-d583c7798ca1-metrics-certs") pod "network-metrics-daemon-9t8m2" (UID: "617c16ac-507f-45a2-ab75-d583c7798ca1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:19.212631 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:19.212262 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnbzs\" (UniqueName: \"kubernetes.io/projected/a97e9a8d-908d-409a-b079-4fee4c52cdcd-kube-api-access-dnbzs\") pod \"network-check-target-dt6rs\" (UID: \"a97e9a8d-908d-409a-b079-4fee4c52cdcd\") " pod="openshift-network-diagnostics/network-check-target-dt6rs" Apr 22 18:36:19.212631 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:19.212470 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:36:19.212631 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:19.212490 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:36:19.212631 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:19.212503 2577 projected.go:194] Error preparing data for projected volume kube-api-access-dnbzs for pod openshift-network-diagnostics/network-check-target-dt6rs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:19.212631 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:19.212572 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a97e9a8d-908d-409a-b079-4fee4c52cdcd-kube-api-access-dnbzs podName:a97e9a8d-908d-409a-b079-4fee4c52cdcd nodeName:}" failed. 
No retries permitted until 2026-04-22 18:36:23.212551177 +0000 UTC m=+9.238164844 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-dnbzs" (UniqueName: "kubernetes.io/projected/a97e9a8d-908d-409a-b079-4fee4c52cdcd-kube-api-access-dnbzs") pod "network-check-target-dt6rs" (UID: "a97e9a8d-908d-409a-b079-4fee4c52cdcd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:19.598869 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:19.598829 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dt6rs" Apr 22 18:36:19.598869 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:19.598856 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9t8m2" Apr 22 18:36:19.599113 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:19.598962 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dt6rs" podUID="a97e9a8d-908d-409a-b079-4fee4c52cdcd" Apr 22 18:36:19.599185 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:19.599140 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9t8m2" podUID="617c16ac-507f-45a2-ab75-d583c7798ca1" Apr 22 18:36:21.598979 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:21.598939 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9t8m2" Apr 22 18:36:21.599458 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:21.598996 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dt6rs" Apr 22 18:36:21.599458 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:21.599121 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9t8m2" podUID="617c16ac-507f-45a2-ab75-d583c7798ca1" Apr 22 18:36:21.599458 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:21.599217 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dt6rs" podUID="a97e9a8d-908d-409a-b079-4fee4c52cdcd" Apr 22 18:36:23.144679 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:23.144514 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/617c16ac-507f-45a2-ab75-d583c7798ca1-metrics-certs\") pod \"network-metrics-daemon-9t8m2\" (UID: \"617c16ac-507f-45a2-ab75-d583c7798ca1\") " pod="openshift-multus/network-metrics-daemon-9t8m2" Apr 22 18:36:23.144679 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:23.144678 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:23.145212 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:23.144748 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/617c16ac-507f-45a2-ab75-d583c7798ca1-metrics-certs podName:617c16ac-507f-45a2-ab75-d583c7798ca1 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:31.144730041 +0000 UTC m=+17.170343694 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/617c16ac-507f-45a2-ab75-d583c7798ca1-metrics-certs") pod "network-metrics-daemon-9t8m2" (UID: "617c16ac-507f-45a2-ab75-d583c7798ca1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:23.245335 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:23.245279 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnbzs\" (UniqueName: \"kubernetes.io/projected/a97e9a8d-908d-409a-b079-4fee4c52cdcd-kube-api-access-dnbzs\") pod \"network-check-target-dt6rs\" (UID: \"a97e9a8d-908d-409a-b079-4fee4c52cdcd\") " pod="openshift-network-diagnostics/network-check-target-dt6rs" Apr 22 18:36:23.245505 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:23.245478 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:36:23.245505 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:23.245500 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:36:23.245646 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:23.245512 2577 projected.go:194] Error preparing data for projected volume kube-api-access-dnbzs for pod openshift-network-diagnostics/network-check-target-dt6rs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:23.245646 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:23.245578 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a97e9a8d-908d-409a-b079-4fee4c52cdcd-kube-api-access-dnbzs podName:a97e9a8d-908d-409a-b079-4fee4c52cdcd nodeName:}" failed. 
No retries permitted until 2026-04-22 18:36:31.245558684 +0000 UTC m=+17.271172354 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-dnbzs" (UniqueName: "kubernetes.io/projected/a97e9a8d-908d-409a-b079-4fee4c52cdcd-kube-api-access-dnbzs") pod "network-check-target-dt6rs" (UID: "a97e9a8d-908d-409a-b079-4fee4c52cdcd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:23.599554 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:23.599515 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dt6rs" Apr 22 18:36:23.599768 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:23.599666 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dt6rs" podUID="a97e9a8d-908d-409a-b079-4fee4c52cdcd" Apr 22 18:36:23.599768 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:23.599730 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9t8m2" Apr 22 18:36:23.599889 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:23.599830 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9t8m2" podUID="617c16ac-507f-45a2-ab75-d583c7798ca1" Apr 22 18:36:25.598734 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:25.598497 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9t8m2" Apr 22 18:36:25.599178 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:25.598497 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dt6rs" Apr 22 18:36:25.599178 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:25.598849 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9t8m2" podUID="617c16ac-507f-45a2-ab75-d583c7798ca1" Apr 22 18:36:25.599178 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:25.598915 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dt6rs" podUID="a97e9a8d-908d-409a-b079-4fee4c52cdcd" Apr 22 18:36:27.598672 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:27.598628 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dt6rs" Apr 22 18:36:27.599110 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:27.598757 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dt6rs" podUID="a97e9a8d-908d-409a-b079-4fee4c52cdcd" Apr 22 18:36:27.599110 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:27.598820 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9t8m2" Apr 22 18:36:27.599110 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:27.598925 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9t8m2" podUID="617c16ac-507f-45a2-ab75-d583c7798ca1" Apr 22 18:36:29.598725 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:29.598687 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dt6rs" Apr 22 18:36:29.599157 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:29.598687 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9t8m2" Apr 22 18:36:29.599157 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:29.598826 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dt6rs" podUID="a97e9a8d-908d-409a-b079-4fee4c52cdcd" Apr 22 18:36:29.599157 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:29.598922 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9t8m2" podUID="617c16ac-507f-45a2-ab75-d583c7798ca1" Apr 22 18:36:31.203323 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:31.203291 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/617c16ac-507f-45a2-ab75-d583c7798ca1-metrics-certs\") pod \"network-metrics-daemon-9t8m2\" (UID: \"617c16ac-507f-45a2-ab75-d583c7798ca1\") " pod="openshift-multus/network-metrics-daemon-9t8m2" Apr 22 18:36:31.203774 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:31.203426 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:31.203774 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:31.203483 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/617c16ac-507f-45a2-ab75-d583c7798ca1-metrics-certs podName:617c16ac-507f-45a2-ab75-d583c7798ca1 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:36:47.203467204 +0000 UTC m=+33.229080854 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/617c16ac-507f-45a2-ab75-d583c7798ca1-metrics-certs") pod "network-metrics-daemon-9t8m2" (UID: "617c16ac-507f-45a2-ab75-d583c7798ca1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:31.303666 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:31.303612 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnbzs\" (UniqueName: \"kubernetes.io/projected/a97e9a8d-908d-409a-b079-4fee4c52cdcd-kube-api-access-dnbzs\") pod \"network-check-target-dt6rs\" (UID: \"a97e9a8d-908d-409a-b079-4fee4c52cdcd\") " pod="openshift-network-diagnostics/network-check-target-dt6rs" Apr 22 18:36:31.303840 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:31.303776 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:36:31.303840 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:31.303799 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:36:31.303840 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:31.303810 2577 projected.go:194] Error preparing data for projected volume kube-api-access-dnbzs for pod openshift-network-diagnostics/network-check-target-dt6rs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:31.303958 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:31.303874 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a97e9a8d-908d-409a-b079-4fee4c52cdcd-kube-api-access-dnbzs 
podName:a97e9a8d-908d-409a-b079-4fee4c52cdcd nodeName:}" failed. No retries permitted until 2026-04-22 18:36:47.303854452 +0000 UTC m=+33.329468121 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-dnbzs" (UniqueName: "kubernetes.io/projected/a97e9a8d-908d-409a-b079-4fee4c52cdcd-kube-api-access-dnbzs") pod "network-check-target-dt6rs" (UID: "a97e9a8d-908d-409a-b079-4fee4c52cdcd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:31.598787 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:31.598707 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dt6rs" Apr 22 18:36:31.598945 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:31.598707 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9t8m2" Apr 22 18:36:31.598945 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:31.598851 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dt6rs" podUID="a97e9a8d-908d-409a-b079-4fee4c52cdcd" Apr 22 18:36:31.598945 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:31.598938 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9t8m2" podUID="617c16ac-507f-45a2-ab75-d583c7798ca1" Apr 22 18:36:33.598949 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:33.598926 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dt6rs" Apr 22 18:36:33.599229 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:33.598934 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9t8m2" Apr 22 18:36:33.599229 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:33.599015 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dt6rs" podUID="a97e9a8d-908d-409a-b079-4fee4c52cdcd" Apr 22 18:36:33.599229 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:33.599090 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9t8m2" podUID="617c16ac-507f-45a2-ab75-d583c7798ca1" Apr 22 18:36:34.744044 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:34.743760 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log" Apr 22 18:36:34.744674 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:34.744313 2577 generic.go:358] "Generic (PLEG): container finished" podID="97302881-ab30-4630-9df7-e7796d6aaedf" containerID="85f7ed52a04fb5584cf66189c848ea4cb54434c2ead3f3eff11f153792373777" exitCode=1 Apr 22 18:36:34.744674 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:34.744377 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r65js" event={"ID":"97302881-ab30-4630-9df7-e7796d6aaedf","Type":"ContainerStarted","Data":"7ea84740dc9d26e741be6a81f6470dcad54a60e4cee47b69151efac88268f64f"} Apr 22 18:36:34.744674 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:34.744406 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r65js" event={"ID":"97302881-ab30-4630-9df7-e7796d6aaedf","Type":"ContainerStarted","Data":"b59d50abdaed1911c0760655108897568e051b76340661b46a3eb2c31e8bf71b"} Apr 22 18:36:34.744674 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:34.744422 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r65js" event={"ID":"97302881-ab30-4630-9df7-e7796d6aaedf","Type":"ContainerStarted","Data":"9f921a0a2ba1ccb7faf4cfa6546159960992b7de4ba8d942082a38adc0309071"} Apr 22 18:36:34.744674 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:34.744438 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r65js" event={"ID":"97302881-ab30-4630-9df7-e7796d6aaedf","Type":"ContainerStarted","Data":"d1317cb95fb947a7ad85b34d3b2f8473f7b0fda9250182d6cf116803a8e044cd"} Apr 22 
18:36:34.744674 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:34.744450 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r65js" event={"ID":"97302881-ab30-4630-9df7-e7796d6aaedf","Type":"ContainerDied","Data":"85f7ed52a04fb5584cf66189c848ea4cb54434c2ead3f3eff11f153792373777"} Apr 22 18:36:34.744674 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:34.744460 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r65js" event={"ID":"97302881-ab30-4630-9df7-e7796d6aaedf","Type":"ContainerStarted","Data":"573a6bc48aa9f88cf7c215407880663f45893e402687ec37901abe0b8b06f81a"} Apr 22 18:36:34.745677 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:34.745636 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2qq9r" event={"ID":"8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2","Type":"ContainerStarted","Data":"6fafd2b250cc2e9181f67d20dbc7cb3da9b7ad0483cdf3fa37e1f8a491bc3cbc"} Apr 22 18:36:34.746989 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:34.746961 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fgx5l" event={"ID":"5fcd2fa1-b6e0-4480-a123-665c7fabd2bf","Type":"ContainerStarted","Data":"071d8ecd2ae7c0349ff8694155096a5d9bddd2827056d81de66de8de4298f05e"} Apr 22 18:36:34.748213 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:34.748194 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-l256r" event={"ID":"66e01385-0150-4908-a5b3-deafda8e4e26","Type":"ContainerStarted","Data":"9c5768dd7805d819851029438945dd9773c09e728ef4660c3800307bc08bd627"} Apr 22 18:36:34.749470 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:34.749448 2577 generic.go:358] "Generic (PLEG): container finished" podID="1ca4dacd-dae4-4a9c-a9f6-96d152edbeac" containerID="558af33c59be06d71a5f66eeee7f74408acee934eb9eb09f5826c179fcaeb8a1" exitCode=0 Apr 22 18:36:34.749576 ip-10-0-132-204 
kubenswrapper[2577]: I0422 18:36:34.749478 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-knnz5" event={"ID":"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac","Type":"ContainerDied","Data":"558af33c59be06d71a5f66eeee7f74408acee934eb9eb09f5826c179fcaeb8a1"} Apr 22 18:36:34.750814 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:34.750792 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xr2dj" event={"ID":"e922d609-5d83-4221-8067-2166cabc52db","Type":"ContainerStarted","Data":"6151a173cc20b42e8003294c82bde8ccc86a76d8699c7be4e4420d2a929ab2f9"} Apr 22 18:36:34.752098 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:34.752069 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-w5xvx" event={"ID":"60e1bb1d-071b-40c6-aeca-a00ab7fbb348","Type":"ContainerStarted","Data":"6dc0a3aaae1f45b4f502d4f58e0f5b8127b154f5aedc7a1ebf34283a68ae0fe8"} Apr 22 18:36:34.755150 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:34.755105 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-204.ec2.internal" event={"ID":"ffa9aef1222b3a9934de39cb15b8e512","Type":"ContainerStarted","Data":"172e027713beff5f40e06e08e4995909f1da02c193656ff48b0ee08c390275e6"} Apr 22 18:36:34.756913 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:34.756889 2577 generic.go:358] "Generic (PLEG): container finished" podID="24bede82695569fa794abd13553ca4ea" containerID="1d56c589ba765e704d59b3a86c9b74e7aae2ec3e1a254ca9baed14b0435fd4cb" exitCode=0 Apr 22 18:36:34.757009 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:34.756923 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-204.ec2.internal" event={"ID":"24bede82695569fa794abd13553ca4ea","Type":"ContainerDied","Data":"1d56c589ba765e704d59b3a86c9b74e7aae2ec3e1a254ca9baed14b0435fd4cb"} Apr 22 
18:36:34.766626 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:34.766577 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-2qq9r" podStartSLOduration=3.075327671 podStartE2EDuration="20.76656252s" podCreationTimestamp="2026-04-22 18:36:14 +0000 UTC" firstStartedPulling="2026-04-22 18:36:15.851708297 +0000 UTC m=+1.877321942" lastFinishedPulling="2026-04-22 18:36:33.542943144 +0000 UTC m=+19.568556791" observedRunningTime="2026-04-22 18:36:34.766369756 +0000 UTC m=+20.791983422" watchObservedRunningTime="2026-04-22 18:36:34.76656252 +0000 UTC m=+20.792176189" Apr 22 18:36:34.786551 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:34.784126 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-w5xvx" podStartSLOduration=3.062960214 podStartE2EDuration="20.784109254s" podCreationTimestamp="2026-04-22 18:36:14 +0000 UTC" firstStartedPulling="2026-04-22 18:36:15.790075462 +0000 UTC m=+1.815689108" lastFinishedPulling="2026-04-22 18:36:33.511224492 +0000 UTC m=+19.536838148" observedRunningTime="2026-04-22 18:36:34.784099483 +0000 UTC m=+20.809713151" watchObservedRunningTime="2026-04-22 18:36:34.784109254 +0000 UTC m=+20.809722922" Apr 22 18:36:34.801015 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:34.800959 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-fgx5l" podStartSLOduration=3.121491377 podStartE2EDuration="20.800943557s" podCreationTimestamp="2026-04-22 18:36:14 +0000 UTC" firstStartedPulling="2026-04-22 18:36:15.832146959 +0000 UTC m=+1.857760604" lastFinishedPulling="2026-04-22 18:36:33.511599135 +0000 UTC m=+19.537212784" observedRunningTime="2026-04-22 18:36:34.800832451 +0000 UTC m=+20.826446119" watchObservedRunningTime="2026-04-22 18:36:34.800943557 +0000 UTC m=+20.826557226" Apr 22 18:36:34.867210 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:34.867156 2577 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-204.ec2.internal" podStartSLOduration=19.867136632 podStartE2EDuration="19.867136632s" podCreationTimestamp="2026-04-22 18:36:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:36:34.867083731 +0000 UTC m=+20.892697400" watchObservedRunningTime="2026-04-22 18:36:34.867136632 +0000 UTC m=+20.892750286"
Apr 22 18:36:34.889841 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:34.889800 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-l256r" podStartSLOduration=3.151702127 podStartE2EDuration="20.889785929s" podCreationTimestamp="2026-04-22 18:36:14 +0000 UTC" firstStartedPulling="2026-04-22 18:36:15.805874862 +0000 UTC m=+1.831488507" lastFinishedPulling="2026-04-22 18:36:33.543958659 +0000 UTC m=+19.569572309" observedRunningTime="2026-04-22 18:36:34.889491956 +0000 UTC m=+20.915105624" watchObservedRunningTime="2026-04-22 18:36:34.889785929 +0000 UTC m=+20.915399596"
Apr 22 18:36:35.277681 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:35.277641 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 18:36:35.537408 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:35.537302 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:36:35.27767776Z","UUID":"1689bfa5-597e-4ecb-a1b3-f8bc7061f377","Handler":null,"Name":"","Endpoint":""}
Apr 22 18:36:35.540106 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:35.540080 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 22 18:36:35.540106 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:35.540112 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 22 18:36:35.599039 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:35.598999 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9t8m2"
Apr 22 18:36:35.599220 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:35.599136 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9t8m2" podUID="617c16ac-507f-45a2-ab75-d583c7798ca1"
Apr 22 18:36:35.599312 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:35.598999 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dt6rs"
Apr 22 18:36:35.599439 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:35.599388 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-dt6rs" podUID="a97e9a8d-908d-409a-b079-4fee4c52cdcd"
Apr 22 18:36:35.759868 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:35.759829 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xr2dj" event={"ID":"e922d609-5d83-4221-8067-2166cabc52db","Type":"ContainerStarted","Data":"d13aa7c4409477fa639ef54872e8b20779151163d28c3c55ce83138e352fcdc0"}
Apr 22 18:36:35.761629 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:35.761583 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-204.ec2.internal" event={"ID":"24bede82695569fa794abd13553ca4ea","Type":"ContainerStarted","Data":"fcff2cb02dc7f7a19a7837e3540db921a48a0ac7e37c85e122c09f4a464ed986"}
Apr 22 18:36:35.762984 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:35.762921 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xhp7q" event={"ID":"b6d9430f-dd4a-4cf0-9c59-1a20bf462166","Type":"ContainerStarted","Data":"65edc4adeb9e7c8511782fae67595833cb756958893c23225a82586f3ec4c107"}
Apr 22 18:36:35.781285 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:35.781222 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-204.ec2.internal" podStartSLOduration=20.781206605 podStartE2EDuration="20.781206605s" podCreationTimestamp="2026-04-22 18:36:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:36:35.781137337 +0000 UTC m=+21.806751003" watchObservedRunningTime="2026-04-22 18:36:35.781206605 +0000 UTC m=+21.806820273"
Apr 22 18:36:35.803529 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:35.803430 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-xhp7q" podStartSLOduration=4.136688979 podStartE2EDuration="21.80341463s" podCreationTimestamp="2026-04-22 18:36:14 +0000 UTC" firstStartedPulling="2026-04-22 18:36:15.874226649 +0000 UTC m=+1.899840295" lastFinishedPulling="2026-04-22 18:36:33.540952301 +0000 UTC m=+19.566565946" observedRunningTime="2026-04-22 18:36:35.803170033 +0000 UTC m=+21.828783712" watchObservedRunningTime="2026-04-22 18:36:35.80341463 +0000 UTC m=+21.829028296"
Apr 22 18:36:36.768127 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:36.768098 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log"
Apr 22 18:36:36.768873 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:36.768447 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r65js" event={"ID":"97302881-ab30-4630-9df7-e7796d6aaedf","Type":"ContainerStarted","Data":"400c657945035020a2420d769335bfd30df34e974d98d80a5a0e89a822bd0b9c"}
Apr 22 18:36:36.770512 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:36.770479 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xr2dj" event={"ID":"e922d609-5d83-4221-8067-2166cabc52db","Type":"ContainerStarted","Data":"d73016ba99ac772e452bf069249586643386f61c43b58ad5d9570b095fb63fa5"}
Apr 22 18:36:36.793813 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:36.793766 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xr2dj" podStartSLOduration=2.342192599 podStartE2EDuration="22.793751229s" podCreationTimestamp="2026-04-22 18:36:14 +0000 UTC" firstStartedPulling="2026-04-22 18:36:15.819435516 +0000 UTC m=+1.845049167" lastFinishedPulling="2026-04-22 18:36:36.270994138 +0000 UTC m=+22.296607797" observedRunningTime="2026-04-22 18:36:36.793643457 +0000 UTC
m=+22.819257124" watchObservedRunningTime="2026-04-22 18:36:36.793751229 +0000 UTC m=+22.819364907"
Apr 22 18:36:37.598830 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:37.598785 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9t8m2"
Apr 22 18:36:37.599022 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:37.598785 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dt6rs"
Apr 22 18:36:37.599022 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:37.598931 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9t8m2" podUID="617c16ac-507f-45a2-ab75-d583c7798ca1"
Apr 22 18:36:37.599140 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:37.599035 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dt6rs" podUID="a97e9a8d-908d-409a-b079-4fee4c52cdcd"
Apr 22 18:36:38.289620 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:38.289579 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-w5xvx"
Apr 22 18:36:39.014963 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:39.014932 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-w5xvx"
Apr 22 18:36:39.015624 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:39.015606 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-w5xvx"
Apr 22 18:36:39.599471 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:39.599277 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9t8m2"
Apr 22 18:36:39.599869 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:39.599304 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dt6rs"
Apr 22 18:36:39.599869 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:39.599578 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9t8m2" podUID="617c16ac-507f-45a2-ab75-d583c7798ca1"
Apr 22 18:36:39.599869 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:39.599608 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dt6rs" podUID="a97e9a8d-908d-409a-b079-4fee4c52cdcd"
Apr 22 18:36:39.713940 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:39.713907 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-q9vgr"]
Apr 22 18:36:39.723128 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:39.723102 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-q9vgr"
Apr 22 18:36:39.725975 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:39.725951 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 18:36:39.726101 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:39.725950 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 18:36:39.726101 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:39.726093 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-44zj9\""
Apr 22 18:36:39.780292 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:39.780264 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log"
Apr 22 18:36:39.780923 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:39.780757 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r65js" event={"ID":"97302881-ab30-4630-9df7-e7796d6aaedf","Type":"ContainerStarted","Data":"b86291b3b63f5b41fc5f54b773511d7f6c53648db255a828667c613c55d49b1a"}
Apr 22 18:36:39.781366 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:39.781344 2577 scope.go:117] "RemoveContainer" containerID="85f7ed52a04fb5584cf66189c848ea4cb54434c2ead3f3eff11f153792373777"
Apr 22 18:36:39.781704 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:39.781684 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-w5xvx"
Apr 22 18:36:39.865640 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:39.865618 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5f2ded72-fa8f-446c-863c-2699e86ba162-hosts-file\") pod \"node-resolver-q9vgr\" (UID: \"5f2ded72-fa8f-446c-863c-2699e86ba162\") " pod="openshift-dns/node-resolver-q9vgr"
Apr 22 18:36:39.865750 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:39.865666 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5f2ded72-fa8f-446c-863c-2699e86ba162-tmp-dir\") pod \"node-resolver-q9vgr\" (UID: \"5f2ded72-fa8f-446c-863c-2699e86ba162\") " pod="openshift-dns/node-resolver-q9vgr"
Apr 22 18:36:39.865750 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:39.865688 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlbvm\" (UniqueName: \"kubernetes.io/projected/5f2ded72-fa8f-446c-863c-2699e86ba162-kube-api-access-dlbvm\") pod \"node-resolver-q9vgr\" (UID: \"5f2ded72-fa8f-446c-863c-2699e86ba162\") " pod="openshift-dns/node-resolver-q9vgr"
Apr 22 18:36:39.966343 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:39.966280 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5f2ded72-fa8f-446c-863c-2699e86ba162-hosts-file\") pod \"node-resolver-q9vgr\" (UID: \"5f2ded72-fa8f-446c-863c-2699e86ba162\") " pod="openshift-dns/node-resolver-q9vgr"
Apr 22 18:36:39.966343 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:39.966314 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName:
\"kubernetes.io/empty-dir/5f2ded72-fa8f-446c-863c-2699e86ba162-tmp-dir\") pod \"node-resolver-q9vgr\" (UID: \"5f2ded72-fa8f-446c-863c-2699e86ba162\") " pod="openshift-dns/node-resolver-q9vgr"
Apr 22 18:36:39.966343 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:39.966331 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dlbvm\" (UniqueName: \"kubernetes.io/projected/5f2ded72-fa8f-446c-863c-2699e86ba162-kube-api-access-dlbvm\") pod \"node-resolver-q9vgr\" (UID: \"5f2ded72-fa8f-446c-863c-2699e86ba162\") " pod="openshift-dns/node-resolver-q9vgr"
Apr 22 18:36:39.966509 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:39.966414 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5f2ded72-fa8f-446c-863c-2699e86ba162-hosts-file\") pod \"node-resolver-q9vgr\" (UID: \"5f2ded72-fa8f-446c-863c-2699e86ba162\") " pod="openshift-dns/node-resolver-q9vgr"
Apr 22 18:36:39.966602 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:39.966582 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5f2ded72-fa8f-446c-863c-2699e86ba162-tmp-dir\") pod \"node-resolver-q9vgr\" (UID: \"5f2ded72-fa8f-446c-863c-2699e86ba162\") " pod="openshift-dns/node-resolver-q9vgr"
Apr 22 18:36:39.977632 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:39.977601 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlbvm\" (UniqueName: \"kubernetes.io/projected/5f2ded72-fa8f-446c-863c-2699e86ba162-kube-api-access-dlbvm\") pod \"node-resolver-q9vgr\" (UID: \"5f2ded72-fa8f-446c-863c-2699e86ba162\") " pod="openshift-dns/node-resolver-q9vgr"
Apr 22 18:36:40.032563 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:40.032533 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-q9vgr"
Apr 22 18:36:40.039736 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:40.039709 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f2ded72_fa8f_446c_863c_2699e86ba162.slice/crio-328f7228e4bc0ff80ea9c6b3fdaeb9c0d9b5865e2e1e7302f1b99b9b1b770477 WatchSource:0}: Error finding container 328f7228e4bc0ff80ea9c6b3fdaeb9c0d9b5865e2e1e7302f1b99b9b1b770477: Status 404 returned error can't find the container with id 328f7228e4bc0ff80ea9c6b3fdaeb9c0d9b5865e2e1e7302f1b99b9b1b770477
Apr 22 18:36:40.784160 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:40.783941 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-q9vgr" event={"ID":"5f2ded72-fa8f-446c-863c-2699e86ba162","Type":"ContainerStarted","Data":"64f19f3b28f1ce7d150d776d188de60305875a06e26451960b63f897677e275a"}
Apr 22 18:36:40.785148 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:40.784177 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-q9vgr" event={"ID":"5f2ded72-fa8f-446c-863c-2699e86ba162","Type":"ContainerStarted","Data":"328f7228e4bc0ff80ea9c6b3fdaeb9c0d9b5865e2e1e7302f1b99b9b1b770477"}
Apr 22 18:36:40.787952 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:40.787923 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log"
Apr 22 18:36:40.788363 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:40.788328 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r65js" event={"ID":"97302881-ab30-4630-9df7-e7796d6aaedf","Type":"ContainerStarted","Data":"f8faca544ba30a351724060ca9fa3cfcb42eb8dce0842010d2580039e195ab24"}
Apr 22 18:36:40.788944 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:40.788915 2577 kubelet.go:2658] "SyncLoop (probe)"
probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-r65js"
Apr 22 18:36:40.788944 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:40.788947 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-r65js"
Apr 22 18:36:40.789081 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:40.788960 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-r65js"
Apr 22 18:36:40.791696 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:40.791600 2577 generic.go:358] "Generic (PLEG): container finished" podID="1ca4dacd-dae4-4a9c-a9f6-96d152edbeac" containerID="245c0ba3d37e58c4022dd0d978210326549308832108d47db2e9bd094d5302ea" exitCode=0
Apr 22 18:36:40.791867 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:40.791834 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-knnz5" event={"ID":"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac","Type":"ContainerDied","Data":"245c0ba3d37e58c4022dd0d978210326549308832108d47db2e9bd094d5302ea"}
Apr 22 18:36:40.808993 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:40.808964 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r65js"
Apr 22 18:36:40.809143 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:40.809069 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r65js"
Apr 22 18:36:40.839454 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:40.839384 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-q9vgr" podStartSLOduration=1.8393605370000001 podStartE2EDuration="1.839360537s" podCreationTimestamp="2026-04-22 18:36:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:36:40.804004569 +0000 UTC m=+26.829618236" watchObservedRunningTime="2026-04-22 18:36:40.839360537 +0000 UTC m=+26.864974210"
Apr 22 18:36:40.871569 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:40.871508 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-r65js" podStartSLOduration=9.035768104 podStartE2EDuration="26.871489197s" podCreationTimestamp="2026-04-22 18:36:14 +0000 UTC" firstStartedPulling="2026-04-22 18:36:15.886628266 +0000 UTC m=+1.912241912" lastFinishedPulling="2026-04-22 18:36:33.72234935 +0000 UTC m=+19.747963005" observedRunningTime="2026-04-22 18:36:40.871049413 +0000 UTC m=+26.896663080" watchObservedRunningTime="2026-04-22 18:36:40.871489197 +0000 UTC m=+26.897102865"
Apr 22 18:36:41.505979 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:41.505952 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dt6rs"]
Apr 22 18:36:41.506133 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:41.506091 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dt6rs"
Apr 22 18:36:41.506228 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:41.506205 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-dt6rs" podUID="a97e9a8d-908d-409a-b079-4fee4c52cdcd"
Apr 22 18:36:41.506818 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:41.506790 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9t8m2"]
Apr 22 18:36:41.506982 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:41.506923 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9t8m2"
Apr 22 18:36:41.507096 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:41.507070 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9t8m2" podUID="617c16ac-507f-45a2-ab75-d583c7798ca1"
Apr 22 18:36:41.795417 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:41.795327 2577 generic.go:358] "Generic (PLEG): container finished" podID="1ca4dacd-dae4-4a9c-a9f6-96d152edbeac" containerID="566565f6c0b6f3497b42dc784c8c2a8586d795d23027ff969f356b60ad39a5d5" exitCode=0
Apr 22 18:36:41.795826 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:41.795405 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-knnz5" event={"ID":"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac","Type":"ContainerDied","Data":"566565f6c0b6f3497b42dc784c8c2a8586d795d23027ff969f356b60ad39a5d5"}
Apr 22 18:36:42.802367 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:42.802135 2577 generic.go:358] "Generic (PLEG): container finished" podID="1ca4dacd-dae4-4a9c-a9f6-96d152edbeac" containerID="0bbb97d519413f6abd3d83a6ff2805a8f9af652927cb2e19f30ea996a31184ce" exitCode=0
Apr 22 18:36:42.802367 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:42.802222 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-knnz5" event={"ID":"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac","Type":"ContainerDied","Data":"0bbb97d519413f6abd3d83a6ff2805a8f9af652927cb2e19f30ea996a31184ce"}
Apr 22 18:36:43.598631 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:43.598590 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dt6rs"
Apr 22 18:36:43.598901 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:43.598590 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9t8m2"
Apr 22 18:36:43.598901 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:43.598741 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dt6rs" podUID="a97e9a8d-908d-409a-b079-4fee4c52cdcd"
Apr 22 18:36:43.598901 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:43.598775 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9t8m2" podUID="617c16ac-507f-45a2-ab75-d583c7798ca1"
Apr 22 18:36:45.204730 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.204692 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-98mhg"]
Apr 22 18:36:45.210929 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.210904 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-98mhg"
Apr 22 18:36:45.211085 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:45.210992 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-98mhg" podUID="ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0"
Apr 22 18:36:45.216489 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.216458 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-98mhg"]
Apr 22 18:36:45.301686 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.301633 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0-dbus\") pod \"global-pull-secret-syncer-98mhg\" (UID: \"ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0\") " pod="kube-system/global-pull-secret-syncer-98mhg"
Apr 22 18:36:45.301879 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.301695 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0-original-pull-secret\") pod \"global-pull-secret-syncer-98mhg\" (UID: \"ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0\") " pod="kube-system/global-pull-secret-syncer-98mhg"
Apr 22 18:36:45.301879 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.301727 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0-kubelet-config\") pod \"global-pull-secret-syncer-98mhg\" (UID: \"ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0\") " pod="kube-system/global-pull-secret-syncer-98mhg"
Apr 22 18:36:45.402630 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.402589 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0-original-pull-secret\") pod \"global-pull-secret-syncer-98mhg\" (UID: \"ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0\") " pod="kube-system/global-pull-secret-syncer-98mhg"
Apr 22 18:36:45.402831 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.402644 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0-kubelet-config\") pod \"global-pull-secret-syncer-98mhg\" (UID: \"ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0\") " pod="kube-system/global-pull-secret-syncer-98mhg"
Apr 22 18:36:45.402831 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.402748 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0-dbus\") pod \"global-pull-secret-syncer-98mhg\" (UID: \"ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0\") " pod="kube-system/global-pull-secret-syncer-98mhg"
Apr 22 18:36:45.402831 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:45.402767 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:36:45.402831 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.402800 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0-kubelet-config\") pod \"global-pull-secret-syncer-98mhg\" (UID: \"ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0\") " pod="kube-system/global-pull-secret-syncer-98mhg"
Apr 22 18:36:45.403008 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:45.402847 2577
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0-original-pull-secret podName:ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:45.902825289 +0000 UTC m=+31.928438937 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0-original-pull-secret") pod "global-pull-secret-syncer-98mhg" (UID: "ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:36:45.403008 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.402991 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0-dbus\") pod \"global-pull-secret-syncer-98mhg\" (UID: \"ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0\") " pod="kube-system/global-pull-secret-syncer-98mhg"
Apr 22 18:36:45.599432 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.599394 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9t8m2"
Apr 22 18:36:45.599626 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.599393 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dt6rs"
Apr 22 18:36:45.599626 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:45.599537 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9t8m2" podUID="617c16ac-507f-45a2-ab75-d583c7798ca1"
Apr 22 18:36:45.599626 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:45.599605 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dt6rs" podUID="a97e9a8d-908d-409a-b079-4fee4c52cdcd"
Apr 22 18:36:45.807971 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.807900 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-98mhg"
Apr 22 18:36:45.808119 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:45.808031 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-98mhg" podUID="ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0" Apr 22 18:36:45.821641 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.821610 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-204.ec2.internal" event="NodeReady" Apr 22 18:36:45.821819 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.821812 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 18:36:45.869687 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.869567 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66658f9bb6-7qttb"] Apr 22 18:36:45.896167 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.896133 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66658f9bb6-7qttb"] Apr 22 18:36:45.896167 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.896170 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-msgd4"] Apr 22 18:36:45.896390 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.896243 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:36:45.899821 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.899788 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 18:36:45.900015 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.899990 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 18:36:45.900126 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.900113 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-8gvfs\"" Apr 22 18:36:45.900349 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.900331 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 18:36:45.905125 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.904673 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0-original-pull-secret\") pod \"global-pull-secret-syncer-98mhg\" (UID: \"ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0\") " pod="kube-system/global-pull-secret-syncer-98mhg" Apr 22 18:36:45.905125 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:45.904803 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:45.905125 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:45.904860 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0-original-pull-secret podName:ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:36:46.904841971 +0000 UTC m=+32.930455616 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0-original-pull-secret") pod "global-pull-secret-syncer-98mhg" (UID: "ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:45.917086 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.916581 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-k9tbm"] Apr 22 18:36:45.917086 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.916793 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-msgd4" Apr 22 18:36:45.918947 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.918919 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 18:36:45.919590 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.919560 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 18:36:45.919910 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.919889 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 18:36:45.920326 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.920309 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xfhxc\"" Apr 22 18:36:45.941017 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.940990 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-msgd4"] Apr 22 18:36:45.941017 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.941024 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-k9tbm"] Apr 22 
18:36:45.941249 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.941137 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k9tbm" Apr 22 18:36:45.943906 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.943873 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 18:36:45.944030 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.943925 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 18:36:45.944030 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.944001 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 18:36:45.944143 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:45.944046 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-x8r7b\"" Apr 22 18:36:46.005354 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.005307 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e36af8a-7247-492e-8451-b00362b2dcac-config-volume\") pod \"dns-default-msgd4\" (UID: \"9e36af8a-7247-492e-8451-b00362b2dcac\") " pod="openshift-dns/dns-default-msgd4" Apr 22 18:36:46.005354 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.005357 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9e36af8a-7247-492e-8451-b00362b2dcac-tmp-dir\") pod \"dns-default-msgd4\" (UID: \"9e36af8a-7247-492e-8451-b00362b2dcac\") " pod="openshift-dns/dns-default-msgd4" Apr 22 18:36:46.005590 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.005400 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d556130-66df-4a4c-baae-5f7294e0bc35-cert\") pod \"ingress-canary-k9tbm\" (UID: \"3d556130-66df-4a4c-baae-5f7294e0bc35\") " pod="openshift-ingress-canary/ingress-canary-k9tbm" Apr 22 18:36:46.005590 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.005430 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a4f3346-013e-4759-8acd-a577393f5c37-registry-certificates\") pod \"image-registry-66658f9bb6-7qttb\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:36:46.005590 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.005494 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcrg2\" (UniqueName: \"kubernetes.io/projected/9e36af8a-7247-492e-8451-b00362b2dcac-kube-api-access-tcrg2\") pod \"dns-default-msgd4\" (UID: \"9e36af8a-7247-492e-8451-b00362b2dcac\") " pod="openshift-dns/dns-default-msgd4" Apr 22 18:36:46.005590 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.005561 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9e36af8a-7247-492e-8451-b00362b2dcac-metrics-tls\") pod \"dns-default-msgd4\" (UID: \"9e36af8a-7247-492e-8451-b00362b2dcac\") " pod="openshift-dns/dns-default-msgd4" Apr 22 18:36:46.005772 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.005597 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a4f3346-013e-4759-8acd-a577393f5c37-trusted-ca\") pod \"image-registry-66658f9bb6-7qttb\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " 
pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:36:46.005772 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.005619 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f9h9\" (UniqueName: \"kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-kube-api-access-7f9h9\") pod \"image-registry-66658f9bb6-7qttb\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:36:46.005772 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.005684 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-registry-tls\") pod \"image-registry-66658f9bb6-7qttb\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:36:46.005772 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.005717 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a4f3346-013e-4759-8acd-a577393f5c37-installation-pull-secrets\") pod \"image-registry-66658f9bb6-7qttb\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:36:46.005772 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.005763 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0a4f3346-013e-4759-8acd-a577393f5c37-image-registry-private-configuration\") pod \"image-registry-66658f9bb6-7qttb\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:36:46.005960 ip-10-0-132-204 kubenswrapper[2577]: I0422 
18:36:46.005824 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a4f3346-013e-4759-8acd-a577393f5c37-ca-trust-extracted\") pod \"image-registry-66658f9bb6-7qttb\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:36:46.005960 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.005852 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-bound-sa-token\") pod \"image-registry-66658f9bb6-7qttb\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:36:46.005960 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.005874 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg94f\" (UniqueName: \"kubernetes.io/projected/3d556130-66df-4a4c-baae-5f7294e0bc35-kube-api-access-qg94f\") pod \"ingress-canary-k9tbm\" (UID: \"3d556130-66df-4a4c-baae-5f7294e0bc35\") " pod="openshift-ingress-canary/ingress-canary-k9tbm" Apr 22 18:36:46.106704 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.106588 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9e36af8a-7247-492e-8451-b00362b2dcac-metrics-tls\") pod \"dns-default-msgd4\" (UID: \"9e36af8a-7247-492e-8451-b00362b2dcac\") " pod="openshift-dns/dns-default-msgd4" Apr 22 18:36:46.106704 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.106672 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a4f3346-013e-4759-8acd-a577393f5c37-trusted-ca\") pod \"image-registry-66658f9bb6-7qttb\" (UID: 
\"0a4f3346-013e-4759-8acd-a577393f5c37\") " pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:36:46.106704 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.106702 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7f9h9\" (UniqueName: \"kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-kube-api-access-7f9h9\") pod \"image-registry-66658f9bb6-7qttb\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:36:46.106983 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.106735 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-registry-tls\") pod \"image-registry-66658f9bb6-7qttb\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:36:46.106983 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.106766 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a4f3346-013e-4759-8acd-a577393f5c37-installation-pull-secrets\") pod \"image-registry-66658f9bb6-7qttb\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:36:46.106983 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:46.106783 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:36:46.106983 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.106806 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0a4f3346-013e-4759-8acd-a577393f5c37-image-registry-private-configuration\") pod \"image-registry-66658f9bb6-7qttb\" (UID: 
\"0a4f3346-013e-4759-8acd-a577393f5c37\") " pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:36:46.106983 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:46.106843 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:36:46.106983 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:46.106862 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66658f9bb6-7qttb: secret "image-registry-tls" not found Apr 22 18:36:46.106983 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:46.106868 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e36af8a-7247-492e-8451-b00362b2dcac-metrics-tls podName:9e36af8a-7247-492e-8451-b00362b2dcac nodeName:}" failed. No retries permitted until 2026-04-22 18:36:46.606847819 +0000 UTC m=+32.632461470 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9e36af8a-7247-492e-8451-b00362b2dcac-metrics-tls") pod "dns-default-msgd4" (UID: "9e36af8a-7247-492e-8451-b00362b2dcac") : secret "dns-default-metrics-tls" not found Apr 22 18:36:46.106983 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:46.106912 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-registry-tls podName:0a4f3346-013e-4759-8acd-a577393f5c37 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:46.606894815 +0000 UTC m=+32.632508459 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-registry-tls") pod "image-registry-66658f9bb6-7qttb" (UID: "0a4f3346-013e-4759-8acd-a577393f5c37") : secret "image-registry-tls" not found Apr 22 18:36:46.107372 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.106976 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a4f3346-013e-4759-8acd-a577393f5c37-ca-trust-extracted\") pod \"image-registry-66658f9bb6-7qttb\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:36:46.107372 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.107026 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-bound-sa-token\") pod \"image-registry-66658f9bb6-7qttb\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:36:46.107372 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.107053 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qg94f\" (UniqueName: \"kubernetes.io/projected/3d556130-66df-4a4c-baae-5f7294e0bc35-kube-api-access-qg94f\") pod \"ingress-canary-k9tbm\" (UID: \"3d556130-66df-4a4c-baae-5f7294e0bc35\") " pod="openshift-ingress-canary/ingress-canary-k9tbm" Apr 22 18:36:46.107372 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.107098 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e36af8a-7247-492e-8451-b00362b2dcac-config-volume\") pod \"dns-default-msgd4\" (UID: \"9e36af8a-7247-492e-8451-b00362b2dcac\") " pod="openshift-dns/dns-default-msgd4" Apr 22 18:36:46.107372 ip-10-0-132-204 
kubenswrapper[2577]: I0422 18:36:46.107123 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9e36af8a-7247-492e-8451-b00362b2dcac-tmp-dir\") pod \"dns-default-msgd4\" (UID: \"9e36af8a-7247-492e-8451-b00362b2dcac\") " pod="openshift-dns/dns-default-msgd4" Apr 22 18:36:46.107372 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.107154 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d556130-66df-4a4c-baae-5f7294e0bc35-cert\") pod \"ingress-canary-k9tbm\" (UID: \"3d556130-66df-4a4c-baae-5f7294e0bc35\") " pod="openshift-ingress-canary/ingress-canary-k9tbm" Apr 22 18:36:46.107372 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.107183 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a4f3346-013e-4759-8acd-a577393f5c37-registry-certificates\") pod \"image-registry-66658f9bb6-7qttb\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:36:46.107372 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.107213 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tcrg2\" (UniqueName: \"kubernetes.io/projected/9e36af8a-7247-492e-8451-b00362b2dcac-kube-api-access-tcrg2\") pod \"dns-default-msgd4\" (UID: \"9e36af8a-7247-492e-8451-b00362b2dcac\") " pod="openshift-dns/dns-default-msgd4" Apr 22 18:36:46.107788 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.107389 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a4f3346-013e-4759-8acd-a577393f5c37-ca-trust-extracted\") pod \"image-registry-66658f9bb6-7qttb\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" 
Apr 22 18:36:46.107788 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:46.107479 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:36:46.107788 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:46.107519 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d556130-66df-4a4c-baae-5f7294e0bc35-cert podName:3d556130-66df-4a4c-baae-5f7294e0bc35 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:46.607506001 +0000 UTC m=+32.633119658 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d556130-66df-4a4c-baae-5f7294e0bc35-cert") pod "ingress-canary-k9tbm" (UID: "3d556130-66df-4a4c-baae-5f7294e0bc35") : secret "canary-serving-cert" not found Apr 22 18:36:46.107788 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.107584 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9e36af8a-7247-492e-8451-b00362b2dcac-tmp-dir\") pod \"dns-default-msgd4\" (UID: \"9e36af8a-7247-492e-8451-b00362b2dcac\") " pod="openshift-dns/dns-default-msgd4" Apr 22 18:36:46.107983 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.107936 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a4f3346-013e-4759-8acd-a577393f5c37-registry-certificates\") pod \"image-registry-66658f9bb6-7qttb\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:36:46.107983 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.107962 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a4f3346-013e-4759-8acd-a577393f5c37-trusted-ca\") pod \"image-registry-66658f9bb6-7qttb\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " 
pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:36:46.108072 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.107996 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e36af8a-7247-492e-8451-b00362b2dcac-config-volume\") pod \"dns-default-msgd4\" (UID: \"9e36af8a-7247-492e-8451-b00362b2dcac\") " pod="openshift-dns/dns-default-msgd4" Apr 22 18:36:46.111672 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.111605 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0a4f3346-013e-4759-8acd-a577393f5c37-image-registry-private-configuration\") pod \"image-registry-66658f9bb6-7qttb\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:36:46.111672 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.111617 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a4f3346-013e-4759-8acd-a577393f5c37-installation-pull-secrets\") pod \"image-registry-66658f9bb6-7qttb\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:36:46.117233 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.117207 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-bound-sa-token\") pod \"image-registry-66658f9bb6-7qttb\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:36:46.117384 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.117296 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f9h9\" (UniqueName: 
\"kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-kube-api-access-7f9h9\") pod \"image-registry-66658f9bb6-7qttb\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:36:46.117384 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.117336 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcrg2\" (UniqueName: \"kubernetes.io/projected/9e36af8a-7247-492e-8451-b00362b2dcac-kube-api-access-tcrg2\") pod \"dns-default-msgd4\" (UID: \"9e36af8a-7247-492e-8451-b00362b2dcac\") " pod="openshift-dns/dns-default-msgd4" Apr 22 18:36:46.118219 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.118194 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg94f\" (UniqueName: \"kubernetes.io/projected/3d556130-66df-4a4c-baae-5f7294e0bc35-kube-api-access-qg94f\") pod \"ingress-canary-k9tbm\" (UID: \"3d556130-66df-4a4c-baae-5f7294e0bc35\") " pod="openshift-ingress-canary/ingress-canary-k9tbm" Apr 22 18:36:46.610876 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.610842 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d556130-66df-4a4c-baae-5f7294e0bc35-cert\") pod \"ingress-canary-k9tbm\" (UID: \"3d556130-66df-4a4c-baae-5f7294e0bc35\") " pod="openshift-ingress-canary/ingress-canary-k9tbm" Apr 22 18:36:46.611500 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.610896 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9e36af8a-7247-492e-8451-b00362b2dcac-metrics-tls\") pod \"dns-default-msgd4\" (UID: \"9e36af8a-7247-492e-8451-b00362b2dcac\") " pod="openshift-dns/dns-default-msgd4" Apr 22 18:36:46.611500 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.610957 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" 
(UniqueName: \"kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-registry-tls\") pod \"image-registry-66658f9bb6-7qttb\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:36:46.611500 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:46.611012 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:36:46.611500 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:46.611071 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e36af8a-7247-492e-8451-b00362b2dcac-metrics-tls podName:9e36af8a-7247-492e-8451-b00362b2dcac nodeName:}" failed. No retries permitted until 2026-04-22 18:36:47.611053051 +0000 UTC m=+33.636666696 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9e36af8a-7247-492e-8451-b00362b2dcac-metrics-tls") pod "dns-default-msgd4" (UID: "9e36af8a-7247-492e-8451-b00362b2dcac") : secret "dns-default-metrics-tls" not found Apr 22 18:36:46.611500 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:46.611073 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:36:46.611500 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:46.611086 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66658f9bb6-7qttb: secret "image-registry-tls" not found Apr 22 18:36:46.611500 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:46.611117 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-registry-tls podName:0a4f3346-013e-4759-8acd-a577393f5c37 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:47.611106941 +0000 UTC m=+33.636720594 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-registry-tls") pod "image-registry-66658f9bb6-7qttb" (UID: "0a4f3346-013e-4759-8acd-a577393f5c37") : secret "image-registry-tls" not found Apr 22 18:36:46.611500 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:46.611174 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:36:46.611500 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:46.611241 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d556130-66df-4a4c-baae-5f7294e0bc35-cert podName:3d556130-66df-4a4c-baae-5f7294e0bc35 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:47.611225217 +0000 UTC m=+33.636838867 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d556130-66df-4a4c-baae-5f7294e0bc35-cert") pod "ingress-canary-k9tbm" (UID: "3d556130-66df-4a4c-baae-5f7294e0bc35") : secret "canary-serving-cert" not found Apr 22 18:36:46.913684 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:46.913565 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0-original-pull-secret\") pod \"global-pull-secret-syncer-98mhg\" (UID: \"ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0\") " pod="kube-system/global-pull-secret-syncer-98mhg" Apr 22 18:36:46.913856 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:46.913758 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:46.913924 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:46.913874 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0-original-pull-secret 
podName:ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:48.913854106 +0000 UTC m=+34.939467761 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0-original-pull-secret") pod "global-pull-secret-syncer-98mhg" (UID: "ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:47.215517 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:47.215469 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/617c16ac-507f-45a2-ab75-d583c7798ca1-metrics-certs\") pod \"network-metrics-daemon-9t8m2\" (UID: \"617c16ac-507f-45a2-ab75-d583c7798ca1\") " pod="openshift-multus/network-metrics-daemon-9t8m2" Apr 22 18:36:47.215711 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:47.215639 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:47.215770 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:47.215722 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/617c16ac-507f-45a2-ab75-d583c7798ca1-metrics-certs podName:617c16ac-507f-45a2-ab75-d583c7798ca1 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:19.215707068 +0000 UTC m=+65.241320716 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/617c16ac-507f-45a2-ab75-d583c7798ca1-metrics-certs") pod "network-metrics-daemon-9t8m2" (UID: "617c16ac-507f-45a2-ab75-d583c7798ca1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:47.299335 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:47.299296 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-ngrmj"] Apr 22 18:36:47.316143 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:47.316108 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnbzs\" (UniqueName: \"kubernetes.io/projected/a97e9a8d-908d-409a-b079-4fee4c52cdcd-kube-api-access-dnbzs\") pod \"network-check-target-dt6rs\" (UID: \"a97e9a8d-908d-409a-b079-4fee4c52cdcd\") " pod="openshift-network-diagnostics/network-check-target-dt6rs" Apr 22 18:36:47.316316 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:47.316282 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:36:47.316316 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:47.316305 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:36:47.316411 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:47.316318 2577 projected.go:194] Error preparing data for projected volume kube-api-access-dnbzs for pod openshift-network-diagnostics/network-check-target-dt6rs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:47.316411 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:47.316387 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/a97e9a8d-908d-409a-b079-4fee4c52cdcd-kube-api-access-dnbzs podName:a97e9a8d-908d-409a-b079-4fee4c52cdcd nodeName:}" failed. No retries permitted until 2026-04-22 18:37:19.316364374 +0000 UTC m=+65.341978021 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-dnbzs" (UniqueName: "kubernetes.io/projected/a97e9a8d-908d-409a-b079-4fee4c52cdcd-kube-api-access-dnbzs") pod "network-check-target-dt6rs" (UID: "a97e9a8d-908d-409a-b079-4fee4c52cdcd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:47.332301 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:47.332275 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-ngrmj"] Apr 22 18:36:47.332435 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:47.332398 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngrmj" Apr 22 18:36:47.338476 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:47.338415 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 22 18:36:47.338757 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:47.338641 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-8vwk8\"" Apr 22 18:36:47.339701 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:47.339679 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 22 18:36:47.416780 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:47.416744 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/26b34a86-850d-4fe7-87a3-10448b9b73a3-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-ngrmj\" (UID: \"26b34a86-850d-4fe7-87a3-10448b9b73a3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngrmj" Apr 22 18:36:47.416780 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:47.416784 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26b34a86-850d-4fe7-87a3-10448b9b73a3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ngrmj\" (UID: \"26b34a86-850d-4fe7-87a3-10448b9b73a3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngrmj" Apr 22 18:36:47.517445 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:47.517364 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/26b34a86-850d-4fe7-87a3-10448b9b73a3-nginx-conf\") pod 
\"networking-console-plugin-cb95c66f6-ngrmj\" (UID: \"26b34a86-850d-4fe7-87a3-10448b9b73a3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngrmj" Apr 22 18:36:47.517445 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:47.517405 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26b34a86-850d-4fe7-87a3-10448b9b73a3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ngrmj\" (UID: \"26b34a86-850d-4fe7-87a3-10448b9b73a3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngrmj" Apr 22 18:36:47.517686 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:47.517605 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:36:47.517747 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:47.517720 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26b34a86-850d-4fe7-87a3-10448b9b73a3-networking-console-plugin-cert podName:26b34a86-850d-4fe7-87a3-10448b9b73a3 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:48.017696269 +0000 UTC m=+34.043309928 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/26b34a86-850d-4fe7-87a3-10448b9b73a3-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ngrmj" (UID: "26b34a86-850d-4fe7-87a3-10448b9b73a3") : secret "networking-console-plugin-cert" not found Apr 22 18:36:47.518064 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:47.518044 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/26b34a86-850d-4fe7-87a3-10448b9b73a3-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-ngrmj\" (UID: \"26b34a86-850d-4fe7-87a3-10448b9b73a3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngrmj" Apr 22 18:36:47.599321 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:47.599263 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-98mhg" Apr 22 18:36:47.599495 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:47.599275 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dt6rs" Apr 22 18:36:47.599495 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:47.599275 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9t8m2" Apr 22 18:36:47.602723 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:47.602421 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:36:47.602723 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:47.602708 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-95dwn\"" Apr 22 18:36:47.603036 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:47.603019 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:36:47.603811 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:47.603407 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-zwsxn\"" Apr 22 18:36:47.603811 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:47.603594 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 18:36:47.603811 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:47.603711 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:36:47.618629 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:47.618601 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9e36af8a-7247-492e-8451-b00362b2dcac-metrics-tls\") pod \"dns-default-msgd4\" (UID: \"9e36af8a-7247-492e-8451-b00362b2dcac\") " pod="openshift-dns/dns-default-msgd4" Apr 22 18:36:47.619100 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:47.618685 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-registry-tls\") pod \"image-registry-66658f9bb6-7qttb\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:36:47.619100 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:47.618730 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:36:47.619100 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:47.618781 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d556130-66df-4a4c-baae-5f7294e0bc35-cert\") pod \"ingress-canary-k9tbm\" (UID: \"3d556130-66df-4a4c-baae-5f7294e0bc35\") " pod="openshift-ingress-canary/ingress-canary-k9tbm" Apr 22 18:36:47.619100 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:47.618796 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e36af8a-7247-492e-8451-b00362b2dcac-metrics-tls podName:9e36af8a-7247-492e-8451-b00362b2dcac nodeName:}" failed. No retries permitted until 2026-04-22 18:36:49.618775814 +0000 UTC m=+35.644389479 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9e36af8a-7247-492e-8451-b00362b2dcac-metrics-tls") pod "dns-default-msgd4" (UID: "9e36af8a-7247-492e-8451-b00362b2dcac") : secret "dns-default-metrics-tls" not found Apr 22 18:36:47.619100 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:47.618842 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:36:47.619100 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:47.618858 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:36:47.619100 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:47.618859 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66658f9bb6-7qttb: secret "image-registry-tls" not found Apr 22 18:36:47.619100 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:47.618903 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d556130-66df-4a4c-baae-5f7294e0bc35-cert podName:3d556130-66df-4a4c-baae-5f7294e0bc35 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:49.618888128 +0000 UTC m=+35.644501776 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d556130-66df-4a4c-baae-5f7294e0bc35-cert") pod "ingress-canary-k9tbm" (UID: "3d556130-66df-4a4c-baae-5f7294e0bc35") : secret "canary-serving-cert" not found Apr 22 18:36:47.619100 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:47.618980 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-registry-tls podName:0a4f3346-013e-4759-8acd-a577393f5c37 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:49.618960381 +0000 UTC m=+35.644574028 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-registry-tls") pod "image-registry-66658f9bb6-7qttb" (UID: "0a4f3346-013e-4759-8acd-a577393f5c37") : secret "image-registry-tls" not found Apr 22 18:36:48.022111 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:48.022069 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26b34a86-850d-4fe7-87a3-10448b9b73a3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ngrmj\" (UID: \"26b34a86-850d-4fe7-87a3-10448b9b73a3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngrmj" Apr 22 18:36:48.022296 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:48.022242 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:36:48.022340 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:48.022311 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26b34a86-850d-4fe7-87a3-10448b9b73a3-networking-console-plugin-cert podName:26b34a86-850d-4fe7-87a3-10448b9b73a3 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:49.022295925 +0000 UTC m=+35.047909570 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/26b34a86-850d-4fe7-87a3-10448b9b73a3-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ngrmj" (UID: "26b34a86-850d-4fe7-87a3-10448b9b73a3") : secret "networking-console-plugin-cert" not found Apr 22 18:36:48.928800 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:48.928762 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0-original-pull-secret\") pod \"global-pull-secret-syncer-98mhg\" (UID: \"ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0\") " pod="kube-system/global-pull-secret-syncer-98mhg" Apr 22 18:36:48.931222 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:48.931191 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0-original-pull-secret\") pod \"global-pull-secret-syncer-98mhg\" (UID: \"ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0\") " pod="kube-system/global-pull-secret-syncer-98mhg" Apr 22 18:36:49.029797 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:49.029755 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26b34a86-850d-4fe7-87a3-10448b9b73a3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ngrmj\" (UID: \"26b34a86-850d-4fe7-87a3-10448b9b73a3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngrmj" Apr 22 18:36:49.029964 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:49.029918 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:36:49.030022 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:49.029993 2577 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/26b34a86-850d-4fe7-87a3-10448b9b73a3-networking-console-plugin-cert podName:26b34a86-850d-4fe7-87a3-10448b9b73a3 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:51.029971853 +0000 UTC m=+37.055585499 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/26b34a86-850d-4fe7-87a3-10448b9b73a3-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ngrmj" (UID: "26b34a86-850d-4fe7-87a3-10448b9b73a3") : secret "networking-console-plugin-cert" not found Apr 22 18:36:49.112025 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:49.111991 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-98mhg" Apr 22 18:36:49.278522 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:49.278275 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-98mhg"] Apr 22 18:36:49.281689 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:49.281637 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab4eb5fe_f1c7_4f29_886f_c980c0fff0e0.slice/crio-d04bf5372a119bd456d84896659f14a2cec7bfb6f129dfc686ab9d6deafb2b8f WatchSource:0}: Error finding container d04bf5372a119bd456d84896659f14a2cec7bfb6f129dfc686ab9d6deafb2b8f: Status 404 returned error can't find the container with id d04bf5372a119bd456d84896659f14a2cec7bfb6f129dfc686ab9d6deafb2b8f Apr 22 18:36:49.634333 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:49.634243 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d556130-66df-4a4c-baae-5f7294e0bc35-cert\") pod \"ingress-canary-k9tbm\" (UID: \"3d556130-66df-4a4c-baae-5f7294e0bc35\") " pod="openshift-ingress-canary/ingress-canary-k9tbm" Apr 22 18:36:49.634333 ip-10-0-132-204 
kubenswrapper[2577]: I0422 18:36:49.634296 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9e36af8a-7247-492e-8451-b00362b2dcac-metrics-tls\") pod \"dns-default-msgd4\" (UID: \"9e36af8a-7247-492e-8451-b00362b2dcac\") " pod="openshift-dns/dns-default-msgd4" Apr 22 18:36:49.634507 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:49.634390 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:36:49.634507 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:49.634399 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:36:49.634507 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:49.634439 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e36af8a-7247-492e-8451-b00362b2dcac-metrics-tls podName:9e36af8a-7247-492e-8451-b00362b2dcac nodeName:}" failed. No retries permitted until 2026-04-22 18:36:53.634426367 +0000 UTC m=+39.660040012 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9e36af8a-7247-492e-8451-b00362b2dcac-metrics-tls") pod "dns-default-msgd4" (UID: "9e36af8a-7247-492e-8451-b00362b2dcac") : secret "dns-default-metrics-tls" not found Apr 22 18:36:49.634507 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:49.634428 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-registry-tls\") pod \"image-registry-66658f9bb6-7qttb\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:36:49.634507 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:49.634478 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d556130-66df-4a4c-baae-5f7294e0bc35-cert podName:3d556130-66df-4a4c-baae-5f7294e0bc35 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:53.634457358 +0000 UTC m=+39.660071006 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d556130-66df-4a4c-baae-5f7294e0bc35-cert") pod "ingress-canary-k9tbm" (UID: "3d556130-66df-4a4c-baae-5f7294e0bc35") : secret "canary-serving-cert" not found Apr 22 18:36:49.634720 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:49.634549 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:36:49.634720 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:49.634565 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66658f9bb6-7qttb: secret "image-registry-tls" not found Apr 22 18:36:49.634720 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:49.634605 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-registry-tls podName:0a4f3346-013e-4759-8acd-a577393f5c37 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:53.634592757 +0000 UTC m=+39.660206406 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-registry-tls") pod "image-registry-66658f9bb6-7qttb" (UID: "0a4f3346-013e-4759-8acd-a577393f5c37") : secret "image-registry-tls" not found Apr 22 18:36:49.819020 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:49.818985 2577 generic.go:358] "Generic (PLEG): container finished" podID="1ca4dacd-dae4-4a9c-a9f6-96d152edbeac" containerID="e0fe050023ed861c71bf0f66e69988b3741795a4e1b236bb7b23d2b9dfbf8564" exitCode=0 Apr 22 18:36:49.819181 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:49.819072 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-knnz5" event={"ID":"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac","Type":"ContainerDied","Data":"e0fe050023ed861c71bf0f66e69988b3741795a4e1b236bb7b23d2b9dfbf8564"} Apr 22 18:36:49.820248 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:49.820205 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-98mhg" event={"ID":"ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0","Type":"ContainerStarted","Data":"d04bf5372a119bd456d84896659f14a2cec7bfb6f129dfc686ab9d6deafb2b8f"} Apr 22 18:36:50.825799 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:50.825760 2577 generic.go:358] "Generic (PLEG): container finished" podID="1ca4dacd-dae4-4a9c-a9f6-96d152edbeac" containerID="cb52e0d7ad2686a1e12f9772ab1e71c1e960b0270c168e1f28e8132378dbac6f" exitCode=0 Apr 22 18:36:50.826327 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:50.825822 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-knnz5" event={"ID":"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac","Type":"ContainerDied","Data":"cb52e0d7ad2686a1e12f9772ab1e71c1e960b0270c168e1f28e8132378dbac6f"} Apr 22 18:36:51.046342 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:51.046291 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26b34a86-850d-4fe7-87a3-10448b9b73a3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ngrmj\" (UID: \"26b34a86-850d-4fe7-87a3-10448b9b73a3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngrmj" Apr 22 18:36:51.046528 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:51.046467 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:36:51.046588 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:51.046540 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26b34a86-850d-4fe7-87a3-10448b9b73a3-networking-console-plugin-cert podName:26b34a86-850d-4fe7-87a3-10448b9b73a3 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:55.046523938 +0000 UTC m=+41.072137588 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/26b34a86-850d-4fe7-87a3-10448b9b73a3-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ngrmj" (UID: "26b34a86-850d-4fe7-87a3-10448b9b73a3") : secret "networking-console-plugin-cert" not found Apr 22 18:36:51.831361 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:51.831148 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-knnz5" event={"ID":"1ca4dacd-dae4-4a9c-a9f6-96d152edbeac","Type":"ContainerStarted","Data":"87f607483fc4ec2b982231ec34b0941a1297e4de63a59b32e1015fb478b42bca"} Apr 22 18:36:51.870880 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:51.870816 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-knnz5" podStartSLOduration=4.751216907 podStartE2EDuration="37.870797942s" podCreationTimestamp="2026-04-22 18:36:14 +0000 UTC" 
firstStartedPulling="2026-04-22 18:36:15.86116272 +0000 UTC m=+1.886776364" lastFinishedPulling="2026-04-22 18:36:48.980743753 +0000 UTC m=+35.006357399" observedRunningTime="2026-04-22 18:36:51.870011967 +0000 UTC m=+37.895625636" watchObservedRunningTime="2026-04-22 18:36:51.870797942 +0000 UTC m=+37.896411609" Apr 22 18:36:53.671741 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:53.671632 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9e36af8a-7247-492e-8451-b00362b2dcac-metrics-tls\") pod \"dns-default-msgd4\" (UID: \"9e36af8a-7247-492e-8451-b00362b2dcac\") " pod="openshift-dns/dns-default-msgd4" Apr 22 18:36:53.671741 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:53.671706 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-registry-tls\") pod \"image-registry-66658f9bb6-7qttb\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:36:53.672166 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:53.671753 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:36:53.672166 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:53.671816 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e36af8a-7247-492e-8451-b00362b2dcac-metrics-tls podName:9e36af8a-7247-492e-8451-b00362b2dcac nodeName:}" failed. No retries permitted until 2026-04-22 18:37:01.671799103 +0000 UTC m=+47.697412747 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9e36af8a-7247-492e-8451-b00362b2dcac-metrics-tls") pod "dns-default-msgd4" (UID: "9e36af8a-7247-492e-8451-b00362b2dcac") : secret "dns-default-metrics-tls" not found Apr 22 18:36:53.672166 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:53.671821 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:36:53.672166 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:53.671760 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d556130-66df-4a4c-baae-5f7294e0bc35-cert\") pod \"ingress-canary-k9tbm\" (UID: \"3d556130-66df-4a4c-baae-5f7294e0bc35\") " pod="openshift-ingress-canary/ingress-canary-k9tbm" Apr 22 18:36:53.672166 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:53.671858 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d556130-66df-4a4c-baae-5f7294e0bc35-cert podName:3d556130-66df-4a4c-baae-5f7294e0bc35 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:01.67184704 +0000 UTC m=+47.697460685 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d556130-66df-4a4c-baae-5f7294e0bc35-cert") pod "ingress-canary-k9tbm" (UID: "3d556130-66df-4a4c-baae-5f7294e0bc35") : secret "canary-serving-cert" not found Apr 22 18:36:53.672166 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:53.671860 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:36:53.672166 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:53.671885 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66658f9bb6-7qttb: secret "image-registry-tls" not found Apr 22 18:36:53.672166 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:53.671929 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-registry-tls podName:0a4f3346-013e-4759-8acd-a577393f5c37 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:01.671915139 +0000 UTC m=+47.697528787 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-registry-tls") pod "image-registry-66658f9bb6-7qttb" (UID: "0a4f3346-013e-4759-8acd-a577393f5c37") : secret "image-registry-tls" not found Apr 22 18:36:53.836445 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:53.836408 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-98mhg" event={"ID":"ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0","Type":"ContainerStarted","Data":"137aca3cfd19b5ad843bd0407a65c70f28f3b9ce09a9f47e002d50a612d26d56"} Apr 22 18:36:53.854059 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:53.854011 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-98mhg" podStartSLOduration=4.785287707 podStartE2EDuration="8.853993131s" podCreationTimestamp="2026-04-22 18:36:45 +0000 UTC" firstStartedPulling="2026-04-22 18:36:49.283541073 +0000 UTC m=+35.309154717" lastFinishedPulling="2026-04-22 18:36:53.352246492 +0000 UTC m=+39.377860141" observedRunningTime="2026-04-22 18:36:53.853399615 +0000 UTC m=+39.879013281" watchObservedRunningTime="2026-04-22 18:36:53.853993131 +0000 UTC m=+39.879606776" Apr 22 18:36:55.082251 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:55.082203 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26b34a86-850d-4fe7-87a3-10448b9b73a3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ngrmj\" (UID: \"26b34a86-850d-4fe7-87a3-10448b9b73a3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngrmj" Apr 22 18:36:55.082696 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:55.082356 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:36:55.082696 
ip-10-0-132-204 kubenswrapper[2577]: E0422 18:36:55.082426 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26b34a86-850d-4fe7-87a3-10448b9b73a3-networking-console-plugin-cert podName:26b34a86-850d-4fe7-87a3-10448b9b73a3 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:03.082408182 +0000 UTC m=+49.108021827 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/26b34a86-850d-4fe7-87a3-10448b9b73a3-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ngrmj" (UID: "26b34a86-850d-4fe7-87a3-10448b9b73a3") : secret "networking-console-plugin-cert" not found Apr 22 18:36:58.339447 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:58.339415 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-q9vgr_5f2ded72-fa8f-446c-863c-2699e86ba162/dns-node-resolver/0.log" Apr 22 18:36:58.379515 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:58.379479 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-98ph8"] Apr 22 18:36:58.420265 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:58.420223 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-98ph8"] Apr 22 18:36:58.420427 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:58.420354 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-98ph8" Apr 22 18:36:58.423295 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:58.423271 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 22 18:36:58.423439 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:58.423362 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 22 18:36:58.424573 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:58.424545 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-gh2rz\"" Apr 22 18:36:58.510403 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:58.510364 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhv5d\" (UniqueName: \"kubernetes.io/projected/cc3ce331-86aa-4c86-b73e-a6350ae8d3a1-kube-api-access-zhv5d\") pod \"migrator-74bb7799d9-98ph8\" (UID: \"cc3ce331-86aa-4c86-b73e-a6350ae8d3a1\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-98ph8" Apr 22 18:36:58.610910 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:58.610826 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhv5d\" (UniqueName: \"kubernetes.io/projected/cc3ce331-86aa-4c86-b73e-a6350ae8d3a1-kube-api-access-zhv5d\") pod \"migrator-74bb7799d9-98ph8\" (UID: \"cc3ce331-86aa-4c86-b73e-a6350ae8d3a1\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-98ph8" Apr 22 18:36:58.622587 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:58.622546 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhv5d\" (UniqueName: 
\"kubernetes.io/projected/cc3ce331-86aa-4c86-b73e-a6350ae8d3a1-kube-api-access-zhv5d\") pod \"migrator-74bb7799d9-98ph8\" (UID: \"cc3ce331-86aa-4c86-b73e-a6350ae8d3a1\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-98ph8" Apr 22 18:36:58.729969 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:58.729930 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-98ph8" Apr 22 18:36:58.873674 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:58.873571 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-98ph8"] Apr 22 18:36:58.887668 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:58.887623 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc3ce331_86aa_4c86_b73e_a6350ae8d3a1.slice/crio-8a52a3ee4027cd9633694a68e55ae411f1488859333b62161d4c8b0fe10d2896 WatchSource:0}: Error finding container 8a52a3ee4027cd9633694a68e55ae411f1488859333b62161d4c8b0fe10d2896: Status 404 returned error can't find the container with id 8a52a3ee4027cd9633694a68e55ae411f1488859333b62161d4c8b0fe10d2896 Apr 22 18:36:58.959619 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:58.959589 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-kb89l"] Apr 22 18:36:58.981944 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:58.981909 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-kb89l"] Apr 22 18:36:58.982098 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:58.982024 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-kb89l" Apr 22 18:36:58.984945 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:58.984919 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 22 18:36:58.985087 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:58.985026 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 22 18:36:58.985087 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:58.985052 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-xdq87\"" Apr 22 18:36:58.985087 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:58.985029 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 22 18:36:58.986092 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:58.986068 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 22 18:36:59.115322 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:59.115278 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4rf8\" (UniqueName: \"kubernetes.io/projected/2b96ab94-7d82-45fd-ab9c-7c16c2e91ee5-kube-api-access-t4rf8\") pod \"service-ca-865cb79987-kb89l\" (UID: \"2b96ab94-7d82-45fd-ab9c-7c16c2e91ee5\") " pod="openshift-service-ca/service-ca-865cb79987-kb89l" Apr 22 18:36:59.115322 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:59.115324 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2b96ab94-7d82-45fd-ab9c-7c16c2e91ee5-signing-key\") pod \"service-ca-865cb79987-kb89l\" (UID: \"2b96ab94-7d82-45fd-ab9c-7c16c2e91ee5\") " 
pod="openshift-service-ca/service-ca-865cb79987-kb89l" Apr 22 18:36:59.115537 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:59.115357 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2b96ab94-7d82-45fd-ab9c-7c16c2e91ee5-signing-cabundle\") pod \"service-ca-865cb79987-kb89l\" (UID: \"2b96ab94-7d82-45fd-ab9c-7c16c2e91ee5\") " pod="openshift-service-ca/service-ca-865cb79987-kb89l" Apr 22 18:36:59.216489 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:59.216458 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4rf8\" (UniqueName: \"kubernetes.io/projected/2b96ab94-7d82-45fd-ab9c-7c16c2e91ee5-kube-api-access-t4rf8\") pod \"service-ca-865cb79987-kb89l\" (UID: \"2b96ab94-7d82-45fd-ab9c-7c16c2e91ee5\") " pod="openshift-service-ca/service-ca-865cb79987-kb89l" Apr 22 18:36:59.216717 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:59.216499 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2b96ab94-7d82-45fd-ab9c-7c16c2e91ee5-signing-key\") pod \"service-ca-865cb79987-kb89l\" (UID: \"2b96ab94-7d82-45fd-ab9c-7c16c2e91ee5\") " pod="openshift-service-ca/service-ca-865cb79987-kb89l" Apr 22 18:36:59.216717 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:59.216524 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2b96ab94-7d82-45fd-ab9c-7c16c2e91ee5-signing-cabundle\") pod \"service-ca-865cb79987-kb89l\" (UID: \"2b96ab94-7d82-45fd-ab9c-7c16c2e91ee5\") " pod="openshift-service-ca/service-ca-865cb79987-kb89l" Apr 22 18:36:59.217203 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:59.217182 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/2b96ab94-7d82-45fd-ab9c-7c16c2e91ee5-signing-cabundle\") pod \"service-ca-865cb79987-kb89l\" (UID: \"2b96ab94-7d82-45fd-ab9c-7c16c2e91ee5\") " pod="openshift-service-ca/service-ca-865cb79987-kb89l" Apr 22 18:36:59.219533 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:59.219274 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2b96ab94-7d82-45fd-ab9c-7c16c2e91ee5-signing-key\") pod \"service-ca-865cb79987-kb89l\" (UID: \"2b96ab94-7d82-45fd-ab9c-7c16c2e91ee5\") " pod="openshift-service-ca/service-ca-865cb79987-kb89l" Apr 22 18:36:59.227822 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:59.227799 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4rf8\" (UniqueName: \"kubernetes.io/projected/2b96ab94-7d82-45fd-ab9c-7c16c2e91ee5-kube-api-access-t4rf8\") pod \"service-ca-865cb79987-kb89l\" (UID: \"2b96ab94-7d82-45fd-ab9c-7c16c2e91ee5\") " pod="openshift-service-ca/service-ca-865cb79987-kb89l" Apr 22 18:36:59.291776 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:59.291742 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-kb89l" Apr 22 18:36:59.339784 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:59.339760 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fgx5l_5fcd2fa1-b6e0-4480-a123-665c7fabd2bf/node-ca/0.log" Apr 22 18:36:59.405268 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:59.405234 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-kb89l"] Apr 22 18:36:59.408052 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:36:59.408026 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b96ab94_7d82_45fd_ab9c_7c16c2e91ee5.slice/crio-267dc1f8181b09a4fc851b5a0530b6ba1b311f23e521336755b5516b5b7d63c6 WatchSource:0}: Error finding container 267dc1f8181b09a4fc851b5a0530b6ba1b311f23e521336755b5516b5b7d63c6: Status 404 returned error can't find the container with id 267dc1f8181b09a4fc851b5a0530b6ba1b311f23e521336755b5516b5b7d63c6 Apr 22 18:36:59.849785 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:59.849707 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-kb89l" event={"ID":"2b96ab94-7d82-45fd-ab9c-7c16c2e91ee5","Type":"ContainerStarted","Data":"267dc1f8181b09a4fc851b5a0530b6ba1b311f23e521336755b5516b5b7d63c6"} Apr 22 18:36:59.850914 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:36:59.850887 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-98ph8" event={"ID":"cc3ce331-86aa-4c86-b73e-a6350ae8d3a1","Type":"ContainerStarted","Data":"8a52a3ee4027cd9633694a68e55ae411f1488859333b62161d4c8b0fe10d2896"} Apr 22 18:37:01.736190 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:01.736152 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/9e36af8a-7247-492e-8451-b00362b2dcac-metrics-tls\") pod \"dns-default-msgd4\" (UID: \"9e36af8a-7247-492e-8451-b00362b2dcac\") " pod="openshift-dns/dns-default-msgd4" Apr 22 18:37:01.736633 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:01.736244 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-registry-tls\") pod \"image-registry-66658f9bb6-7qttb\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:37:01.736633 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:01.736299 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d556130-66df-4a4c-baae-5f7294e0bc35-cert\") pod \"ingress-canary-k9tbm\" (UID: \"3d556130-66df-4a4c-baae-5f7294e0bc35\") " pod="openshift-ingress-canary/ingress-canary-k9tbm" Apr 22 18:37:01.736633 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:37:01.736323 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:37:01.736633 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:37:01.736391 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:37:01.736633 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:37:01.736410 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:37:01.736633 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:37:01.736430 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66658f9bb6-7qttb: secret "image-registry-tls" not found Apr 22 18:37:01.736633 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:37:01.736412 2577 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/9e36af8a-7247-492e-8451-b00362b2dcac-metrics-tls podName:9e36af8a-7247-492e-8451-b00362b2dcac nodeName:}" failed. No retries permitted until 2026-04-22 18:37:17.736389299 +0000 UTC m=+63.762002943 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9e36af8a-7247-492e-8451-b00362b2dcac-metrics-tls") pod "dns-default-msgd4" (UID: "9e36af8a-7247-492e-8451-b00362b2dcac") : secret "dns-default-metrics-tls" not found Apr 22 18:37:01.736633 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:37:01.736495 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d556130-66df-4a4c-baae-5f7294e0bc35-cert podName:3d556130-66df-4a4c-baae-5f7294e0bc35 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:17.73647594 +0000 UTC m=+63.762089585 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d556130-66df-4a4c-baae-5f7294e0bc35-cert") pod "ingress-canary-k9tbm" (UID: "3d556130-66df-4a4c-baae-5f7294e0bc35") : secret "canary-serving-cert" not found Apr 22 18:37:01.736633 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:37:01.736512 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-registry-tls podName:0a4f3346-013e-4759-8acd-a577393f5c37 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:17.736502744 +0000 UTC m=+63.762116397 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-registry-tls") pod "image-registry-66658f9bb6-7qttb" (UID: "0a4f3346-013e-4759-8acd-a577393f5c37") : secret "image-registry-tls" not found Apr 22 18:37:01.856935 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:01.856851 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-98ph8" event={"ID":"cc3ce331-86aa-4c86-b73e-a6350ae8d3a1","Type":"ContainerStarted","Data":"8f3d542df38f59e814eded0f34fa8b7be6371d65dbaa8019c195f827b0e334d6"} Apr 22 18:37:01.856935 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:01.856892 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-98ph8" event={"ID":"cc3ce331-86aa-4c86-b73e-a6350ae8d3a1","Type":"ContainerStarted","Data":"3acdb1a226016a29b9c7bf7f21c89e434de8ed6d6ae43f80909578d7cd5520d6"} Apr 22 18:37:01.878522 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:01.878473 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-98ph8" podStartSLOduration=1.969588326 podStartE2EDuration="3.878457645s" podCreationTimestamp="2026-04-22 18:36:58 +0000 UTC" firstStartedPulling="2026-04-22 18:36:58.889486699 +0000 UTC m=+44.915100348" lastFinishedPulling="2026-04-22 18:37:00.798356009 +0000 UTC m=+46.823969667" observedRunningTime="2026-04-22 18:37:01.878150021 +0000 UTC m=+47.903763689" watchObservedRunningTime="2026-04-22 18:37:01.878457645 +0000 UTC m=+47.904071361" Apr 22 18:37:02.860076 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:02.860038 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-kb89l" 
event={"ID":"2b96ab94-7d82-45fd-ab9c-7c16c2e91ee5","Type":"ContainerStarted","Data":"8e7072ec43578190b877492c2a20fc95c13a65ecc565daf98a2d1a2db0471134"} Apr 22 18:37:02.888514 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:02.888455 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-kb89l" podStartSLOduration=2.409659987 podStartE2EDuration="4.888440525s" podCreationTimestamp="2026-04-22 18:36:58 +0000 UTC" firstStartedPulling="2026-04-22 18:36:59.409844973 +0000 UTC m=+45.435458619" lastFinishedPulling="2026-04-22 18:37:01.888625501 +0000 UTC m=+47.914239157" observedRunningTime="2026-04-22 18:37:02.887703848 +0000 UTC m=+48.913317515" watchObservedRunningTime="2026-04-22 18:37:02.888440525 +0000 UTC m=+48.914054192" Apr 22 18:37:03.147442 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:03.147351 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26b34a86-850d-4fe7-87a3-10448b9b73a3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ngrmj\" (UID: \"26b34a86-850d-4fe7-87a3-10448b9b73a3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngrmj" Apr 22 18:37:03.147594 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:37:03.147473 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:37:03.147594 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:37:03.147526 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26b34a86-850d-4fe7-87a3-10448b9b73a3-networking-console-plugin-cert podName:26b34a86-850d-4fe7-87a3-10448b9b73a3 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:19.147512105 +0000 UTC m=+65.173125749 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/26b34a86-850d-4fe7-87a3-10448b9b73a3-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ngrmj" (UID: "26b34a86-850d-4fe7-87a3-10448b9b73a3") : secret "networking-console-plugin-cert" not found Apr 22 18:37:12.817222 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:12.817182 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r65js" Apr 22 18:37:17.760546 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:17.760500 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9e36af8a-7247-492e-8451-b00362b2dcac-metrics-tls\") pod \"dns-default-msgd4\" (UID: \"9e36af8a-7247-492e-8451-b00362b2dcac\") " pod="openshift-dns/dns-default-msgd4" Apr 22 18:37:17.760546 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:17.760555 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-registry-tls\") pod \"image-registry-66658f9bb6-7qttb\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:37:17.761099 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:17.760607 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d556130-66df-4a4c-baae-5f7294e0bc35-cert\") pod \"ingress-canary-k9tbm\" (UID: \"3d556130-66df-4a4c-baae-5f7294e0bc35\") " pod="openshift-ingress-canary/ingress-canary-k9tbm" Apr 22 18:37:17.763242 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:17.763022 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9e36af8a-7247-492e-8451-b00362b2dcac-metrics-tls\") pod 
\"dns-default-msgd4\" (UID: \"9e36af8a-7247-492e-8451-b00362b2dcac\") " pod="openshift-dns/dns-default-msgd4" Apr 22 18:37:17.763406 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:17.763252 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d556130-66df-4a4c-baae-5f7294e0bc35-cert\") pod \"ingress-canary-k9tbm\" (UID: \"3d556130-66df-4a4c-baae-5f7294e0bc35\") " pod="openshift-ingress-canary/ingress-canary-k9tbm" Apr 22 18:37:17.766364 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:17.763612 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-registry-tls\") pod \"image-registry-66658f9bb6-7qttb\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:37:18.010801 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:18.010716 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-8gvfs\"" Apr 22 18:37:18.018320 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:18.018288 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:37:18.032409 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:18.032385 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xfhxc\"" Apr 22 18:37:18.040143 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:18.040108 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-msgd4" Apr 22 18:37:18.054392 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:18.054336 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-x8r7b\"" Apr 22 18:37:18.061991 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:18.061952 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k9tbm" Apr 22 18:37:18.169091 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:18.169061 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66658f9bb6-7qttb"] Apr 22 18:37:18.183280 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:37:18.183247 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a4f3346_013e_4759_8acd_a577393f5c37.slice/crio-0bb7045d8c2ec5fdc19bf593c1c6f38f05377dba0c5e910f1919f5436132ab60 WatchSource:0}: Error finding container 0bb7045d8c2ec5fdc19bf593c1c6f38f05377dba0c5e910f1919f5436132ab60: Status 404 returned error can't find the container with id 0bb7045d8c2ec5fdc19bf593c1c6f38f05377dba0c5e910f1919f5436132ab60 Apr 22 18:37:18.194275 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:18.194247 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-msgd4"] Apr 22 18:37:18.197396 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:37:18.197371 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e36af8a_7247_492e_8451_b00362b2dcac.slice/crio-dbd67ad7f10521fcbe0edecf83169f45e90aa4438ae2ba49288593cfe5245066 WatchSource:0}: Error finding container dbd67ad7f10521fcbe0edecf83169f45e90aa4438ae2ba49288593cfe5245066: Status 404 returned error can't find the container with id dbd67ad7f10521fcbe0edecf83169f45e90aa4438ae2ba49288593cfe5245066 Apr 22 
18:37:18.212992 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:18.212969 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-k9tbm"] Apr 22 18:37:18.218711 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:37:18.218682 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d556130_66df_4a4c_baae_5f7294e0bc35.slice/crio-ba8179c88680e75315b23f44bf693bf37e0d9fe55c62f927e797616341036ea3 WatchSource:0}: Error finding container ba8179c88680e75315b23f44bf693bf37e0d9fe55c62f927e797616341036ea3: Status 404 returned error can't find the container with id ba8179c88680e75315b23f44bf693bf37e0d9fe55c62f927e797616341036ea3 Apr 22 18:37:18.896994 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:18.896938 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k9tbm" event={"ID":"3d556130-66df-4a4c-baae-5f7294e0bc35","Type":"ContainerStarted","Data":"ba8179c88680e75315b23f44bf693bf37e0d9fe55c62f927e797616341036ea3"} Apr 22 18:37:18.898111 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:18.898079 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-msgd4" event={"ID":"9e36af8a-7247-492e-8451-b00362b2dcac","Type":"ContainerStarted","Data":"dbd67ad7f10521fcbe0edecf83169f45e90aa4438ae2ba49288593cfe5245066"} Apr 22 18:37:18.899716 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:18.899618 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" event={"ID":"0a4f3346-013e-4759-8acd-a577393f5c37","Type":"ContainerStarted","Data":"aed50ef8ed5f6139354924245255c224f4c94b661bfdf7e3369cede748ab988b"} Apr 22 18:37:18.899716 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:18.899671 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" 
event={"ID":"0a4f3346-013e-4759-8acd-a577393f5c37","Type":"ContainerStarted","Data":"0bb7045d8c2ec5fdc19bf593c1c6f38f05377dba0c5e910f1919f5436132ab60"} Apr 22 18:37:18.899889 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:18.899770 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:37:18.925089 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:18.925031 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" podStartSLOduration=37.924999187 podStartE2EDuration="37.924999187s" podCreationTimestamp="2026-04-22 18:36:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:37:18.924765527 +0000 UTC m=+64.950379196" watchObservedRunningTime="2026-04-22 18:37:18.924999187 +0000 UTC m=+64.950612854" Apr 22 18:37:19.174228 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:19.174137 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26b34a86-850d-4fe7-87a3-10448b9b73a3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ngrmj\" (UID: \"26b34a86-850d-4fe7-87a3-10448b9b73a3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngrmj" Apr 22 18:37:19.177710 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:19.177682 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26b34a86-850d-4fe7-87a3-10448b9b73a3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ngrmj\" (UID: \"26b34a86-850d-4fe7-87a3-10448b9b73a3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngrmj" Apr 22 18:37:19.274592 ip-10-0-132-204 kubenswrapper[2577]: I0422 
18:37:19.274555 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/617c16ac-507f-45a2-ab75-d583c7798ca1-metrics-certs\") pod \"network-metrics-daemon-9t8m2\" (UID: \"617c16ac-507f-45a2-ab75-d583c7798ca1\") " pod="openshift-multus/network-metrics-daemon-9t8m2" Apr 22 18:37:19.277909 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:19.277883 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:37:19.288272 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:19.288248 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/617c16ac-507f-45a2-ab75-d583c7798ca1-metrics-certs\") pod \"network-metrics-daemon-9t8m2\" (UID: \"617c16ac-507f-45a2-ab75-d583c7798ca1\") " pod="openshift-multus/network-metrics-daemon-9t8m2" Apr 22 18:37:19.375763 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:19.375725 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnbzs\" (UniqueName: \"kubernetes.io/projected/a97e9a8d-908d-409a-b079-4fee4c52cdcd-kube-api-access-dnbzs\") pod \"network-check-target-dt6rs\" (UID: \"a97e9a8d-908d-409a-b079-4fee4c52cdcd\") " pod="openshift-network-diagnostics/network-check-target-dt6rs" Apr 22 18:37:19.378800 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:19.378778 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:37:19.389320 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:19.389294 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:37:19.399223 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:19.399193 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dnbzs\" (UniqueName: \"kubernetes.io/projected/a97e9a8d-908d-409a-b079-4fee4c52cdcd-kube-api-access-dnbzs\") pod \"network-check-target-dt6rs\" (UID: \"a97e9a8d-908d-409a-b079-4fee4c52cdcd\") " pod="openshift-network-diagnostics/network-check-target-dt6rs" Apr 22 18:37:19.421834 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:19.421809 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-zwsxn\"" Apr 22 18:37:19.427557 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:19.427507 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-95dwn\"" Apr 22 18:37:19.429578 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:19.429559 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dt6rs" Apr 22 18:37:19.435311 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:19.435292 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9t8m2" Apr 22 18:37:19.446406 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:19.446385 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-8vwk8\"" Apr 22 18:37:19.453863 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:19.453846 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngrmj" Apr 22 18:37:20.428413 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.428376 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hxdbt"] Apr 22 18:37:20.452918 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.452887 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-m9jlg"] Apr 22 18:37:20.453089 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.452998 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hxdbt" Apr 22 18:37:20.455797 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.455777 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 22 18:37:20.456558 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.456535 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-rjsbn\"" Apr 22 18:37:20.476823 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.476802 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hxdbt"] Apr 22 18:37:20.476823 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.476827 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-66658f9bb6-7qttb"] Apr 22 18:37:20.476982 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.476842 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-m9jlg"] Apr 22 18:37:20.476982 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.476938 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-m9jlg" Apr 22 18:37:20.479405 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.479384 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 18:37:20.479518 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.479417 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 18:37:20.479518 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.479420 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-cvr4x\"" Apr 22 18:37:20.486206 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.484480 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/408f3706-1f36-4dbb-83f7-1fd0bbfec3bb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-hxdbt\" (UID: \"408f3706-1f36-4dbb-83f7-1fd0bbfec3bb\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hxdbt" Apr 22 18:37:20.538325 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.538299 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-btj9l"] Apr 22 18:37:20.554110 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.554084 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-b9dcb7fbd-ss8dh"] Apr 22 18:37:20.554265 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.554249 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-btj9l" Apr 22 18:37:20.567439 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.567414 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 18:37:20.567625 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.567595 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 18:37:20.567804 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.567786 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-7qsn6\"" Apr 22 18:37:20.572007 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.571991 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 18:37:20.574129 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.574110 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 18:37:20.585926 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.585898 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b623c724-516c-4927-bf88-44d34adfbf95-crio-socket\") pod \"insights-runtime-extractor-btj9l\" (UID: \"b623c724-516c-4927-bf88-44d34adfbf95\") " pod="openshift-insights/insights-runtime-extractor-btj9l" Apr 22 18:37:20.586030 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.585931 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j7gq\" (UniqueName: \"kubernetes.io/projected/b623c724-516c-4927-bf88-44d34adfbf95-kube-api-access-4j7gq\") pod \"insights-runtime-extractor-btj9l\" (UID: 
\"b623c724-516c-4927-bf88-44d34adfbf95\") " pod="openshift-insights/insights-runtime-extractor-btj9l" Apr 22 18:37:20.586030 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.585965 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b623c724-516c-4927-bf88-44d34adfbf95-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-btj9l\" (UID: \"b623c724-516c-4927-bf88-44d34adfbf95\") " pod="openshift-insights/insights-runtime-extractor-btj9l" Apr 22 18:37:20.586114 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.586042 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbxzr\" (UniqueName: \"kubernetes.io/projected/48628ae2-0b18-47ec-a9fe-e7dd053efb08-kube-api-access-pbxzr\") pod \"downloads-6bcc868b7-m9jlg\" (UID: \"48628ae2-0b18-47ec-a9fe-e7dd053efb08\") " pod="openshift-console/downloads-6bcc868b7-m9jlg" Apr 22 18:37:20.586114 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.586067 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b623c724-516c-4927-bf88-44d34adfbf95-data-volume\") pod \"insights-runtime-extractor-btj9l\" (UID: \"b623c724-516c-4927-bf88-44d34adfbf95\") " pod="openshift-insights/insights-runtime-extractor-btj9l" Apr 22 18:37:20.586114 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.586096 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b623c724-516c-4927-bf88-44d34adfbf95-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-btj9l\" (UID: \"b623c724-516c-4927-bf88-44d34adfbf95\") " pod="openshift-insights/insights-runtime-extractor-btj9l" Apr 22 18:37:20.586238 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.586126 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/408f3706-1f36-4dbb-83f7-1fd0bbfec3bb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-hxdbt\" (UID: \"408f3706-1f36-4dbb-83f7-1fd0bbfec3bb\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hxdbt" Apr 22 18:37:20.588406 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.588387 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/408f3706-1f36-4dbb-83f7-1fd0bbfec3bb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-hxdbt\" (UID: \"408f3706-1f36-4dbb-83f7-1fd0bbfec3bb\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hxdbt" Apr 22 18:37:20.589073 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.589057 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-btj9l"] Apr 22 18:37:20.589122 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.589079 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-b9dcb7fbd-ss8dh"] Apr 22 18:37:20.589179 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.589169 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" Apr 22 18:37:20.686666 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.686488 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/519eae9a-0267-4d70-8881-49f12042815a-registry-certificates\") pod \"image-registry-b9dcb7fbd-ss8dh\" (UID: \"519eae9a-0267-4d70-8881-49f12042815a\") " pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" Apr 22 18:37:20.686666 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.686538 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b623c724-516c-4927-bf88-44d34adfbf95-crio-socket\") pod \"insights-runtime-extractor-btj9l\" (UID: \"b623c724-516c-4927-bf88-44d34adfbf95\") " pod="openshift-insights/insights-runtime-extractor-btj9l" Apr 22 18:37:20.686666 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.686572 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4j7gq\" (UniqueName: \"kubernetes.io/projected/b623c724-516c-4927-bf88-44d34adfbf95-kube-api-access-4j7gq\") pod \"insights-runtime-extractor-btj9l\" (UID: \"b623c724-516c-4927-bf88-44d34adfbf95\") " pod="openshift-insights/insights-runtime-extractor-btj9l" Apr 22 18:37:20.686666 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.686608 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/519eae9a-0267-4d70-8881-49f12042815a-ca-trust-extracted\") pod \"image-registry-b9dcb7fbd-ss8dh\" (UID: \"519eae9a-0267-4d70-8881-49f12042815a\") " pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" Apr 22 18:37:20.686666 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.686635 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b623c724-516c-4927-bf88-44d34adfbf95-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-btj9l\" (UID: \"b623c724-516c-4927-bf88-44d34adfbf95\") " pod="openshift-insights/insights-runtime-extractor-btj9l" Apr 22 18:37:20.687011 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.686717 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pbxzr\" (UniqueName: \"kubernetes.io/projected/48628ae2-0b18-47ec-a9fe-e7dd053efb08-kube-api-access-pbxzr\") pod \"downloads-6bcc868b7-m9jlg\" (UID: \"48628ae2-0b18-47ec-a9fe-e7dd053efb08\") " pod="openshift-console/downloads-6bcc868b7-m9jlg" Apr 22 18:37:20.687011 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.686748 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/519eae9a-0267-4d70-8881-49f12042815a-image-registry-private-configuration\") pod \"image-registry-b9dcb7fbd-ss8dh\" (UID: \"519eae9a-0267-4d70-8881-49f12042815a\") " pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" Apr 22 18:37:20.687011 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.686776 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b623c724-516c-4927-bf88-44d34adfbf95-data-volume\") pod \"insights-runtime-extractor-btj9l\" (UID: \"b623c724-516c-4927-bf88-44d34adfbf95\") " pod="openshift-insights/insights-runtime-extractor-btj9l" Apr 22 18:37:20.687011 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.686803 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/519eae9a-0267-4d70-8881-49f12042815a-installation-pull-secrets\") pod 
\"image-registry-b9dcb7fbd-ss8dh\" (UID: \"519eae9a-0267-4d70-8881-49f12042815a\") " pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" Apr 22 18:37:20.687011 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.686831 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/519eae9a-0267-4d70-8881-49f12042815a-registry-tls\") pod \"image-registry-b9dcb7fbd-ss8dh\" (UID: \"519eae9a-0267-4d70-8881-49f12042815a\") " pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" Apr 22 18:37:20.687011 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.686869 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b623c724-516c-4927-bf88-44d34adfbf95-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-btj9l\" (UID: \"b623c724-516c-4927-bf88-44d34adfbf95\") " pod="openshift-insights/insights-runtime-extractor-btj9l" Apr 22 18:37:20.687011 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.686921 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/519eae9a-0267-4d70-8881-49f12042815a-trusted-ca\") pod \"image-registry-b9dcb7fbd-ss8dh\" (UID: \"519eae9a-0267-4d70-8881-49f12042815a\") " pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" Apr 22 18:37:20.687011 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.686956 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/519eae9a-0267-4d70-8881-49f12042815a-bound-sa-token\") pod \"image-registry-b9dcb7fbd-ss8dh\" (UID: \"519eae9a-0267-4d70-8881-49f12042815a\") " pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" Apr 22 18:37:20.687011 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.686980 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgxvf\" (UniqueName: \"kubernetes.io/projected/519eae9a-0267-4d70-8881-49f12042815a-kube-api-access-sgxvf\") pod \"image-registry-b9dcb7fbd-ss8dh\" (UID: \"519eae9a-0267-4d70-8881-49f12042815a\") " pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" Apr 22 18:37:20.687468 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.687111 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b623c724-516c-4927-bf88-44d34adfbf95-crio-socket\") pod \"insights-runtime-extractor-btj9l\" (UID: \"b623c724-516c-4927-bf88-44d34adfbf95\") " pod="openshift-insights/insights-runtime-extractor-btj9l" Apr 22 18:37:20.688971 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.688360 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b623c724-516c-4927-bf88-44d34adfbf95-data-volume\") pod \"insights-runtime-extractor-btj9l\" (UID: \"b623c724-516c-4927-bf88-44d34adfbf95\") " pod="openshift-insights/insights-runtime-extractor-btj9l" Apr 22 18:37:20.688971 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.688877 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b623c724-516c-4927-bf88-44d34adfbf95-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-btj9l\" (UID: \"b623c724-516c-4927-bf88-44d34adfbf95\") " pod="openshift-insights/insights-runtime-extractor-btj9l" Apr 22 18:37:20.693851 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.693059 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b623c724-516c-4927-bf88-44d34adfbf95-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-btj9l\" (UID: 
\"b623c724-516c-4927-bf88-44d34adfbf95\") " pod="openshift-insights/insights-runtime-extractor-btj9l" Apr 22 18:37:20.698895 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.698847 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbxzr\" (UniqueName: \"kubernetes.io/projected/48628ae2-0b18-47ec-a9fe-e7dd053efb08-kube-api-access-pbxzr\") pod \"downloads-6bcc868b7-m9jlg\" (UID: \"48628ae2-0b18-47ec-a9fe-e7dd053efb08\") " pod="openshift-console/downloads-6bcc868b7-m9jlg" Apr 22 18:37:20.700876 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.700836 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j7gq\" (UniqueName: \"kubernetes.io/projected/b623c724-516c-4927-bf88-44d34adfbf95-kube-api-access-4j7gq\") pod \"insights-runtime-extractor-btj9l\" (UID: \"b623c724-516c-4927-bf88-44d34adfbf95\") " pod="openshift-insights/insights-runtime-extractor-btj9l" Apr 22 18:37:20.752868 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.752823 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dt6rs"] Apr 22 18:37:20.758848 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:37:20.758821 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda97e9a8d_908d_409a_b079_4fee4c52cdcd.slice/crio-539ea4136817329c45effad94a2af133c9a24011add6b44842b1db9a57a3e670 WatchSource:0}: Error finding container 539ea4136817329c45effad94a2af133c9a24011add6b44842b1db9a57a3e670: Status 404 returned error can't find the container with id 539ea4136817329c45effad94a2af133c9a24011add6b44842b1db9a57a3e670 Apr 22 18:37:20.762525 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.762501 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hxdbt" Apr 22 18:37:20.773359 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.773321 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-ngrmj"] Apr 22 18:37:20.776934 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:37:20.776904 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26b34a86_850d_4fe7_87a3_10448b9b73a3.slice/crio-404113b7a30a0d0cd0941fa5f69bd901a817c1df9adcda9736947d71d6af04bb WatchSource:0}: Error finding container 404113b7a30a0d0cd0941fa5f69bd901a817c1df9adcda9736947d71d6af04bb: Status 404 returned error can't find the container with id 404113b7a30a0d0cd0941fa5f69bd901a817c1df9adcda9736947d71d6af04bb Apr 22 18:37:20.785806 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.785776 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-m9jlg" Apr 22 18:37:20.787643 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.787618 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/519eae9a-0267-4d70-8881-49f12042815a-image-registry-private-configuration\") pod \"image-registry-b9dcb7fbd-ss8dh\" (UID: \"519eae9a-0267-4d70-8881-49f12042815a\") " pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" Apr 22 18:37:20.787736 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.787702 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/519eae9a-0267-4d70-8881-49f12042815a-installation-pull-secrets\") pod \"image-registry-b9dcb7fbd-ss8dh\" (UID: \"519eae9a-0267-4d70-8881-49f12042815a\") " pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" Apr 22 18:37:20.787801 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.787785 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/519eae9a-0267-4d70-8881-49f12042815a-registry-tls\") pod \"image-registry-b9dcb7fbd-ss8dh\" (UID: \"519eae9a-0267-4d70-8881-49f12042815a\") " pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" Apr 22 18:37:20.787888 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.787872 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/519eae9a-0267-4d70-8881-49f12042815a-trusted-ca\") pod \"image-registry-b9dcb7fbd-ss8dh\" (UID: \"519eae9a-0267-4d70-8881-49f12042815a\") " pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" Apr 22 18:37:20.787964 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.787947 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/519eae9a-0267-4d70-8881-49f12042815a-bound-sa-token\") pod \"image-registry-b9dcb7fbd-ss8dh\" (UID: \"519eae9a-0267-4d70-8881-49f12042815a\") " pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" Apr 22 18:37:20.788015 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.787980 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgxvf\" (UniqueName: \"kubernetes.io/projected/519eae9a-0267-4d70-8881-49f12042815a-kube-api-access-sgxvf\") pod \"image-registry-b9dcb7fbd-ss8dh\" (UID: \"519eae9a-0267-4d70-8881-49f12042815a\") " pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" Apr 22 18:37:20.788068 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.788041 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/519eae9a-0267-4d70-8881-49f12042815a-registry-certificates\") pod \"image-registry-b9dcb7fbd-ss8dh\" (UID: \"519eae9a-0267-4d70-8881-49f12042815a\") " pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" Apr 22 18:37:20.788217 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.788199 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/519eae9a-0267-4d70-8881-49f12042815a-ca-trust-extracted\") pod \"image-registry-b9dcb7fbd-ss8dh\" (UID: \"519eae9a-0267-4d70-8881-49f12042815a\") " pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" Apr 22 18:37:20.788689 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.788668 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/519eae9a-0267-4d70-8881-49f12042815a-ca-trust-extracted\") pod \"image-registry-b9dcb7fbd-ss8dh\" (UID: \"519eae9a-0267-4d70-8881-49f12042815a\") " 
pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" Apr 22 18:37:20.789391 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.789049 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/519eae9a-0267-4d70-8881-49f12042815a-registry-certificates\") pod \"image-registry-b9dcb7fbd-ss8dh\" (UID: \"519eae9a-0267-4d70-8881-49f12042815a\") " pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" Apr 22 18:37:20.789391 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.789105 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/519eae9a-0267-4d70-8881-49f12042815a-trusted-ca\") pod \"image-registry-b9dcb7fbd-ss8dh\" (UID: \"519eae9a-0267-4d70-8881-49f12042815a\") " pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" Apr 22 18:37:20.791538 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.791250 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/519eae9a-0267-4d70-8881-49f12042815a-image-registry-private-configuration\") pod \"image-registry-b9dcb7fbd-ss8dh\" (UID: \"519eae9a-0267-4d70-8881-49f12042815a\") " pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" Apr 22 18:37:20.791898 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.791856 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/519eae9a-0267-4d70-8881-49f12042815a-registry-tls\") pod \"image-registry-b9dcb7fbd-ss8dh\" (UID: \"519eae9a-0267-4d70-8881-49f12042815a\") " pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" Apr 22 18:37:20.792412 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.792378 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/519eae9a-0267-4d70-8881-49f12042815a-installation-pull-secrets\") pod \"image-registry-b9dcb7fbd-ss8dh\" (UID: \"519eae9a-0267-4d70-8881-49f12042815a\") " pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" Apr 22 18:37:20.794511 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.794465 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9t8m2"] Apr 22 18:37:20.802840 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.801013 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgxvf\" (UniqueName: \"kubernetes.io/projected/519eae9a-0267-4d70-8881-49f12042815a-kube-api-access-sgxvf\") pod \"image-registry-b9dcb7fbd-ss8dh\" (UID: \"519eae9a-0267-4d70-8881-49f12042815a\") " pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" Apr 22 18:37:20.802840 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.802466 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/519eae9a-0267-4d70-8881-49f12042815a-bound-sa-token\") pod \"image-registry-b9dcb7fbd-ss8dh\" (UID: \"519eae9a-0267-4d70-8881-49f12042815a\") " pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" Apr 22 18:37:20.873802 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.872985 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-btj9l" Apr 22 18:37:20.914727 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.914528 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" Apr 22 18:37:20.918902 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.918849 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dt6rs" event={"ID":"a97e9a8d-908d-409a-b079-4fee4c52cdcd","Type":"ContainerStarted","Data":"539ea4136817329c45effad94a2af133c9a24011add6b44842b1db9a57a3e670"} Apr 22 18:37:20.928973 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.927229 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9t8m2" event={"ID":"617c16ac-507f-45a2-ab75-d583c7798ca1","Type":"ContainerStarted","Data":"7fb096570b20db8abed66667e10a3b7e03adb41f1538045423dbb0cf203a488e"} Apr 22 18:37:20.928973 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.928925 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hxdbt"] Apr 22 18:37:20.931728 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.931607 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngrmj" event={"ID":"26b34a86-850d-4fe7-87a3-10448b9b73a3","Type":"ContainerStarted","Data":"404113b7a30a0d0cd0941fa5f69bd901a817c1df9adcda9736947d71d6af04bb"} Apr 22 18:37:20.933385 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:37:20.933355 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod408f3706_1f36_4dbb_83f7_1fd0bbfec3bb.slice/crio-c343d2cb65cf30faeef5a51d1c072ad790e91a77f87ed6a6b94d8f56ac186ad9 WatchSource:0}: Error finding container c343d2cb65cf30faeef5a51d1c072ad790e91a77f87ed6a6b94d8f56ac186ad9: Status 404 returned error can't find the container with id c343d2cb65cf30faeef5a51d1c072ad790e91a77f87ed6a6b94d8f56ac186ad9 Apr 22 18:37:20.934363 ip-10-0-132-204 kubenswrapper[2577]: I0422 
18:37:20.934088 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k9tbm" event={"ID":"3d556130-66df-4a4c-baae-5f7294e0bc35","Type":"ContainerStarted","Data":"4d88aa6531927d85fff952477e0df9931c26f7c222fc2d54701e3bf5041d52c7"} Apr 22 18:37:20.938579 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.938541 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-msgd4" event={"ID":"9e36af8a-7247-492e-8451-b00362b2dcac","Type":"ContainerStarted","Data":"91872fb3a802ffb59d80f9369390baa4107a381220e16b662f68af1d1c48bb54"} Apr 22 18:37:20.954726 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.954016 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-k9tbm" podStartSLOduration=33.570804259 podStartE2EDuration="35.953999175s" podCreationTimestamp="2026-04-22 18:36:45 +0000 UTC" firstStartedPulling="2026-04-22 18:37:18.220740802 +0000 UTC m=+64.246354449" lastFinishedPulling="2026-04-22 18:37:20.60393572 +0000 UTC m=+66.629549365" observedRunningTime="2026-04-22 18:37:20.953417398 +0000 UTC m=+66.979031079" watchObservedRunningTime="2026-04-22 18:37:20.953999175 +0000 UTC m=+66.979612847" Apr 22 18:37:20.958802 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:20.958070 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-m9jlg"] Apr 22 18:37:20.974439 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:37:20.974340 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48628ae2_0b18_47ec_a9fe_e7dd053efb08.slice/crio-21d40172c59cb0a30f635232ab2559b0ea418c499b8efd2e916577dec6d7e62a WatchSource:0}: Error finding container 21d40172c59cb0a30f635232ab2559b0ea418c499b8efd2e916577dec6d7e62a: Status 404 returned error can't find the container with id 21d40172c59cb0a30f635232ab2559b0ea418c499b8efd2e916577dec6d7e62a Apr 22 
18:37:21.090375 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:21.090340 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-btj9l"]
Apr 22 18:37:21.093195 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:37:21.093168 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb623c724_516c_4927_bf88_44d34adfbf95.slice/crio-2d34895e76f63eaac76128a737d2fada8f8dfd2e9993c15eb87d199f91d1ab7b WatchSource:0}: Error finding container 2d34895e76f63eaac76128a737d2fada8f8dfd2e9993c15eb87d199f91d1ab7b: Status 404 returned error can't find the container with id 2d34895e76f63eaac76128a737d2fada8f8dfd2e9993c15eb87d199f91d1ab7b
Apr 22 18:37:21.110183 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:21.110160 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-b9dcb7fbd-ss8dh"]
Apr 22 18:37:21.114255 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:37:21.114227 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod519eae9a_0267_4d70_8881_49f12042815a.slice/crio-e67f73a690ec8b4fb5e4ee15512b7228f14e3bc6f188911004a39ad16e1bde37 WatchSource:0}: Error finding container e67f73a690ec8b4fb5e4ee15512b7228f14e3bc6f188911004a39ad16e1bde37: Status 404 returned error can't find the container with id e67f73a690ec8b4fb5e4ee15512b7228f14e3bc6f188911004a39ad16e1bde37
Apr 22 18:37:21.945636 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:21.945566 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" event={"ID":"519eae9a-0267-4d70-8881-49f12042815a","Type":"ContainerStarted","Data":"1e59baa89b6016b228840f76fac03f9cb8e9a0c827ce1fd712e2df4cf62a4cdd"}
Apr 22 18:37:21.945636 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:21.945612 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" event={"ID":"519eae9a-0267-4d70-8881-49f12042815a","Type":"ContainerStarted","Data":"e67f73a690ec8b4fb5e4ee15512b7228f14e3bc6f188911004a39ad16e1bde37"}
Apr 22 18:37:21.946240 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:21.946109 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh"
Apr 22 18:37:21.948106 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:21.948013 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-btj9l" event={"ID":"b623c724-516c-4927-bf88-44d34adfbf95","Type":"ContainerStarted","Data":"91aa5ca85517673ac744013784247333265af3d14714e50658b862519b84d613"}
Apr 22 18:37:21.948106 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:21.948042 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-btj9l" event={"ID":"b623c724-516c-4927-bf88-44d34adfbf95","Type":"ContainerStarted","Data":"2d34895e76f63eaac76128a737d2fada8f8dfd2e9993c15eb87d199f91d1ab7b"}
Apr 22 18:37:21.950153 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:21.950105 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-m9jlg" event={"ID":"48628ae2-0b18-47ec-a9fe-e7dd053efb08","Type":"ContainerStarted","Data":"21d40172c59cb0a30f635232ab2559b0ea418c499b8efd2e916577dec6d7e62a"}
Apr 22 18:37:21.951525 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:21.951492 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hxdbt" event={"ID":"408f3706-1f36-4dbb-83f7-1fd0bbfec3bb","Type":"ContainerStarted","Data":"c343d2cb65cf30faeef5a51d1c072ad790e91a77f87ed6a6b94d8f56ac186ad9"}
Apr 22 18:37:21.953952 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:21.953916 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-msgd4" event={"ID":"9e36af8a-7247-492e-8451-b00362b2dcac","Type":"ContainerStarted","Data":"fc6ebe266e910dd27bf1b54c14cb6aaacc68d3c4143cfadf85761b5f5da7a0d6"}
Apr 22 18:37:21.954105 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:21.954071 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-msgd4"
Apr 22 18:37:21.969532 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:21.969410 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" podStartSLOduration=1.9693915309999999 podStartE2EDuration="1.969391531s" podCreationTimestamp="2026-04-22 18:37:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:37:21.968927896 +0000 UTC m=+67.994541563" watchObservedRunningTime="2026-04-22 18:37:21.969391531 +0000 UTC m=+67.995005196"
Apr 22 18:37:21.990680 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:21.989562 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-msgd4" podStartSLOduration=34.590949302 podStartE2EDuration="36.98954417s" podCreationTimestamp="2026-04-22 18:36:45 +0000 UTC" firstStartedPulling="2026-04-22 18:37:18.199783605 +0000 UTC m=+64.225397254" lastFinishedPulling="2026-04-22 18:37:20.598378419 +0000 UTC m=+66.623992122" observedRunningTime="2026-04-22 18:37:21.989100956 +0000 UTC m=+68.014714638" watchObservedRunningTime="2026-04-22 18:37:21.98954417 +0000 UTC m=+68.015157847"
Apr 22 18:37:25.969010 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:25.968970 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-btj9l" event={"ID":"b623c724-516c-4927-bf88-44d34adfbf95","Type":"ContainerStarted","Data":"bc51db69fed4b19d5e306abbc04d62809ef924aedb1ac5d24f1b4e2d3310540d"}
Apr 22 18:37:25.970522
ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:25.970436 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hxdbt" event={"ID":"408f3706-1f36-4dbb-83f7-1fd0bbfec3bb","Type":"ContainerStarted","Data":"397adcd47e3a5b31895afa58b28f516ddb56bbcf1d300194176ff91e68520f84"}
Apr 22 18:37:25.970706 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:25.970675 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hxdbt"
Apr 22 18:37:25.972108 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:25.972081 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dt6rs" event={"ID":"a97e9a8d-908d-409a-b079-4fee4c52cdcd","Type":"ContainerStarted","Data":"b951b823b45e59a676f5ec3b90328272d00385b60fcf71db5e882d124c5668da"}
Apr 22 18:37:25.972316 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:25.972296 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-dt6rs"
Apr 22 18:37:25.973978 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:25.973955 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9t8m2" event={"ID":"617c16ac-507f-45a2-ab75-d583c7798ca1","Type":"ContainerStarted","Data":"1aeea8454db04bff8c511135493b7208a48ead07c47366067e57036279b4247b"}
Apr 22 18:37:25.974115 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:25.973985 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9t8m2" event={"ID":"617c16ac-507f-45a2-ab75-d583c7798ca1","Type":"ContainerStarted","Data":"272c0fa990ac050756c86cf634b5528c9cc70f11a97b3813ed4a9d482751bb32"}
Apr 22 18:37:25.976168 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:25.976133 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngrmj" event={"ID":"26b34a86-850d-4fe7-87a3-10448b9b73a3","Type":"ContainerStarted","Data":"60835c00c65e0ba09e2fa6b923c122a32b02d38c3e42a4d27736870c0a406d31"}
Apr 22 18:37:25.976402 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:25.976383 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hxdbt"
Apr 22 18:37:25.991921 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:25.991871 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hxdbt" podStartSLOduration=2.029785037 podStartE2EDuration="5.991857232s" podCreationTimestamp="2026-04-22 18:37:20 +0000 UTC" firstStartedPulling="2026-04-22 18:37:20.937789144 +0000 UTC m=+66.963402806" lastFinishedPulling="2026-04-22 18:37:24.899861352 +0000 UTC m=+70.925475001" observedRunningTime="2026-04-22 18:37:25.990415342 +0000 UTC m=+72.016029012" watchObservedRunningTime="2026-04-22 18:37:25.991857232 +0000 UTC m=+72.017470928"
Apr 22 18:37:26.008407 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:26.008355 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngrmj" podStartSLOduration=34.892571083 podStartE2EDuration="39.00833616s" podCreationTimestamp="2026-04-22 18:36:47 +0000 UTC" firstStartedPulling="2026-04-22 18:37:20.779332781 +0000 UTC m=+66.804946425" lastFinishedPulling="2026-04-22 18:37:24.895097849 +0000 UTC m=+70.920711502" observedRunningTime="2026-04-22 18:37:26.007609357 +0000 UTC m=+72.033223025" watchObservedRunningTime="2026-04-22 18:37:26.00833616 +0000 UTC m=+72.033949827"
Apr 22 18:37:26.052880 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:26.052821 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-9t8m2" podStartSLOduration=67.963201296 podStartE2EDuration="1m12.052798149s" podCreationTimestamp="2026-04-22 18:36:14 +0000 UTC" firstStartedPulling="2026-04-22 18:37:20.803311488 +0000 UTC m=+66.828925133" lastFinishedPulling="2026-04-22 18:37:24.892908333 +0000 UTC m=+70.918521986" observedRunningTime="2026-04-22 18:37:26.050070617 +0000 UTC m=+72.075684288" watchObservedRunningTime="2026-04-22 18:37:26.052798149 +0000 UTC m=+72.078411816"
Apr 22 18:37:26.980943 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:26.980886 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-btj9l" event={"ID":"b623c724-516c-4927-bf88-44d34adfbf95","Type":"ContainerStarted","Data":"0f710a441ba7be2c19233c08cc182d60c15dd82c8a6bf3a0630cc3a516f7e756"}
Apr 22 18:37:27.013157 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:27.013092 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-btj9l" podStartSLOduration=1.7881198619999998 podStartE2EDuration="7.013073328s" podCreationTimestamp="2026-04-22 18:37:20 +0000 UTC" firstStartedPulling="2026-04-22 18:37:21.216537025 +0000 UTC m=+67.242150683" lastFinishedPulling="2026-04-22 18:37:26.441490501 +0000 UTC m=+72.467104149" observedRunningTime="2026-04-22 18:37:27.011280011 +0000 UTC m=+73.036893679" watchObservedRunningTime="2026-04-22 18:37:27.013073328 +0000 UTC m=+73.038686995"
Apr 22 18:37:27.014023 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:27.013984 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-dt6rs" podStartSLOduration=68.820614437 podStartE2EDuration="1m13.013974627s" podCreationTimestamp="2026-04-22 18:36:14 +0000 UTC" firstStartedPulling="2026-04-22 18:37:20.761047564 +0000 UTC m=+66.786661224" lastFinishedPulling="2026-04-22 18:37:24.954407767 +0000 UTC m=+70.980021414"
observedRunningTime="2026-04-22 18:37:26.077645203 +0000 UTC m=+72.103258870" watchObservedRunningTime="2026-04-22 18:37:27.013974627 +0000 UTC m=+73.039588295"
Apr 22 18:37:31.959386 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:31.959348 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-msgd4"
Apr 22 18:37:36.137824 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.137781 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-k4rxg"]
Apr 22 18:37:36.141282 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.141194 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:36.145560 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.145529 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 18:37:36.145560 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.145540 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 18:37:36.145773 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.145598 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 18:37:36.147099 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.146833 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 18:37:36.147099 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.146853 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-6pfl9\""
Apr 22 18:37:36.147099 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.146833 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 18:37:36.147099 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.146884 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 18:37:36.218758 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.218701 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7c28ba8d-9997-484f-aee0-0eea8d4cad96-node-exporter-tls\") pod \"node-exporter-k4rxg\" (UID: \"7c28ba8d-9997-484f-aee0-0eea8d4cad96\") " pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:36.218758 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.218757 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7c28ba8d-9997-484f-aee0-0eea8d4cad96-metrics-client-ca\") pod \"node-exporter-k4rxg\" (UID: \"7c28ba8d-9997-484f-aee0-0eea8d4cad96\") " pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:36.219017 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.218807 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7c28ba8d-9997-484f-aee0-0eea8d4cad96-node-exporter-accelerators-collector-config\") pod \"node-exporter-k4rxg\" (UID: \"7c28ba8d-9997-484f-aee0-0eea8d4cad96\") " pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:36.219017 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.218841 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7c28ba8d-9997-484f-aee0-0eea8d4cad96-root\") pod \"node-exporter-k4rxg\" (UID: \"7c28ba8d-9997-484f-aee0-0eea8d4cad96\") " pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:36.219017 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.218866 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7c28ba8d-9997-484f-aee0-0eea8d4cad96-sys\") pod \"node-exporter-k4rxg\" (UID: \"7c28ba8d-9997-484f-aee0-0eea8d4cad96\") " pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:36.219017 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.218890 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7c28ba8d-9997-484f-aee0-0eea8d4cad96-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-k4rxg\" (UID: \"7c28ba8d-9997-484f-aee0-0eea8d4cad96\") " pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:36.219017 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.218936 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7c28ba8d-9997-484f-aee0-0eea8d4cad96-node-exporter-wtmp\") pod \"node-exporter-k4rxg\" (UID: \"7c28ba8d-9997-484f-aee0-0eea8d4cad96\") " pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:36.219017 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.218971 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7r8v\" (UniqueName: \"kubernetes.io/projected/7c28ba8d-9997-484f-aee0-0eea8d4cad96-kube-api-access-b7r8v\") pod \"node-exporter-k4rxg\" (UID: \"7c28ba8d-9997-484f-aee0-0eea8d4cad96\") " pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:36.219017 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.219007 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName:
\"kubernetes.io/empty-dir/7c28ba8d-9997-484f-aee0-0eea8d4cad96-node-exporter-textfile\") pod \"node-exporter-k4rxg\" (UID: \"7c28ba8d-9997-484f-aee0-0eea8d4cad96\") " pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:36.320018 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.319976 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7c28ba8d-9997-484f-aee0-0eea8d4cad96-node-exporter-wtmp\") pod \"node-exporter-k4rxg\" (UID: \"7c28ba8d-9997-484f-aee0-0eea8d4cad96\") " pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:36.320018 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.320023 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b7r8v\" (UniqueName: \"kubernetes.io/projected/7c28ba8d-9997-484f-aee0-0eea8d4cad96-kube-api-access-b7r8v\") pod \"node-exporter-k4rxg\" (UID: \"7c28ba8d-9997-484f-aee0-0eea8d4cad96\") " pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:36.320258 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.320069 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7c28ba8d-9997-484f-aee0-0eea8d4cad96-node-exporter-textfile\") pod \"node-exporter-k4rxg\" (UID: \"7c28ba8d-9997-484f-aee0-0eea8d4cad96\") " pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:36.320258 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.320129 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7c28ba8d-9997-484f-aee0-0eea8d4cad96-node-exporter-tls\") pod \"node-exporter-k4rxg\" (UID: \"7c28ba8d-9997-484f-aee0-0eea8d4cad96\") " pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:36.320258 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.320156 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7c28ba8d-9997-484f-aee0-0eea8d4cad96-metrics-client-ca\") pod \"node-exporter-k4rxg\" (UID: \"7c28ba8d-9997-484f-aee0-0eea8d4cad96\") " pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:36.320258 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.320191 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7c28ba8d-9997-484f-aee0-0eea8d4cad96-node-exporter-wtmp\") pod \"node-exporter-k4rxg\" (UID: \"7c28ba8d-9997-484f-aee0-0eea8d4cad96\") " pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:36.320258 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.320194 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7c28ba8d-9997-484f-aee0-0eea8d4cad96-node-exporter-accelerators-collector-config\") pod \"node-exporter-k4rxg\" (UID: \"7c28ba8d-9997-484f-aee0-0eea8d4cad96\") " pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:36.320504 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.320257 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7c28ba8d-9997-484f-aee0-0eea8d4cad96-root\") pod \"node-exporter-k4rxg\" (UID: \"7c28ba8d-9997-484f-aee0-0eea8d4cad96\") " pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:36.320504 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.320289 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7c28ba8d-9997-484f-aee0-0eea8d4cad96-sys\") pod \"node-exporter-k4rxg\" (UID: \"7c28ba8d-9997-484f-aee0-0eea8d4cad96\") " pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:36.320504 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.320326 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7c28ba8d-9997-484f-aee0-0eea8d4cad96-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-k4rxg\" (UID: \"7c28ba8d-9997-484f-aee0-0eea8d4cad96\") " pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:36.320665 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.320532 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7c28ba8d-9997-484f-aee0-0eea8d4cad96-root\") pod \"node-exporter-k4rxg\" (UID: \"7c28ba8d-9997-484f-aee0-0eea8d4cad96\") " pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:36.320665 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.320629 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7c28ba8d-9997-484f-aee0-0eea8d4cad96-sys\") pod \"node-exporter-k4rxg\" (UID: \"7c28ba8d-9997-484f-aee0-0eea8d4cad96\") " pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:36.320773 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:37:36.320757 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 22 18:37:36.320827 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:37:36.320820 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c28ba8d-9997-484f-aee0-0eea8d4cad96-node-exporter-tls podName:7c28ba8d-9997-484f-aee0-0eea8d4cad96 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:36.820799246 +0000 UTC m=+82.846412894 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/7c28ba8d-9997-484f-aee0-0eea8d4cad96-node-exporter-tls") pod "node-exporter-k4rxg" (UID: "7c28ba8d-9997-484f-aee0-0eea8d4cad96") : secret "node-exporter-tls" not found
Apr 22 18:37:36.320888 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.320833 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7c28ba8d-9997-484f-aee0-0eea8d4cad96-node-exporter-accelerators-collector-config\") pod \"node-exporter-k4rxg\" (UID: \"7c28ba8d-9997-484f-aee0-0eea8d4cad96\") " pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:36.321106 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.321085 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7c28ba8d-9997-484f-aee0-0eea8d4cad96-node-exporter-textfile\") pod \"node-exporter-k4rxg\" (UID: \"7c28ba8d-9997-484f-aee0-0eea8d4cad96\") " pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:36.321461 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.321433 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7c28ba8d-9997-484f-aee0-0eea8d4cad96-metrics-client-ca\") pod \"node-exporter-k4rxg\" (UID: \"7c28ba8d-9997-484f-aee0-0eea8d4cad96\") " pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:36.323382 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.323358 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7c28ba8d-9997-484f-aee0-0eea8d4cad96-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-k4rxg\" (UID: \"7c28ba8d-9997-484f-aee0-0eea8d4cad96\") " pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:36.339200 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.339172 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7r8v\" (UniqueName: \"kubernetes.io/projected/7c28ba8d-9997-484f-aee0-0eea8d4cad96-kube-api-access-b7r8v\") pod \"node-exporter-k4rxg\" (UID: \"7c28ba8d-9997-484f-aee0-0eea8d4cad96\") " pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:36.825058 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:36.825013 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7c28ba8d-9997-484f-aee0-0eea8d4cad96-node-exporter-tls\") pod \"node-exporter-k4rxg\" (UID: \"7c28ba8d-9997-484f-aee0-0eea8d4cad96\") " pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:36.825247 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:37:36.825203 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 22 18:37:36.825326 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:37:36.825286 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c28ba8d-9997-484f-aee0-0eea8d4cad96-node-exporter-tls podName:7c28ba8d-9997-484f-aee0-0eea8d4cad96 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:37.825262151 +0000 UTC m=+83.850875796 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/7c28ba8d-9997-484f-aee0-0eea8d4cad96-node-exporter-tls") pod "node-exporter-k4rxg" (UID: "7c28ba8d-9997-484f-aee0-0eea8d4cad96") : secret "node-exporter-tls" not found
Apr 22 18:37:37.833349 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:37.833308 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7c28ba8d-9997-484f-aee0-0eea8d4cad96-node-exporter-tls\") pod \"node-exporter-k4rxg\" (UID: \"7c28ba8d-9997-484f-aee0-0eea8d4cad96\") " pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:37.835961 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:37.835936 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7c28ba8d-9997-484f-aee0-0eea8d4cad96-node-exporter-tls\") pod \"node-exporter-k4rxg\" (UID: \"7c28ba8d-9997-484f-aee0-0eea8d4cad96\") " pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:37.953557 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:37.953516 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/node-exporter-k4rxg"
Apr 22 18:37:39.468013 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:37:39.467979 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c28ba8d_9997_484f_aee0_0eea8d4cad96.slice/crio-76ac48d3b7762596997c2519ce31bf663697d867d687f03d5ffadc9334dfd11d WatchSource:0}: Error finding container 76ac48d3b7762596997c2519ce31bf663697d867d687f03d5ffadc9334dfd11d: Status 404 returned error can't find the container with id 76ac48d3b7762596997c2519ce31bf663697d867d687f03d5ffadc9334dfd11d
Apr 22 18:37:40.020171 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:40.020130 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k4rxg" event={"ID":"7c28ba8d-9997-484f-aee0-0eea8d4cad96","Type":"ContainerStarted","Data":"76ac48d3b7762596997c2519ce31bf663697d867d687f03d5ffadc9334dfd11d"}
Apr 22 18:37:40.022093 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:40.022056 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-m9jlg" event={"ID":"48628ae2-0b18-47ec-a9fe-e7dd053efb08","Type":"ContainerStarted","Data":"1f5ed07d155526cd173bcaa6c253e77589bed88a23bfc01fce6023062c0d13ba"}
Apr 22 18:37:40.022433 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:40.022406 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-m9jlg"
Apr 22 18:37:40.046505 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:40.046450 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-m9jlg" podStartSLOduration=1.497209551 podStartE2EDuration="20.046432356s" podCreationTimestamp="2026-04-22 18:37:20 +0000 UTC" firstStartedPulling="2026-04-22 18:37:20.981140533 +0000 UTC m=+67.006754192" lastFinishedPulling="2026-04-22 18:37:39.530363352 +0000 UTC m=+85.555976997" observedRunningTime="2026-04-22 18:37:40.044112699 +0000 UTC m=+86.069726367" watchObservedRunningTime="2026-04-22 18:37:40.046432356 +0000 UTC m=+86.072046027"
Apr 22 18:37:40.046692 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:40.046627 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-m9jlg"
Apr 22 18:37:40.945896 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:40.945862 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66658f9bb6-7qttb"
Apr 22 18:37:41.027324 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:41.027278 2577 generic.go:358] "Generic (PLEG): container finished" podID="7c28ba8d-9997-484f-aee0-0eea8d4cad96" containerID="9883792d020c5f6fcdb95955b9602945c0c0f07095a016e25fb5617b09d29c66" exitCode=0
Apr 22 18:37:41.027514 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:41.027376 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k4rxg" event={"ID":"7c28ba8d-9997-484f-aee0-0eea8d4cad96","Type":"ContainerDied","Data":"9883792d020c5f6fcdb95955b9602945c0c0f07095a016e25fb5617b09d29c66"}
Apr 22 18:37:42.032447 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.032405 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k4rxg" event={"ID":"7c28ba8d-9997-484f-aee0-0eea8d4cad96","Type":"ContainerStarted","Data":"0636742defb9d31b68828fec4eb040471aec9f462c4bca66ddd2e30bb8965c88"}
Apr 22 18:37:42.032447 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.032451 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k4rxg" event={"ID":"7c28ba8d-9997-484f-aee0-0eea8d4cad96","Type":"ContainerStarted","Data":"0e4a4e8f720ba37f576c0ed471446c897748e6bcb999dc13b02657d6209a8fe8"}
Apr 22 18:37:42.057513 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.057448 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-k4rxg" podStartSLOduration=5.198567308 podStartE2EDuration="6.05743446s" podCreationTimestamp="2026-04-22 18:37:36 +0000 UTC" firstStartedPulling="2026-04-22 18:37:39.469928096 +0000 UTC m=+85.495541743" lastFinishedPulling="2026-04-22 18:37:40.328795225 +0000 UTC m=+86.354408895" observedRunningTime="2026-04-22 18:37:42.0563709 +0000 UTC m=+88.081984603" watchObservedRunningTime="2026-04-22 18:37:42.05743446 +0000 UTC m=+88.083048127"
Apr 22 18:37:42.401309 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.401217 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 18:37:42.438347 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.438312 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 18:37:42.438524 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.438430 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:37:42.441530 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.441503 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 22 18:37:42.441703 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.441530 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 22 18:37:42.441703 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.441546 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 22 18:37:42.441915 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.441886 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 22 18:37:42.441915 ip-10-0-132-204 kubenswrapper[2577]: I0422
18:37:42.441890 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 22 18:37:42.442034 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.441930 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 22 18:37:42.442034 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.441965 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-mvrrd\""
Apr 22 18:37:42.442312 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.442142 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 22 18:37:42.442832 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.442814 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 22 18:37:42.443010 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.442885 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 22 18:37:42.443070 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.443016 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 22 18:37:42.443070 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.442923 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-v7f03pn3o9e3\""
Apr 22 18:37:42.443296 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.443271 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 22 18:37:42.444032 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.443892 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 22 18:37:42.444413 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.444391 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 22 18:37:42.574561 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.574520 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd45a847-7032-495b-b51e-b9e0b7a5ae87-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:37:42.574561 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.574565 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cd45a847-7032-495b-b51e-b9e0b7a5ae87-config-out\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:37:42.574813 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.574638 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cd45a847-7032-495b-b51e-b9e0b7a5ae87-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:37:42.574813 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.574710 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cd45a847-7032-495b-b51e-b9e0b7a5ae87-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") "
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.574813 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.574733 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cd45a847-7032-495b-b51e-b9e0b7a5ae87-web-config\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.574813 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.574756 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd45a847-7032-495b-b51e-b9e0b7a5ae87-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.574813 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.574790 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cd45a847-7032-495b-b51e-b9e0b7a5ae87-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.575054 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.574821 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd45a847-7032-495b-b51e-b9e0b7a5ae87-config\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.575054 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.574844 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/cd45a847-7032-495b-b51e-b9e0b7a5ae87-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.575054 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.574877 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cd45a847-7032-495b-b51e-b9e0b7a5ae87-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.575054 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.574900 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cd45a847-7032-495b-b51e-b9e0b7a5ae87-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.575054 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.574983 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd45a847-7032-495b-b51e-b9e0b7a5ae87-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.575054 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.575030 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd45a847-7032-495b-b51e-b9e0b7a5ae87-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 
22 18:37:42.575260 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.575054 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cd45a847-7032-495b-b51e-b9e0b7a5ae87-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.575260 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.575086 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cd45a847-7032-495b-b51e-b9e0b7a5ae87-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.575260 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.575126 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cd45a847-7032-495b-b51e-b9e0b7a5ae87-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.575260 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.575151 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cd45a847-7032-495b-b51e-b9e0b7a5ae87-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.575260 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.575175 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtnpt\" (UniqueName: 
\"kubernetes.io/projected/cd45a847-7032-495b-b51e-b9e0b7a5ae87-kube-api-access-rtnpt\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.676270 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.676185 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cd45a847-7032-495b-b51e-b9e0b7a5ae87-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.676270 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.676238 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cd45a847-7032-495b-b51e-b9e0b7a5ae87-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.676270 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.676266 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cd45a847-7032-495b-b51e-b9e0b7a5ae87-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.676546 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.676315 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd45a847-7032-495b-b51e-b9e0b7a5ae87-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.676603 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.676556 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd45a847-7032-495b-b51e-b9e0b7a5ae87-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.676673 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.676611 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cd45a847-7032-495b-b51e-b9e0b7a5ae87-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.676722 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.676667 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cd45a847-7032-495b-b51e-b9e0b7a5ae87-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.676779 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.676727 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cd45a847-7032-495b-b51e-b9e0b7a5ae87-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.676779 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.676760 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cd45a847-7032-495b-b51e-b9e0b7a5ae87-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.676892 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.676804 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtnpt\" (UniqueName: \"kubernetes.io/projected/cd45a847-7032-495b-b51e-b9e0b7a5ae87-kube-api-access-rtnpt\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.676892 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.676847 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd45a847-7032-495b-b51e-b9e0b7a5ae87-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.676892 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.676880 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cd45a847-7032-495b-b51e-b9e0b7a5ae87-config-out\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.677038 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.676914 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cd45a847-7032-495b-b51e-b9e0b7a5ae87-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.677038 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.676976 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cd45a847-7032-495b-b51e-b9e0b7a5ae87-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: 
\"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.677038 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.677008 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cd45a847-7032-495b-b51e-b9e0b7a5ae87-web-config\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.677038 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.677033 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd45a847-7032-495b-b51e-b9e0b7a5ae87-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.677244 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.677072 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cd45a847-7032-495b-b51e-b9e0b7a5ae87-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.677244 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.677101 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd45a847-7032-495b-b51e-b9e0b7a5ae87-config\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.677244 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.677126 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/cd45a847-7032-495b-b51e-b9e0b7a5ae87-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.677401 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.677268 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd45a847-7032-495b-b51e-b9e0b7a5ae87-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.677799 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.677771 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cd45a847-7032-495b-b51e-b9e0b7a5ae87-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.680740 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.679247 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd45a847-7032-495b-b51e-b9e0b7a5ae87-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.680740 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.680242 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd45a847-7032-495b-b51e-b9e0b7a5ae87-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.680903 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.680765 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cd45a847-7032-495b-b51e-b9e0b7a5ae87-config-out\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.680903 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.680851 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cd45a847-7032-495b-b51e-b9e0b7a5ae87-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.681311 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.681279 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cd45a847-7032-495b-b51e-b9e0b7a5ae87-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.681819 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.681797 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cd45a847-7032-495b-b51e-b9e0b7a5ae87-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.681961 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.681916 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd45a847-7032-495b-b51e-b9e0b7a5ae87-config\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.682223 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.682200 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cd45a847-7032-495b-b51e-b9e0b7a5ae87-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.682359 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.682219 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cd45a847-7032-495b-b51e-b9e0b7a5ae87-web-config\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.682775 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.682736 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cd45a847-7032-495b-b51e-b9e0b7a5ae87-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.682775 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.682762 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cd45a847-7032-495b-b51e-b9e0b7a5ae87-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.682917 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.682829 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cd45a847-7032-495b-b51e-b9e0b7a5ae87-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.683271 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.683247 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cd45a847-7032-495b-b51e-b9e0b7a5ae87-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.691215 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.691194 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cd45a847-7032-495b-b51e-b9e0b7a5ae87-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.691673 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.691625 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtnpt\" (UniqueName: \"kubernetes.io/projected/cd45a847-7032-495b-b51e-b9e0b7a5ae87-kube-api-access-rtnpt\") pod \"prometheus-k8s-0\" (UID: \"cd45a847-7032-495b-b51e-b9e0b7a5ae87\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.749630 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.749577 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:42.892633 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.892603 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:37:42.894168 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:37:42.894142 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd45a847_7032_495b_b51e_b9e0b7a5ae87.slice/crio-290d0b8218d97cb68624f42c0dbeb803fa555e72168b8ef297a99aafa4626220 WatchSource:0}: Error finding container 290d0b8218d97cb68624f42c0dbeb803fa555e72168b8ef297a99aafa4626220: Status 404 returned error can't find the container with id 290d0b8218d97cb68624f42c0dbeb803fa555e72168b8ef297a99aafa4626220 Apr 22 18:37:42.962008 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:42.961980 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-b9dcb7fbd-ss8dh" Apr 22 18:37:43.036727 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:43.036686 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cd45a847-7032-495b-b51e-b9e0b7a5ae87","Type":"ContainerStarted","Data":"290d0b8218d97cb68624f42c0dbeb803fa555e72168b8ef297a99aafa4626220"} Apr 22 18:37:45.044140 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:45.044103 2577 generic.go:358] "Generic (PLEG): container finished" podID="cd45a847-7032-495b-b51e-b9e0b7a5ae87" containerID="0c604724d680569551d14833a1f6f4734a6504b3d8c04c9da6c0bdd3575e5246" exitCode=0 Apr 22 18:37:45.044564 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:45.044204 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cd45a847-7032-495b-b51e-b9e0b7a5ae87","Type":"ContainerDied","Data":"0c604724d680569551d14833a1f6f4734a6504b3d8c04c9da6c0bdd3575e5246"} Apr 22 18:37:45.973615 
ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:45.973390 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" podUID="0a4f3346-013e-4759-8acd-a577393f5c37" containerName="registry" containerID="cri-o://aed50ef8ed5f6139354924245255c224f4c94b661bfdf7e3369cede748ab988b" gracePeriod=30 Apr 22 18:37:46.246855 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:46.246827 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:37:46.408917 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:46.408883 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a4f3346-013e-4759-8acd-a577393f5c37-registry-certificates\") pod \"0a4f3346-013e-4759-8acd-a577393f5c37\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " Apr 22 18:37:46.409171 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:46.408932 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f9h9\" (UniqueName: \"kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-kube-api-access-7f9h9\") pod \"0a4f3346-013e-4759-8acd-a577393f5c37\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " Apr 22 18:37:46.409171 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:46.408962 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-bound-sa-token\") pod \"0a4f3346-013e-4759-8acd-a577393f5c37\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " Apr 22 18:37:46.409171 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:46.408989 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/0a4f3346-013e-4759-8acd-a577393f5c37-trusted-ca\") pod \"0a4f3346-013e-4759-8acd-a577393f5c37\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " Apr 22 18:37:46.409171 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:46.409019 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a4f3346-013e-4759-8acd-a577393f5c37-ca-trust-extracted\") pod \"0a4f3346-013e-4759-8acd-a577393f5c37\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " Apr 22 18:37:46.409171 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:46.409078 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-registry-tls\") pod \"0a4f3346-013e-4759-8acd-a577393f5c37\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " Apr 22 18:37:46.409171 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:46.409109 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0a4f3346-013e-4759-8acd-a577393f5c37-image-registry-private-configuration\") pod \"0a4f3346-013e-4759-8acd-a577393f5c37\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " Apr 22 18:37:46.409171 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:46.409138 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a4f3346-013e-4759-8acd-a577393f5c37-installation-pull-secrets\") pod \"0a4f3346-013e-4759-8acd-a577393f5c37\" (UID: \"0a4f3346-013e-4759-8acd-a577393f5c37\") " Apr 22 18:37:46.410866 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:46.410156 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a4f3346-013e-4759-8acd-a577393f5c37-registry-certificates" (OuterVolumeSpecName: 
"registry-certificates") pod "0a4f3346-013e-4759-8acd-a577393f5c37" (UID: "0a4f3346-013e-4759-8acd-a577393f5c37"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:37:46.410866 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:46.410597 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a4f3346-013e-4759-8acd-a577393f5c37-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "0a4f3346-013e-4759-8acd-a577393f5c37" (UID: "0a4f3346-013e-4759-8acd-a577393f5c37"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:37:46.413018 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:46.412949 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "0a4f3346-013e-4759-8acd-a577393f5c37" (UID: "0a4f3346-013e-4759-8acd-a577393f5c37"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:37:46.413018 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:46.412981 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a4f3346-013e-4759-8acd-a577393f5c37-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "0a4f3346-013e-4759-8acd-a577393f5c37" (UID: "0a4f3346-013e-4759-8acd-a577393f5c37"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:37:46.413408 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:46.413151 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a4f3346-013e-4759-8acd-a577393f5c37-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "0a4f3346-013e-4759-8acd-a577393f5c37" (UID: "0a4f3346-013e-4759-8acd-a577393f5c37"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:37:46.414085 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:46.414038 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-kube-api-access-7f9h9" (OuterVolumeSpecName: "kube-api-access-7f9h9") pod "0a4f3346-013e-4759-8acd-a577393f5c37" (UID: "0a4f3346-013e-4759-8acd-a577393f5c37"). InnerVolumeSpecName "kube-api-access-7f9h9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:37:46.415685 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:46.415632 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "0a4f3346-013e-4759-8acd-a577393f5c37" (UID: "0a4f3346-013e-4759-8acd-a577393f5c37"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:37:46.421742 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:46.421713 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a4f3346-013e-4759-8acd-a577393f5c37-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "0a4f3346-013e-4759-8acd-a577393f5c37" (UID: "0a4f3346-013e-4759-8acd-a577393f5c37"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:37:46.510777 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:46.510584 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7f9h9\" (UniqueName: \"kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-kube-api-access-7f9h9\") on node \"ip-10-0-132-204.ec2.internal\" DevicePath \"\"" Apr 22 18:37:46.510777 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:46.510624 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-bound-sa-token\") on node \"ip-10-0-132-204.ec2.internal\" DevicePath \"\"" Apr 22 18:37:46.510777 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:46.510672 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a4f3346-013e-4759-8acd-a577393f5c37-trusted-ca\") on node \"ip-10-0-132-204.ec2.internal\" DevicePath \"\"" Apr 22 18:37:46.511052 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:46.510689 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a4f3346-013e-4759-8acd-a577393f5c37-ca-trust-extracted\") on node \"ip-10-0-132-204.ec2.internal\" DevicePath \"\"" Apr 22 18:37:46.511052 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:46.510809 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a4f3346-013e-4759-8acd-a577393f5c37-registry-tls\") on node \"ip-10-0-132-204.ec2.internal\" DevicePath \"\"" Apr 22 18:37:46.511052 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:46.510826 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0a4f3346-013e-4759-8acd-a577393f5c37-image-registry-private-configuration\") on node \"ip-10-0-132-204.ec2.internal\" DevicePath \"\"" Apr 22 
18:37:46.511052 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:46.510838 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a4f3346-013e-4759-8acd-a577393f5c37-installation-pull-secrets\") on node \"ip-10-0-132-204.ec2.internal\" DevicePath \"\"" Apr 22 18:37:46.511052 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:46.510851 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a4f3346-013e-4759-8acd-a577393f5c37-registry-certificates\") on node \"ip-10-0-132-204.ec2.internal\" DevicePath \"\"" Apr 22 18:37:47.053460 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:47.053418 2577 generic.go:358] "Generic (PLEG): container finished" podID="0a4f3346-013e-4759-8acd-a577393f5c37" containerID="aed50ef8ed5f6139354924245255c224f4c94b661bfdf7e3369cede748ab988b" exitCode=0 Apr 22 18:37:47.053646 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:47.053502 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" event={"ID":"0a4f3346-013e-4759-8acd-a577393f5c37","Type":"ContainerDied","Data":"aed50ef8ed5f6139354924245255c224f4c94b661bfdf7e3369cede748ab988b"} Apr 22 18:37:47.053646 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:47.053536 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" event={"ID":"0a4f3346-013e-4759-8acd-a577393f5c37","Type":"ContainerDied","Data":"0bb7045d8c2ec5fdc19bf593c1c6f38f05377dba0c5e910f1919f5436132ab60"} Apr 22 18:37:47.053646 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:47.053574 2577 scope.go:117] "RemoveContainer" containerID="aed50ef8ed5f6139354924245255c224f4c94b661bfdf7e3369cede748ab988b" Apr 22 18:37:47.053852 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:47.053791 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66658f9bb6-7qttb" Apr 22 18:37:47.064093 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:47.064071 2577 scope.go:117] "RemoveContainer" containerID="aed50ef8ed5f6139354924245255c224f4c94b661bfdf7e3369cede748ab988b" Apr 22 18:37:47.064415 ip-10-0-132-204 kubenswrapper[2577]: E0422 18:37:47.064388 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aed50ef8ed5f6139354924245255c224f4c94b661bfdf7e3369cede748ab988b\": container with ID starting with aed50ef8ed5f6139354924245255c224f4c94b661bfdf7e3369cede748ab988b not found: ID does not exist" containerID="aed50ef8ed5f6139354924245255c224f4c94b661bfdf7e3369cede748ab988b" Apr 22 18:37:47.064511 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:47.064428 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aed50ef8ed5f6139354924245255c224f4c94b661bfdf7e3369cede748ab988b"} err="failed to get container status \"aed50ef8ed5f6139354924245255c224f4c94b661bfdf7e3369cede748ab988b\": rpc error: code = NotFound desc = could not find container \"aed50ef8ed5f6139354924245255c224f4c94b661bfdf7e3369cede748ab988b\": container with ID starting with aed50ef8ed5f6139354924245255c224f4c94b661bfdf7e3369cede748ab988b not found: ID does not exist" Apr 22 18:37:47.090588 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:47.090538 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-66658f9bb6-7qttb"] Apr 22 18:37:47.097063 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:47.097015 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-66658f9bb6-7qttb"] Apr 22 18:37:48.603057 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:48.603025 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a4f3346-013e-4759-8acd-a577393f5c37" 
path="/var/lib/kubelet/pods/0a4f3346-013e-4759-8acd-a577393f5c37/volumes" Apr 22 18:37:49.064829 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:49.064308 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cd45a847-7032-495b-b51e-b9e0b7a5ae87","Type":"ContainerStarted","Data":"0052bd378681a8029e5bab90edcf937384c357db4db626961d8830c3a20cf2a2"} Apr 22 18:37:49.064829 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:49.064359 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cd45a847-7032-495b-b51e-b9e0b7a5ae87","Type":"ContainerStarted","Data":"d21b118b8215cc575a1e65060485ec69d52a0db6b8261ee1546e147bcca42eed"} Apr 22 18:37:52.078262 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:52.077033 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cd45a847-7032-495b-b51e-b9e0b7a5ae87","Type":"ContainerStarted","Data":"a2a91d94a87dfb81e1db8fa0f874773f8ee781d08e4d99d7ed0462bd072d12ba"} Apr 22 18:37:52.078262 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:52.077078 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cd45a847-7032-495b-b51e-b9e0b7a5ae87","Type":"ContainerStarted","Data":"76ff22bf966f50a2d682f494e517fceabe985257634f1b1373102d4a79aef7f1"} Apr 22 18:37:52.078262 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:52.077091 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cd45a847-7032-495b-b51e-b9e0b7a5ae87","Type":"ContainerStarted","Data":"71e00927f070d6fee265140050ff2a709b51e223a55e0018f69cef168c12657a"} Apr 22 18:37:52.078262 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:52.077103 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"cd45a847-7032-495b-b51e-b9e0b7a5ae87","Type":"ContainerStarted","Data":"2eca80e65b4970df84158fad8a0cd0a3a41c91beaef426c1a4c59b54a8958775"} Apr 22 18:37:52.113057 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:52.113000 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.7536732769999999 podStartE2EDuration="10.11297471s" podCreationTimestamp="2026-04-22 18:37:42 +0000 UTC" firstStartedPulling="2026-04-22 18:37:42.896573503 +0000 UTC m=+88.922187147" lastFinishedPulling="2026-04-22 18:37:51.255874918 +0000 UTC m=+97.281488580" observedRunningTime="2026-04-22 18:37:52.110348153 +0000 UTC m=+98.135961831" watchObservedRunningTime="2026-04-22 18:37:52.11297471 +0000 UTC m=+98.138588377" Apr 22 18:37:52.750230 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:52.750185 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:37:56.983526 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:37:56.983491 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-dt6rs" Apr 22 18:38:42.750289 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:38:42.750249 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:42.769146 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:38:42.769122 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:43.234301 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:38:43.234275 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:41:14.481211 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:41:14.481183 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log" Apr 22 18:41:14.482104 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:41:14.482083 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log" Apr 22 18:41:14.487776 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:41:14.487752 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 18:42:52.351229 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:42:52.351196 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-97mxz"] Apr 22 18:42:52.351620 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:42:52.351559 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a4f3346-013e-4759-8acd-a577393f5c37" containerName="registry" Apr 22 18:42:52.351620 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:42:52.351572 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a4f3346-013e-4759-8acd-a577393f5c37" containerName="registry" Apr 22 18:42:52.351707 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:42:52.351665 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a4f3346-013e-4759-8acd-a577393f5c37" containerName="registry" Apr 22 18:42:52.355010 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:42:52.354988 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-97mxz" Apr 22 18:42:52.357808 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:42:52.357784 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 22 18:42:52.357909 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:42:52.357791 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 18:42:52.358922 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:42:52.358903 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 18:42:52.359019 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:42:52.358922 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-wvfdf\"" Apr 22 18:42:52.383795 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:42:52.383734 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-97mxz"] Apr 22 18:42:52.416194 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:42:52.416162 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/25b6955b-2295-487f-9600-8e972c3be106-data\") pod \"seaweedfs-86cc847c5c-97mxz\" (UID: \"25b6955b-2295-487f-9600-8e972c3be106\") " pod="kserve/seaweedfs-86cc847c5c-97mxz" Apr 22 18:42:52.416370 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:42:52.416222 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppcxf\" (UniqueName: \"kubernetes.io/projected/25b6955b-2295-487f-9600-8e972c3be106-kube-api-access-ppcxf\") pod \"seaweedfs-86cc847c5c-97mxz\" (UID: \"25b6955b-2295-487f-9600-8e972c3be106\") " pod="kserve/seaweedfs-86cc847c5c-97mxz" Apr 22 18:42:52.516665 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:42:52.516608 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/25b6955b-2295-487f-9600-8e972c3be106-data\") pod \"seaweedfs-86cc847c5c-97mxz\" (UID: \"25b6955b-2295-487f-9600-8e972c3be106\") " pod="kserve/seaweedfs-86cc847c5c-97mxz" Apr 22 18:42:52.516832 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:42:52.516712 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppcxf\" (UniqueName: \"kubernetes.io/projected/25b6955b-2295-487f-9600-8e972c3be106-kube-api-access-ppcxf\") pod \"seaweedfs-86cc847c5c-97mxz\" (UID: \"25b6955b-2295-487f-9600-8e972c3be106\") " pod="kserve/seaweedfs-86cc847c5c-97mxz" Apr 22 18:42:52.516997 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:42:52.516976 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/25b6955b-2295-487f-9600-8e972c3be106-data\") pod \"seaweedfs-86cc847c5c-97mxz\" (UID: \"25b6955b-2295-487f-9600-8e972c3be106\") " pod="kserve/seaweedfs-86cc847c5c-97mxz" Apr 22 18:42:52.527954 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:42:52.527922 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppcxf\" (UniqueName: \"kubernetes.io/projected/25b6955b-2295-487f-9600-8e972c3be106-kube-api-access-ppcxf\") pod \"seaweedfs-86cc847c5c-97mxz\" (UID: \"25b6955b-2295-487f-9600-8e972c3be106\") " pod="kserve/seaweedfs-86cc847c5c-97mxz" Apr 22 18:42:52.664554 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:42:52.664474 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-97mxz" Apr 22 18:42:52.785610 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:42:52.785585 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-97mxz"] Apr 22 18:42:52.788081 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:42:52.788043 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25b6955b_2295_487f_9600_8e972c3be106.slice/crio-635003e88168620653858a818bc2676844c76f37d4ed5d589e7959ac3308d3a4 WatchSource:0}: Error finding container 635003e88168620653858a818bc2676844c76f37d4ed5d589e7959ac3308d3a4: Status 404 returned error can't find the container with id 635003e88168620653858a818bc2676844c76f37d4ed5d589e7959ac3308d3a4 Apr 22 18:42:52.789267 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:42:52.789245 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:42:52.863440 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:42:52.863404 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-97mxz" event={"ID":"25b6955b-2295-487f-9600-8e972c3be106","Type":"ContainerStarted","Data":"635003e88168620653858a818bc2676844c76f37d4ed5d589e7959ac3308d3a4"} Apr 22 18:42:55.874771 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:42:55.874739 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-97mxz" event={"ID":"25b6955b-2295-487f-9600-8e972c3be106","Type":"ContainerStarted","Data":"87719ad1c2bee401a1ffb43f10cd8b9d4976a613a6331c1fa7283d0b710b9396"} Apr 22 18:42:55.875160 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:42:55.874795 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-97mxz" Apr 22 18:42:55.892881 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:42:55.892832 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/seaweedfs-86cc847c5c-97mxz" podStartSLOduration=0.896983421 podStartE2EDuration="3.892817429s" podCreationTimestamp="2026-04-22 18:42:52 +0000 UTC" firstStartedPulling="2026-04-22 18:42:52.789368902 +0000 UTC m=+398.814982549" lastFinishedPulling="2026-04-22 18:42:55.785202911 +0000 UTC m=+401.810816557" observedRunningTime="2026-04-22 18:42:55.892351824 +0000 UTC m=+401.917965490" watchObservedRunningTime="2026-04-22 18:42:55.892817429 +0000 UTC m=+401.918431115" Apr 22 18:43:01.880166 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:43:01.880136 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-97mxz" Apr 22 18:44:19.310772 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:19.310737 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-h2t5j"] Apr 22 18:44:19.312730 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:19.312713 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-h2t5j" Apr 22 18:44:19.320218 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:19.320195 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-h2t5j"] Apr 22 18:44:19.477521 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:19.477471 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49tc5\" (UniqueName: \"kubernetes.io/projected/f4f5ea8e-b41a-47a6-ab1c-c19ecadedc95-kube-api-access-49tc5\") pod \"s3-init-h2t5j\" (UID: \"f4f5ea8e-b41a-47a6-ab1c-c19ecadedc95\") " pod="kserve/s3-init-h2t5j" Apr 22 18:44:19.578445 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:19.578365 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49tc5\" (UniqueName: \"kubernetes.io/projected/f4f5ea8e-b41a-47a6-ab1c-c19ecadedc95-kube-api-access-49tc5\") pod \"s3-init-h2t5j\" (UID: \"f4f5ea8e-b41a-47a6-ab1c-c19ecadedc95\") " pod="kserve/s3-init-h2t5j" Apr 22 
18:44:19.589359 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:19.589331 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49tc5\" (UniqueName: \"kubernetes.io/projected/f4f5ea8e-b41a-47a6-ab1c-c19ecadedc95-kube-api-access-49tc5\") pod \"s3-init-h2t5j\" (UID: \"f4f5ea8e-b41a-47a6-ab1c-c19ecadedc95\") " pod="kserve/s3-init-h2t5j" Apr 22 18:44:19.632298 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:19.632263 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-h2t5j" Apr 22 18:44:19.747235 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:19.747200 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-h2t5j"] Apr 22 18:44:19.750105 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:44:19.750073 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4f5ea8e_b41a_47a6_ab1c_c19ecadedc95.slice/crio-290f61e19116326e2ed84a18ed84c4488cff65fcc58ef349f6a3d0f45bebc816 WatchSource:0}: Error finding container 290f61e19116326e2ed84a18ed84c4488cff65fcc58ef349f6a3d0f45bebc816: Status 404 returned error can't find the container with id 290f61e19116326e2ed84a18ed84c4488cff65fcc58ef349f6a3d0f45bebc816 Apr 22 18:44:20.086763 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:20.086730 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-h2t5j" event={"ID":"f4f5ea8e-b41a-47a6-ab1c-c19ecadedc95","Type":"ContainerStarted","Data":"290f61e19116326e2ed84a18ed84c4488cff65fcc58ef349f6a3d0f45bebc816"} Apr 22 18:44:25.102635 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:25.102597 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-h2t5j" event={"ID":"f4f5ea8e-b41a-47a6-ab1c-c19ecadedc95","Type":"ContainerStarted","Data":"13fc53591620e5ece7abf56a69697ddb2e278f52615204b8db209e498f7c5dab"} Apr 22 18:44:25.119627 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:25.119563 
2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-h2t5j" podStartSLOduration=1.471392608 podStartE2EDuration="6.119543293s" podCreationTimestamp="2026-04-22 18:44:19 +0000 UTC" firstStartedPulling="2026-04-22 18:44:19.751723142 +0000 UTC m=+485.777336787" lastFinishedPulling="2026-04-22 18:44:24.399873827 +0000 UTC m=+490.425487472" observedRunningTime="2026-04-22 18:44:25.118924076 +0000 UTC m=+491.144537743" watchObservedRunningTime="2026-04-22 18:44:25.119543293 +0000 UTC m=+491.145156960" Apr 22 18:44:28.114587 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:28.114550 2577 generic.go:358] "Generic (PLEG): container finished" podID="f4f5ea8e-b41a-47a6-ab1c-c19ecadedc95" containerID="13fc53591620e5ece7abf56a69697ddb2e278f52615204b8db209e498f7c5dab" exitCode=0 Apr 22 18:44:28.115051 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:28.114613 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-h2t5j" event={"ID":"f4f5ea8e-b41a-47a6-ab1c-c19ecadedc95","Type":"ContainerDied","Data":"13fc53591620e5ece7abf56a69697ddb2e278f52615204b8db209e498f7c5dab"} Apr 22 18:44:29.233598 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:29.233570 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-h2t5j" Apr 22 18:44:29.363857 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:29.363823 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49tc5\" (UniqueName: \"kubernetes.io/projected/f4f5ea8e-b41a-47a6-ab1c-c19ecadedc95-kube-api-access-49tc5\") pod \"f4f5ea8e-b41a-47a6-ab1c-c19ecadedc95\" (UID: \"f4f5ea8e-b41a-47a6-ab1c-c19ecadedc95\") " Apr 22 18:44:29.365935 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:29.365909 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4f5ea8e-b41a-47a6-ab1c-c19ecadedc95-kube-api-access-49tc5" (OuterVolumeSpecName: "kube-api-access-49tc5") pod "f4f5ea8e-b41a-47a6-ab1c-c19ecadedc95" (UID: "f4f5ea8e-b41a-47a6-ab1c-c19ecadedc95"). InnerVolumeSpecName "kube-api-access-49tc5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:44:29.465092 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:29.465053 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-49tc5\" (UniqueName: \"kubernetes.io/projected/f4f5ea8e-b41a-47a6-ab1c-c19ecadedc95-kube-api-access-49tc5\") on node \"ip-10-0-132-204.ec2.internal\" DevicePath \"\"" Apr 22 18:44:30.120725 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:30.120685 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-h2t5j" event={"ID":"f4f5ea8e-b41a-47a6-ab1c-c19ecadedc95","Type":"ContainerDied","Data":"290f61e19116326e2ed84a18ed84c4488cff65fcc58ef349f6a3d0f45bebc816"} Apr 22 18:44:30.120725 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:30.120709 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-h2t5j" Apr 22 18:44:30.120725 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:30.120720 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="290f61e19116326e2ed84a18ed84c4488cff65fcc58ef349f6a3d0f45bebc816" Apr 22 18:44:39.159145 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:39.159110 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df"] Apr 22 18:44:39.159511 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:39.159502 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4f5ea8e-b41a-47a6-ab1c-c19ecadedc95" containerName="s3-init" Apr 22 18:44:39.159550 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:39.159515 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f5ea8e-b41a-47a6-ab1c-c19ecadedc95" containerName="s3-init" Apr 22 18:44:39.159595 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:39.159584 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f4f5ea8e-b41a-47a6-ab1c-c19ecadedc95" containerName="s3-init" Apr 22 18:44:39.162027 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:39.162011 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df" Apr 22 18:44:39.164688 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:39.164668 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-5vwnn\"" Apr 22 18:44:39.172360 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:39.172337 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df"] Apr 22 18:44:39.230312 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:39.230274 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df\" (UID: \"e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df" Apr 22 18:44:39.330897 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:39.330864 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df\" (UID: \"e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df" Apr 22 18:44:39.331248 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:39.331229 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df\" (UID: \"e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f\") " 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df" Apr 22 18:44:39.472149 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:39.472110 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df" Apr 22 18:44:39.588879 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:39.588854 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df"] Apr 22 18:44:39.591332 ip-10-0-132-204 kubenswrapper[2577]: W0422 18:44:39.591298 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode931c0d3_7b89_4ce9_9f68_ed2e85b10e2f.slice/crio-82b90c0d31fb0088d314fbc0f24f466f850a96aa8ba89ebda8706028ab25baca WatchSource:0}: Error finding container 82b90c0d31fb0088d314fbc0f24f466f850a96aa8ba89ebda8706028ab25baca: Status 404 returned error can't find the container with id 82b90c0d31fb0088d314fbc0f24f466f850a96aa8ba89ebda8706028ab25baca Apr 22 18:44:40.149265 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:40.149232 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df" event={"ID":"e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f","Type":"ContainerStarted","Data":"82b90c0d31fb0088d314fbc0f24f466f850a96aa8ba89ebda8706028ab25baca"} Apr 22 18:44:45.164994 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:45.164954 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df" event={"ID":"e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f","Type":"ContainerStarted","Data":"20884f5c1af8977fe15f4622ed9e696ec48013e8eed8112f139475c6f8ed9f42"} Apr 22 18:44:49.179858 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:49.179822 2577 generic.go:358] "Generic (PLEG): container finished" 
podID="e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f" containerID="20884f5c1af8977fe15f4622ed9e696ec48013e8eed8112f139475c6f8ed9f42" exitCode=0
Apr 22 18:44:49.180215 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:44:49.179897 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df" event={"ID":"e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f","Type":"ContainerDied","Data":"20884f5c1af8977fe15f4622ed9e696ec48013e8eed8112f139475c6f8ed9f42"}
Apr 22 18:45:02.224217 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:45:02.224181 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df" event={"ID":"e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f","Type":"ContainerStarted","Data":"fee0fc7de7cacc453d1d219931f094ea6f1c1976e95014f8141739da39884ef3"}
Apr 22 18:45:05.233550 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:45:05.233523 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df" event={"ID":"e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f","Type":"ContainerStarted","Data":"10431ea8e4595b75a283a5869d65cb243cdacb3d9c5a3f716eeaaee8a6aa9ffb"}
Apr 22 18:45:05.233912 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:45:05.233764 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df"
Apr 22 18:45:05.235125 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:45:05.235073 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df" podUID="e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:8080: connect: connection refused"
Apr 22 18:45:05.251878 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:45:05.251818 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df" podStartSLOduration=0.669578303 podStartE2EDuration="26.25179802s" podCreationTimestamp="2026-04-22 18:44:39 +0000 UTC" firstStartedPulling="2026-04-22 18:44:39.593102842 +0000 UTC m=+505.618716501" lastFinishedPulling="2026-04-22 18:45:05.175322566 +0000 UTC m=+531.200936218" observedRunningTime="2026-04-22 18:45:05.250779273 +0000 UTC m=+531.276392951" watchObservedRunningTime="2026-04-22 18:45:05.25179802 +0000 UTC m=+531.277411691"
Apr 22 18:45:06.237513 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:45:06.237478 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df"
Apr 22 18:45:06.237977 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:45:06.237674 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df" podUID="e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:8080: connect: connection refused"
Apr 22 18:45:06.238569 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:45:06.238546 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df" podUID="e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:45:07.240960 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:45:07.240915 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df" podUID="e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:8080: connect: connection refused"
Apr 22 18:45:07.241430 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:45:07.241229 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df" podUID="e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:45:17.241671 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:45:17.241607 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df" podUID="e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:8080: connect: connection refused"
Apr 22 18:45:17.243451 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:45:17.243423 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df" podUID="e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:45:27.241897 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:45:27.241850 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df" podUID="e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:8080: connect: connection refused"
Apr 22 18:45:27.242374 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:45:27.242343 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df" podUID="e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:45:37.241014 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:45:37.240967 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df" podUID="e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:8080: connect: connection refused"
Apr 22 18:45:37.241565 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:45:37.241409 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df" podUID="e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:45:47.241287 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:45:47.241230 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df" podUID="e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:8080: connect: connection refused"
Apr 22 18:45:47.241821 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:45:47.241796 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df" podUID="e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:45:57.240808 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:45:57.240759 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df" podUID="e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.19:8080: connect: connection refused"
Apr 22 18:45:57.241363 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:45:57.241325 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df" podUID="e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:46:07.241839 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:46:07.241805 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df"
Apr 22 18:46:07.242243 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:46:07.241871 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df"
Apr 22 18:46:14.500815 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:46:14.500782 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log"
Apr 22 18:46:14.501688 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:46:14.501671 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log"
Apr 22 18:51:14.521628 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:51:14.521592 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log"
Apr 22 18:51:14.524148 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:51:14.524124 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log"
Apr 22 18:56:14.540346 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:56:14.540317 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log"
Apr 22 18:56:14.543188 ip-10-0-132-204 kubenswrapper[2577]: I0422 18:56:14.543166 2577 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log" Apr 22 19:01:14.558816 ip-10-0-132-204 kubenswrapper[2577]: I0422 19:01:14.558713 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log" Apr 22 19:01:14.562885 ip-10-0-132-204 kubenswrapper[2577]: I0422 19:01:14.561984 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log" Apr 22 19:06:14.580472 ip-10-0-132-204 kubenswrapper[2577]: I0422 19:06:14.580353 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log" Apr 22 19:06:14.585280 ip-10-0-132-204 kubenswrapper[2577]: I0422 19:06:14.583759 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log" Apr 22 19:11:14.602023 ip-10-0-132-204 kubenswrapper[2577]: I0422 19:11:14.601920 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log" Apr 22 19:11:14.606022 ip-10-0-132-204 kubenswrapper[2577]: I0422 19:11:14.605677 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log" Apr 22 19:16:14.626179 ip-10-0-132-204 kubenswrapper[2577]: I0422 19:16:14.626065 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log" Apr 22 19:16:14.632288 ip-10-0-132-204 kubenswrapper[2577]: I0422 19:16:14.632265 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log" Apr 22 19:21:14.647016 ip-10-0-132-204 kubenswrapper[2577]: I0422 19:21:14.646921 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log" Apr 22 19:21:14.651125 ip-10-0-132-204 kubenswrapper[2577]: I0422 19:21:14.651109 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log" Apr 22 19:26:14.665331 ip-10-0-132-204 kubenswrapper[2577]: I0422 19:26:14.665200 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log" Apr 22 19:26:14.669565 ip-10-0-132-204 kubenswrapper[2577]: I0422 19:26:14.669546 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log" Apr 22 19:31:14.683887 ip-10-0-132-204 kubenswrapper[2577]: I0422 19:31:14.683789 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log" Apr 22 19:31:14.688785 ip-10-0-132-204 kubenswrapper[2577]: I0422 19:31:14.688767 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log" Apr 22 19:36:14.702144 ip-10-0-132-204 kubenswrapper[2577]: I0422 19:36:14.702120 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log" Apr 22 19:36:14.707541 ip-10-0-132-204 kubenswrapper[2577]: I0422 19:36:14.707522 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log" Apr 22 19:41:14.720041 ip-10-0-132-204 kubenswrapper[2577]: I0422 19:41:14.719936 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log" Apr 22 19:41:14.726256 ip-10-0-132-204 kubenswrapper[2577]: I0422 19:41:14.726234 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log" Apr 22 19:46:14.738168 ip-10-0-132-204 kubenswrapper[2577]: I0422 19:46:14.738067 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log" Apr 22 19:46:14.744766 ip-10-0-132-204 kubenswrapper[2577]: I0422 19:46:14.744741 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log" Apr 22 19:51:14.758173 ip-10-0-132-204 kubenswrapper[2577]: I0422 19:51:14.758078 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log" Apr 22 19:51:14.769123 ip-10-0-132-204 kubenswrapper[2577]: I0422 19:51:14.769097 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log" Apr 22 19:56:14.774901 ip-10-0-132-204 kubenswrapper[2577]: I0422 19:56:14.774807 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log" Apr 22 19:56:14.787741 ip-10-0-132-204 kubenswrapper[2577]: I0422 19:56:14.787718 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log"
Apr 22 20:01:14.792850 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:01:14.792743 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log"
Apr 22 20:01:14.806582 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:01:14.806554 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log"
Apr 22 20:06:14.810240 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:06:14.810137 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log"
Apr 22 20:06:14.832308 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:06:14.832286 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log"
Apr 22 20:07:44.270245 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:07:44.270170 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jqggs/must-gather-4plx5"]
Apr 22 20:07:44.273160 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:07:44.273143 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jqggs/must-gather-4plx5"
Apr 22 20:07:44.275734 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:07:44.275714 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jqggs\"/\"openshift-service-ca.crt\""
Apr 22 20:07:44.275826 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:07:44.275751 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jqggs\"/\"kube-root-ca.crt\""
Apr 22 20:07:44.289891 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:07:44.289865 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jqggs/must-gather-4plx5"]
Apr 22 20:07:44.344228 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:07:44.344198 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/13cd4c49-abf7-4ece-89e6-52649bee0184-must-gather-output\") pod \"must-gather-4plx5\" (UID: \"13cd4c49-abf7-4ece-89e6-52649bee0184\") " pod="openshift-must-gather-jqggs/must-gather-4plx5"
Apr 22 20:07:44.344228 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:07:44.344231 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vjbs\" (UniqueName: \"kubernetes.io/projected/13cd4c49-abf7-4ece-89e6-52649bee0184-kube-api-access-2vjbs\") pod \"must-gather-4plx5\" (UID: \"13cd4c49-abf7-4ece-89e6-52649bee0184\") " pod="openshift-must-gather-jqggs/must-gather-4plx5"
Apr 22 20:07:44.444938 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:07:44.444910 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/13cd4c49-abf7-4ece-89e6-52649bee0184-must-gather-output\") pod \"must-gather-4plx5\" (UID: \"13cd4c49-abf7-4ece-89e6-52649bee0184\") " pod="openshift-must-gather-jqggs/must-gather-4plx5"
Apr 22 20:07:44.444938 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:07:44.444942 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vjbs\" (UniqueName: \"kubernetes.io/projected/13cd4c49-abf7-4ece-89e6-52649bee0184-kube-api-access-2vjbs\") pod \"must-gather-4plx5\" (UID: \"13cd4c49-abf7-4ece-89e6-52649bee0184\") " pod="openshift-must-gather-jqggs/must-gather-4plx5"
Apr 22 20:07:44.445233 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:07:44.445216 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/13cd4c49-abf7-4ece-89e6-52649bee0184-must-gather-output\") pod \"must-gather-4plx5\" (UID: \"13cd4c49-abf7-4ece-89e6-52649bee0184\") " pod="openshift-must-gather-jqggs/must-gather-4plx5"
Apr 22 20:07:44.453924 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:07:44.453904 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vjbs\" (UniqueName: \"kubernetes.io/projected/13cd4c49-abf7-4ece-89e6-52649bee0184-kube-api-access-2vjbs\") pod \"must-gather-4plx5\" (UID: \"13cd4c49-abf7-4ece-89e6-52649bee0184\") " pod="openshift-must-gather-jqggs/must-gather-4plx5"
Apr 22 20:07:44.597005 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:07:44.596921 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jqggs/must-gather-4plx5"
Apr 22 20:07:44.711052 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:07:44.711025 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jqggs/must-gather-4plx5"]
Apr 22 20:07:44.713473 ip-10-0-132-204 kubenswrapper[2577]: W0422 20:07:44.713442 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13cd4c49_abf7_4ece_89e6_52649bee0184.slice/crio-bdef4df12264ed3bea953481168e994fcf15c1dfe7beb6957d74177b9dfb3587 WatchSource:0}: Error finding container bdef4df12264ed3bea953481168e994fcf15c1dfe7beb6957d74177b9dfb3587: Status 404 returned error can't find the container with id bdef4df12264ed3bea953481168e994fcf15c1dfe7beb6957d74177b9dfb3587
Apr 22 20:07:44.715098 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:07:44.715082 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 20:07:45.332927 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:07:45.332895 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jqggs/must-gather-4plx5" event={"ID":"13cd4c49-abf7-4ece-89e6-52649bee0184","Type":"ContainerStarted","Data":"bdef4df12264ed3bea953481168e994fcf15c1dfe7beb6957d74177b9dfb3587"}
Apr 22 20:07:50.350035 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:07:50.349999 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jqggs/must-gather-4plx5" event={"ID":"13cd4c49-abf7-4ece-89e6-52649bee0184","Type":"ContainerStarted","Data":"cda6a8576861f7ed3e92502bb5c86ec32977cc481062920d0b20db134005b669"}
Apr 22 20:07:50.350035 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:07:50.350038 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jqggs/must-gather-4plx5"
event={"ID":"13cd4c49-abf7-4ece-89e6-52649bee0184","Type":"ContainerStarted","Data":"4ba2d000a7165d36e201e62331226ff77c62649ac04b955862a33d8523d0c0ff"} Apr 22 20:07:50.366660 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:07:50.366582 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jqggs/must-gather-4plx5" podStartSLOduration=1.314874632 podStartE2EDuration="6.36656265s" podCreationTimestamp="2026-04-22 20:07:44 +0000 UTC" firstStartedPulling="2026-04-22 20:07:44.715222995 +0000 UTC m=+5490.740836656" lastFinishedPulling="2026-04-22 20:07:49.766911026 +0000 UTC m=+5495.792524674" observedRunningTime="2026-04-22 20:07:50.366172921 +0000 UTC m=+5496.391786589" watchObservedRunningTime="2026-04-22 20:07:50.36656265 +0000 UTC m=+5496.392176318" Apr 22 20:07:58.393563 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:07:58.393536 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/kserve-container/0.log" Apr 22 20:07:58.419606 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:07:58.419581 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/agent/0.log" Apr 22 20:07:58.432001 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:07:58.431977 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/storage-initializer/0.log" Apr 22 20:07:58.896440 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:07:58.896410 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/kserve-container/0.log" Apr 22 20:07:58.909043 ip-10-0-132-204 kubenswrapper[2577]: I0422 
20:07:58.909021 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/agent/0.log" Apr 22 20:07:58.920983 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:07:58.920959 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/storage-initializer/0.log" Apr 22 20:07:59.447473 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:07:59.447440 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/kserve-container/0.log" Apr 22 20:07:59.460924 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:07:59.460901 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/agent/0.log" Apr 22 20:07:59.472753 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:07:59.472727 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/storage-initializer/0.log" Apr 22 20:07:59.937062 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:07:59.936997 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/kserve-container/0.log" Apr 22 20:07:59.952770 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:07:59.952739 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/agent/0.log" Apr 22 20:07:59.965245 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:07:59.965226 2577 log.go:25] 
"Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/storage-initializer/0.log" Apr 22 20:08:00.429406 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:00.429373 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/kserve-container/0.log" Apr 22 20:08:00.441539 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:00.441512 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/agent/0.log" Apr 22 20:08:00.453202 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:00.453181 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/storage-initializer/0.log" Apr 22 20:08:00.920253 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:00.920228 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/kserve-container/0.log" Apr 22 20:08:00.939553 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:00.939528 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/agent/0.log" Apr 22 20:08:00.959859 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:00.959838 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/storage-initializer/0.log" Apr 22 20:08:01.424174 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:01.424146 2577 log.go:25] "Finished parsing log 
file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/kserve-container/0.log" Apr 22 20:08:01.436737 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:01.436716 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/agent/0.log" Apr 22 20:08:01.448426 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:01.448406 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/storage-initializer/0.log" Apr 22 20:08:01.909970 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:01.909945 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/kserve-container/0.log" Apr 22 20:08:01.924843 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:01.924817 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/agent/0.log" Apr 22 20:08:01.937995 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:01.937973 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/storage-initializer/0.log" Apr 22 20:08:02.391135 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:02.391107 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/kserve-container/0.log" Apr 22 20:08:02.403820 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:02.403798 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/agent/0.log" Apr 22 20:08:02.415646 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:02.415630 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/storage-initializer/0.log" Apr 22 20:08:02.877566 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:02.877542 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/kserve-container/0.log" Apr 22 20:08:02.891312 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:02.891287 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/agent/0.log" Apr 22 20:08:02.904490 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:02.904467 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/storage-initializer/0.log" Apr 22 20:08:03.370068 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:03.370036 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/kserve-container/0.log" Apr 22 20:08:03.382680 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:03.382635 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/agent/0.log" Apr 22 20:08:03.394682 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:03.394663 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/storage-initializer/0.log" Apr 22 20:08:03.851029 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:03.851002 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/kserve-container/0.log" Apr 22 20:08:03.864738 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:03.864698 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/agent/0.log" Apr 22 20:08:03.876357 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:03.876333 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/storage-initializer/0.log" Apr 22 20:08:04.322804 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:04.322776 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/kserve-container/0.log" Apr 22 20:08:04.334716 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:04.334693 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/agent/0.log" Apr 22 20:08:04.346082 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:04.346056 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/storage-initializer/0.log" Apr 22 20:08:04.803940 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:04.803915 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/kserve-container/0.log" Apr 22 20:08:04.815389 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:04.815367 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/agent/0.log" Apr 22 20:08:04.826535 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:04.826517 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/storage-initializer/0.log" Apr 22 20:08:05.281048 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:05.281020 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/kserve-container/0.log" Apr 22 20:08:05.293335 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:05.293310 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/agent/0.log" Apr 22 20:08:05.304646 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:05.304627 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/storage-initializer/0.log" Apr 22 20:08:05.781992 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:05.781967 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/kserve-container/0.log" Apr 22 20:08:05.794558 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:05.794531 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/agent/0.log" Apr 22 20:08:05.806391 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:05.806372 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/storage-initializer/0.log" Apr 22 20:08:06.314562 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:06.314523 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/kserve-container/0.log" Apr 22 20:08:06.327254 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:06.327230 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/agent/0.log" Apr 22 20:08:06.339473 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:06.339455 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/storage-initializer/0.log" Apr 22 20:08:06.827380 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:06.827349 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/kserve-container/0.log" Apr 22 20:08:06.838667 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:06.838620 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/agent/0.log" Apr 22 20:08:06.850161 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:06.850138 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/storage-initializer/0.log" Apr 22 20:08:07.353088 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:07.353055 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/kserve-container/0.log" Apr 22 20:08:07.365150 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:07.365132 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/agent/0.log" Apr 22 20:08:07.377426 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:07.377411 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/storage-initializer/0.log" Apr 22 20:08:07.835498 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:07.835472 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/kserve-container/0.log" Apr 22 20:08:07.848134 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:07.848111 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/agent/0.log" Apr 22 20:08:07.859837 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:07.859815 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/storage-initializer/0.log" Apr 22 20:08:08.306701 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:08.306674 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/kserve-container/0.log" Apr 22 20:08:08.318914 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:08.318893 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/agent/0.log" Apr 22 20:08:08.330835 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:08.330813 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-raw-sklearn-batcher-ccfea-predictor-6d6f8b4979-6b8df_e931c0d3-7b89-4ce9-9f68-ed2e85b10e2f/storage-initializer/0.log" Apr 22 20:08:09.411424 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:09.411392 2577 generic.go:358] "Generic (PLEG): container finished" podID="13cd4c49-abf7-4ece-89e6-52649bee0184" containerID="4ba2d000a7165d36e201e62331226ff77c62649ac04b955862a33d8523d0c0ff" exitCode=0 Apr 22 20:08:09.411838 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:09.411463 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jqggs/must-gather-4plx5" event={"ID":"13cd4c49-abf7-4ece-89e6-52649bee0184","Type":"ContainerDied","Data":"4ba2d000a7165d36e201e62331226ff77c62649ac04b955862a33d8523d0c0ff"} Apr 22 20:08:09.411838 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:09.411770 2577 scope.go:117] "RemoveContainer" containerID="4ba2d000a7165d36e201e62331226ff77c62649ac04b955862a33d8523d0c0ff" Apr 22 20:08:09.668011 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:09.667937 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jqggs_must-gather-4plx5_13cd4c49-abf7-4ece-89e6-52649bee0184/gather/0.log" Apr 22 20:08:12.943763 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:12.943736 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-98mhg_ab4eb5fe-f1c7-4f29-886f-c980c0fff0e0/global-pull-secret-syncer/0.log" Apr 22 20:08:13.210537 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:13.210462 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-w5xvx_60e1bb1d-071b-40c6-aeca-a00ab7fbb348/konnectivity-agent/0.log" Apr 22 20:08:13.297615 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:13.297584 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-204.ec2.internal_ffa9aef1222b3a9934de39cb15b8e512/haproxy/0.log" Apr 22 20:08:15.003588 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:15.003553 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jqggs/must-gather-4plx5"] Apr 22 20:08:15.004092 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:15.003818 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-jqggs/must-gather-4plx5" podUID="13cd4c49-abf7-4ece-89e6-52649bee0184" containerName="copy" containerID="cri-o://cda6a8576861f7ed3e92502bb5c86ec32977cc481062920d0b20db134005b669" gracePeriod=2 Apr 22 20:08:15.005839 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:15.005818 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jqggs/must-gather-4plx5"] Apr 22 20:08:15.005935 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:15.005875 2577 status_manager.go:895] "Failed to get status for pod" podUID="13cd4c49-abf7-4ece-89e6-52649bee0184" pod="openshift-must-gather-jqggs/must-gather-4plx5" err="pods \"must-gather-4plx5\" is forbidden: User \"system:node:ip-10-0-132-204.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-jqggs\": no relationship found between node 'ip-10-0-132-204.ec2.internal' and this object" Apr 22 20:08:15.231286 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:15.231265 2577 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jqggs_must-gather-4plx5_13cd4c49-abf7-4ece-89e6-52649bee0184/copy/0.log" Apr 22 20:08:15.231616 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:15.231602 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jqggs/must-gather-4plx5" Apr 22 20:08:15.233908 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:15.233885 2577 status_manager.go:895] "Failed to get status for pod" podUID="13cd4c49-abf7-4ece-89e6-52649bee0184" pod="openshift-must-gather-jqggs/must-gather-4plx5" err="pods \"must-gather-4plx5\" is forbidden: User \"system:node:ip-10-0-132-204.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-jqggs\": no relationship found between node 'ip-10-0-132-204.ec2.internal' and this object" Apr 22 20:08:15.293850 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:15.293797 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vjbs\" (UniqueName: \"kubernetes.io/projected/13cd4c49-abf7-4ece-89e6-52649bee0184-kube-api-access-2vjbs\") pod \"13cd4c49-abf7-4ece-89e6-52649bee0184\" (UID: \"13cd4c49-abf7-4ece-89e6-52649bee0184\") " Apr 22 20:08:15.293850 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:15.293828 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/13cd4c49-abf7-4ece-89e6-52649bee0184-must-gather-output\") pod \"13cd4c49-abf7-4ece-89e6-52649bee0184\" (UID: \"13cd4c49-abf7-4ece-89e6-52649bee0184\") " Apr 22 20:08:15.295692 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:15.295668 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13cd4c49-abf7-4ece-89e6-52649bee0184-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "13cd4c49-abf7-4ece-89e6-52649bee0184" (UID: 
"13cd4c49-abf7-4ece-89e6-52649bee0184"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:08:15.295871 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:15.295851 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13cd4c49-abf7-4ece-89e6-52649bee0184-kube-api-access-2vjbs" (OuterVolumeSpecName: "kube-api-access-2vjbs") pod "13cd4c49-abf7-4ece-89e6-52649bee0184" (UID: "13cd4c49-abf7-4ece-89e6-52649bee0184"). InnerVolumeSpecName "kube-api-access-2vjbs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:08:15.394502 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:15.394473 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2vjbs\" (UniqueName: \"kubernetes.io/projected/13cd4c49-abf7-4ece-89e6-52649bee0184-kube-api-access-2vjbs\") on node \"ip-10-0-132-204.ec2.internal\" DevicePath \"\"" Apr 22 20:08:15.394502 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:15.394496 2577 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/13cd4c49-abf7-4ece-89e6-52649bee0184-must-gather-output\") on node \"ip-10-0-132-204.ec2.internal\" DevicePath \"\"" Apr 22 20:08:15.428295 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:15.428262 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jqggs_must-gather-4plx5_13cd4c49-abf7-4ece-89e6-52649bee0184/copy/0.log" Apr 22 20:08:15.428618 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:15.428595 2577 generic.go:358] "Generic (PLEG): container finished" podID="13cd4c49-abf7-4ece-89e6-52649bee0184" containerID="cda6a8576861f7ed3e92502bb5c86ec32977cc481062920d0b20db134005b669" exitCode=143 Apr 22 20:08:15.428698 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:15.428646 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jqggs/must-gather-4plx5" Apr 22 20:08:15.428754 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:15.428704 2577 scope.go:117] "RemoveContainer" containerID="cda6a8576861f7ed3e92502bb5c86ec32977cc481062920d0b20db134005b669" Apr 22 20:08:15.431056 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:15.431032 2577 status_manager.go:895] "Failed to get status for pod" podUID="13cd4c49-abf7-4ece-89e6-52649bee0184" pod="openshift-must-gather-jqggs/must-gather-4plx5" err="pods \"must-gather-4plx5\" is forbidden: User \"system:node:ip-10-0-132-204.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-jqggs\": no relationship found between node 'ip-10-0-132-204.ec2.internal' and this object" Apr 22 20:08:15.436283 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:15.436260 2577 scope.go:117] "RemoveContainer" containerID="4ba2d000a7165d36e201e62331226ff77c62649ac04b955862a33d8523d0c0ff" Apr 22 20:08:15.438699 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:15.438669 2577 status_manager.go:895] "Failed to get status for pod" podUID="13cd4c49-abf7-4ece-89e6-52649bee0184" pod="openshift-must-gather-jqggs/must-gather-4plx5" err="pods \"must-gather-4plx5\" is forbidden: User \"system:node:ip-10-0-132-204.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-jqggs\": no relationship found between node 'ip-10-0-132-204.ec2.internal' and this object" Apr 22 20:08:15.447676 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:15.447644 2577 scope.go:117] "RemoveContainer" containerID="cda6a8576861f7ed3e92502bb5c86ec32977cc481062920d0b20db134005b669" Apr 22 20:08:15.447929 ip-10-0-132-204 kubenswrapper[2577]: E0422 20:08:15.447909 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cda6a8576861f7ed3e92502bb5c86ec32977cc481062920d0b20db134005b669\": container with ID 
starting with cda6a8576861f7ed3e92502bb5c86ec32977cc481062920d0b20db134005b669 not found: ID does not exist" containerID="cda6a8576861f7ed3e92502bb5c86ec32977cc481062920d0b20db134005b669" Apr 22 20:08:15.447968 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:15.447939 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cda6a8576861f7ed3e92502bb5c86ec32977cc481062920d0b20db134005b669"} err="failed to get container status \"cda6a8576861f7ed3e92502bb5c86ec32977cc481062920d0b20db134005b669\": rpc error: code = NotFound desc = could not find container \"cda6a8576861f7ed3e92502bb5c86ec32977cc481062920d0b20db134005b669\": container with ID starting with cda6a8576861f7ed3e92502bb5c86ec32977cc481062920d0b20db134005b669 not found: ID does not exist" Apr 22 20:08:15.447968 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:15.447959 2577 scope.go:117] "RemoveContainer" containerID="4ba2d000a7165d36e201e62331226ff77c62649ac04b955862a33d8523d0c0ff" Apr 22 20:08:15.448206 ip-10-0-132-204 kubenswrapper[2577]: E0422 20:08:15.448187 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ba2d000a7165d36e201e62331226ff77c62649ac04b955862a33d8523d0c0ff\": container with ID starting with 4ba2d000a7165d36e201e62331226ff77c62649ac04b955862a33d8523d0c0ff not found: ID does not exist" containerID="4ba2d000a7165d36e201e62331226ff77c62649ac04b955862a33d8523d0c0ff" Apr 22 20:08:15.448241 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:15.448214 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ba2d000a7165d36e201e62331226ff77c62649ac04b955862a33d8523d0c0ff"} err="failed to get container status \"4ba2d000a7165d36e201e62331226ff77c62649ac04b955862a33d8523d0c0ff\": rpc error: code = NotFound desc = could not find container \"4ba2d000a7165d36e201e62331226ff77c62649ac04b955862a33d8523d0c0ff\": container with ID starting with 
4ba2d000a7165d36e201e62331226ff77c62649ac04b955862a33d8523d0c0ff not found: ID does not exist" Apr 22 20:08:16.602419 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:16.602387 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13cd4c49-abf7-4ece-89e6-52649bee0184" path="/var/lib/kubelet/pods/13cd4c49-abf7-4ece-89e6-52649bee0184/volumes" Apr 22 20:08:17.073482 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:17.073451 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k4rxg_7c28ba8d-9997-484f-aee0-0eea8d4cad96/node-exporter/0.log" Apr 22 20:08:17.096417 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:17.096393 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k4rxg_7c28ba8d-9997-484f-aee0-0eea8d4cad96/kube-rbac-proxy/0.log" Apr 22 20:08:17.139565 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:17.139502 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k4rxg_7c28ba8d-9997-484f-aee0-0eea8d4cad96/init-textfile/0.log" Apr 22 20:08:17.362850 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:17.362825 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cd45a847-7032-495b-b51e-b9e0b7a5ae87/prometheus/0.log" Apr 22 20:08:17.382424 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:17.382402 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cd45a847-7032-495b-b51e-b9e0b7a5ae87/config-reloader/0.log" Apr 22 20:08:17.407144 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:17.407086 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cd45a847-7032-495b-b51e-b9e0b7a5ae87/thanos-sidecar/0.log" Apr 22 20:08:17.431658 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:17.431631 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cd45a847-7032-495b-b51e-b9e0b7a5ae87/kube-rbac-proxy-web/0.log" Apr 22 20:08:17.456337 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:17.456304 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cd45a847-7032-495b-b51e-b9e0b7a5ae87/kube-rbac-proxy/0.log" Apr 22 20:08:17.481289 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:17.481266 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cd45a847-7032-495b-b51e-b9e0b7a5ae87/kube-rbac-proxy-thanos/0.log" Apr 22 20:08:17.511784 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:17.511762 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cd45a847-7032-495b-b51e-b9e0b7a5ae87/init-config-reloader/0.log" Apr 22 20:08:17.594959 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:17.594933 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-hxdbt_408f3706-1f36-4dbb-83f7-1fd0bbfec3bb/prometheus-operator-admission-webhook/0.log" Apr 22 20:08:18.932208 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:18.932133 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-ngrmj_26b34a86-850d-4fe7-87a3-10448b9b73a3/networking-console-plugin/0.log" Apr 22 20:08:19.769196 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:19.769166 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-m9jlg_48628ae2-0b18-47ec-a9fe-e7dd053efb08/download-server/0.log" Apr 22 20:08:19.865793 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:19.865759 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k2gsp/perf-node-gather-daemonset-qthrz"] Apr 22 20:08:19.866057 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:19.866045 2577 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13cd4c49-abf7-4ece-89e6-52649bee0184" containerName="copy" Apr 22 20:08:19.866105 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:19.866058 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="13cd4c49-abf7-4ece-89e6-52649bee0184" containerName="copy" Apr 22 20:08:19.866105 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:19.866070 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13cd4c49-abf7-4ece-89e6-52649bee0184" containerName="gather" Apr 22 20:08:19.866105 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:19.866076 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="13cd4c49-abf7-4ece-89e6-52649bee0184" containerName="gather" Apr 22 20:08:19.866200 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:19.866131 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="13cd4c49-abf7-4ece-89e6-52649bee0184" containerName="copy" Apr 22 20:08:19.866200 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:19.866142 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="13cd4c49-abf7-4ece-89e6-52649bee0184" containerName="gather" Apr 22 20:08:19.871136 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:19.871117 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-qthrz" Apr 22 20:08:19.873497 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:19.873475 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-k2gsp\"/\"kube-root-ca.crt\"" Apr 22 20:08:19.873611 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:19.873588 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-k2gsp\"/\"openshift-service-ca.crt\"" Apr 22 20:08:19.874829 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:19.874808 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-k2gsp\"/\"default-dockercfg-dszbq\"" Apr 22 20:08:19.878470 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:19.878412 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k2gsp/perf-node-gather-daemonset-qthrz"] Apr 22 20:08:20.033152 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:20.033058 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2f50f274-6c2c-434a-97d4-9bd17d8b4d1d-proc\") pod \"perf-node-gather-daemonset-qthrz\" (UID: \"2f50f274-6c2c-434a-97d4-9bd17d8b4d1d\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-qthrz" Apr 22 20:08:20.033152 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:20.033093 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2f50f274-6c2c-434a-97d4-9bd17d8b4d1d-podres\") pod \"perf-node-gather-daemonset-qthrz\" (UID: \"2f50f274-6c2c-434a-97d4-9bd17d8b4d1d\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-qthrz" Apr 22 20:08:20.033152 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:20.033114 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-2n4m7\" (UniqueName: \"kubernetes.io/projected/2f50f274-6c2c-434a-97d4-9bd17d8b4d1d-kube-api-access-2n4m7\") pod \"perf-node-gather-daemonset-qthrz\" (UID: \"2f50f274-6c2c-434a-97d4-9bd17d8b4d1d\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-qthrz" Apr 22 20:08:20.033152 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:20.033134 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2f50f274-6c2c-434a-97d4-9bd17d8b4d1d-lib-modules\") pod \"perf-node-gather-daemonset-qthrz\" (UID: \"2f50f274-6c2c-434a-97d4-9bd17d8b4d1d\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-qthrz" Apr 22 20:08:20.033644 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:20.033200 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2f50f274-6c2c-434a-97d4-9bd17d8b4d1d-sys\") pod \"perf-node-gather-daemonset-qthrz\" (UID: \"2f50f274-6c2c-434a-97d4-9bd17d8b4d1d\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-qthrz" Apr 22 20:08:20.134452 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:20.134415 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2f50f274-6c2c-434a-97d4-9bd17d8b4d1d-proc\") pod \"perf-node-gather-daemonset-qthrz\" (UID: \"2f50f274-6c2c-434a-97d4-9bd17d8b4d1d\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-qthrz" Apr 22 20:08:20.134670 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:20.134461 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2f50f274-6c2c-434a-97d4-9bd17d8b4d1d-podres\") pod \"perf-node-gather-daemonset-qthrz\" (UID: \"2f50f274-6c2c-434a-97d4-9bd17d8b4d1d\") " 
pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-qthrz" Apr 22 20:08:20.134670 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:20.134492 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2n4m7\" (UniqueName: \"kubernetes.io/projected/2f50f274-6c2c-434a-97d4-9bd17d8b4d1d-kube-api-access-2n4m7\") pod \"perf-node-gather-daemonset-qthrz\" (UID: \"2f50f274-6c2c-434a-97d4-9bd17d8b4d1d\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-qthrz" Apr 22 20:08:20.134670 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:20.134525 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2f50f274-6c2c-434a-97d4-9bd17d8b4d1d-lib-modules\") pod \"perf-node-gather-daemonset-qthrz\" (UID: \"2f50f274-6c2c-434a-97d4-9bd17d8b4d1d\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-qthrz" Apr 22 20:08:20.134670 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:20.134540 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2f50f274-6c2c-434a-97d4-9bd17d8b4d1d-proc\") pod \"perf-node-gather-daemonset-qthrz\" (UID: \"2f50f274-6c2c-434a-97d4-9bd17d8b4d1d\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-qthrz" Apr 22 20:08:20.134670 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:20.134567 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2f50f274-6c2c-434a-97d4-9bd17d8b4d1d-sys\") pod \"perf-node-gather-daemonset-qthrz\" (UID: \"2f50f274-6c2c-434a-97d4-9bd17d8b4d1d\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-qthrz" Apr 22 20:08:20.134871 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:20.134688 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/2f50f274-6c2c-434a-97d4-9bd17d8b4d1d-podres\") pod \"perf-node-gather-daemonset-qthrz\" (UID: \"2f50f274-6c2c-434a-97d4-9bd17d8b4d1d\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-qthrz" Apr 22 20:08:20.134871 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:20.134706 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2f50f274-6c2c-434a-97d4-9bd17d8b4d1d-sys\") pod \"perf-node-gather-daemonset-qthrz\" (UID: \"2f50f274-6c2c-434a-97d4-9bd17d8b4d1d\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-qthrz" Apr 22 20:08:20.134871 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:20.134725 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2f50f274-6c2c-434a-97d4-9bd17d8b4d1d-lib-modules\") pod \"perf-node-gather-daemonset-qthrz\" (UID: \"2f50f274-6c2c-434a-97d4-9bd17d8b4d1d\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-qthrz" Apr 22 20:08:20.143288 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:20.143263 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n4m7\" (UniqueName: \"kubernetes.io/projected/2f50f274-6c2c-434a-97d4-9bd17d8b4d1d-kube-api-access-2n4m7\") pod \"perf-node-gather-daemonset-qthrz\" (UID: \"2f50f274-6c2c-434a-97d4-9bd17d8b4d1d\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-qthrz" Apr 22 20:08:20.182923 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:20.182882 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-qthrz"
Apr 22 20:08:20.302141 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:20.302066 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k2gsp/perf-node-gather-daemonset-qthrz"]
Apr 22 20:08:20.305291 ip-10-0-132-204 kubenswrapper[2577]: W0422 20:08:20.305262 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2f50f274_6c2c_434a_97d4_9bd17d8b4d1d.slice/crio-465f3fa340af6097096d2c5b96c80df64264c2bb406a83adfecdccad9180803c WatchSource:0}: Error finding container 465f3fa340af6097096d2c5b96c80df64264c2bb406a83adfecdccad9180803c: Status 404 returned error can't find the container with id 465f3fa340af6097096d2c5b96c80df64264c2bb406a83adfecdccad9180803c
Apr 22 20:08:20.447287 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:20.447260 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-qthrz" event={"ID":"2f50f274-6c2c-434a-97d4-9bd17d8b4d1d","Type":"ContainerStarted","Data":"465f3fa340af6097096d2c5b96c80df64264c2bb406a83adfecdccad9180803c"}
Apr 22 20:08:20.852356 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:20.852324 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-msgd4_9e36af8a-7247-492e-8451-b00362b2dcac/dns/0.log"
Apr 22 20:08:20.876376 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:20.876342 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-msgd4_9e36af8a-7247-492e-8451-b00362b2dcac/kube-rbac-proxy/0.log"
Apr 22 20:08:21.006323 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:21.006295 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-q9vgr_5f2ded72-fa8f-446c-863c-2699e86ba162/dns-node-resolver/0.log"
Apr 22 20:08:21.451339 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:21.451305 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-qthrz" event={"ID":"2f50f274-6c2c-434a-97d4-9bd17d8b4d1d","Type":"ContainerStarted","Data":"9f38dc478a520aa2b91446bca9d6f4c6e4ac1d2b69467fec50bc45dcc3d76791"}
Apr 22 20:08:21.451807 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:21.451453 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-qthrz"
Apr 22 20:08:21.468228 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:21.468184 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-qthrz" podStartSLOduration=2.468170988 podStartE2EDuration="2.468170988s" podCreationTimestamp="2026-04-22 20:08:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:08:21.466968085 +0000 UTC m=+5527.492581759" watchObservedRunningTime="2026-04-22 20:08:21.468170988 +0000 UTC m=+5527.493784649"
Apr 22 20:08:21.476344 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:21.476317 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-b9dcb7fbd-ss8dh_519eae9a-0267-4d70-8881-49f12042815a/registry/0.log"
Apr 22 20:08:21.500301 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:21.500276 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fgx5l_5fcd2fa1-b6e0-4480-a123-665c7fabd2bf/node-ca/0.log"
Apr 22 20:08:22.645138 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:22.645111 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-k9tbm_3d556130-66df-4a4c-baae-5f7294e0bc35/serve-healthcheck-canary/0.log"
Apr 22 20:08:23.084524 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:23.084485 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-btj9l_b623c724-516c-4927-bf88-44d34adfbf95/kube-rbac-proxy/0.log"
Apr 22 20:08:23.110346 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:23.110317 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-btj9l_b623c724-516c-4927-bf88-44d34adfbf95/exporter/0.log"
Apr 22 20:08:23.135509 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:23.135488 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-btj9l_b623c724-516c-4927-bf88-44d34adfbf95/extractor/0.log"
Apr 22 20:08:25.296635 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:25.296604 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-h2t5j_f4f5ea8e-b41a-47a6-ab1c-c19ecadedc95/s3-init/0.log"
Apr 22 20:08:25.325730 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:25.325700 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-97mxz_25b6955b-2295-487f-9600-8e972c3be106/seaweedfs/0.log"
Apr 22 20:08:27.463530 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:27.463499 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-qthrz"
Apr 22 20:08:29.291396 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:29.291373 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-98ph8_cc3ce331-86aa-4c86-b73e-a6350ae8d3a1/migrator/0.log"
Apr 22 20:08:29.319632 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:29.319609 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-98ph8_cc3ce331-86aa-4c86-b73e-a6350ae8d3a1/graceful-termination/0.log"
Apr 22 20:08:30.703821 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:30.703793 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2qq9r_8f88ea3a-0984-4a92-a346-4c9f9b5cbdc2/kube-multus/0.log"
Apr 22 20:08:30.913188 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:30.913160 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-knnz5_1ca4dacd-dae4-4a9c-a9f6-96d152edbeac/kube-multus-additional-cni-plugins/0.log"
Apr 22 20:08:30.957456 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:30.957422 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-knnz5_1ca4dacd-dae4-4a9c-a9f6-96d152edbeac/egress-router-binary-copy/0.log"
Apr 22 20:08:31.008031 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:31.008008 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-knnz5_1ca4dacd-dae4-4a9c-a9f6-96d152edbeac/cni-plugins/0.log"
Apr 22 20:08:31.052351 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:31.052283 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-knnz5_1ca4dacd-dae4-4a9c-a9f6-96d152edbeac/bond-cni-plugin/0.log"
Apr 22 20:08:31.095372 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:31.095344 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-knnz5_1ca4dacd-dae4-4a9c-a9f6-96d152edbeac/routeoverride-cni/0.log"
Apr 22 20:08:31.140362 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:31.140335 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-knnz5_1ca4dacd-dae4-4a9c-a9f6-96d152edbeac/whereabouts-cni-bincopy/0.log"
Apr 22 20:08:31.192348 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:31.192315 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-knnz5_1ca4dacd-dae4-4a9c-a9f6-96d152edbeac/whereabouts-cni/0.log"
Apr 22 20:08:31.510071 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:31.510044 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9t8m2_617c16ac-507f-45a2-ab75-d583c7798ca1/network-metrics-daemon/0.log"
Apr 22 20:08:31.559341 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:31.559313 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9t8m2_617c16ac-507f-45a2-ab75-d583c7798ca1/kube-rbac-proxy/0.log"
Apr 22 20:08:32.912359 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:32.912331 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-controller/0.log"
Apr 22 20:08:32.936737 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:32.936694 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/0.log"
Apr 22 20:08:32.962934 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:32.962906 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovn-acl-logging/1.log"
Apr 22 20:08:32.983426 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:32.983404 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/kube-rbac-proxy-node/0.log"
Apr 22 20:08:33.008278 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:33.008257 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 20:08:33.032403 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:33.032375 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/northd/0.log"
Apr 22 20:08:33.058111 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:33.058086 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/nbdb/0.log"
Apr 22 20:08:33.082930 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:33.082908 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/sbdb/0.log"
Apr 22 20:08:33.178830 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:33.178757 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r65js_97302881-ab30-4630-9df7-e7796d6aaedf/ovnkube-controller/0.log"
Apr 22 20:08:34.461917 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:34.461891 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-dt6rs_a97e9a8d-908d-409a-b079-4fee4c52cdcd/network-check-target-container/0.log"
Apr 22 20:08:35.457925 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:35.457900 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-xhp7q_b6d9430f-dd4a-4cf0-9c59-1a20bf462166/iptables-alerter/0.log"
Apr 22 20:08:36.138400 ip-10-0-132-204 kubenswrapper[2577]: I0422 20:08:36.138371 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-l256r_66e01385-0150-4908-a5b3-deafda8e4e26/tuned/0.log"