Apr 16 18:15:28.519809 ip-10-0-130-205 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:15:28.982298 ip-10-0-130-205 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:15:28.982298 ip-10-0-130-205 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:15:28.982298 ip-10-0-130-205 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:15:28.982298 ip-10-0-130-205 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:15:28.982298 ip-10-0-130-205 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:15:28.983419 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.982345 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:15:28.988217 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988200 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:15:28.988217 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988215 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:15:28.988281 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988220 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:15:28.988281 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988223 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:15:28.988281 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988226 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:15:28.988281 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988229 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:15:28.988281 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988232 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:15:28.988281 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988235 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:15:28.988281 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988237 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:15:28.988281 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988241 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:15:28.988281 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988243 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:15:28.988281 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988246 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:15:28.988281 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988249 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:15:28.988281 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988251 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:15:28.988281 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988254 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:15:28.988281 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988256 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:15:28.988281 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988259 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:15:28.988281 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988262 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:15:28.988281 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988268 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:15:28.988281 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988273 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:15:28.988281 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988277 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:15:28.988281 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988280 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:15:28.988770 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988282 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:15:28.988770 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988285 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:15:28.988770 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988288 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:15:28.988770 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988291 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:15:28.988770 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988294 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:15:28.988770 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988296 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:15:28.988770 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988299 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:15:28.988770 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988301 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:15:28.988770 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988304 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:15:28.988770 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988307 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:15:28.988770 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988309 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:15:28.988770 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988312 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:15:28.988770 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988314 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:15:28.988770 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988318 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:15:28.988770 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988322 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:15:28.988770 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988325 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:15:28.988770 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988327 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:15:28.988770 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988330 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:15:28.988770 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988332 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:15:28.989197 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988335 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:15:28.989197 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988338 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:15:28.989197 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988340 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:15:28.989197 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988343 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:15:28.989197 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988346 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:15:28.989197 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988348 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:15:28.989197 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988351 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:15:28.989197 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988353 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:15:28.989197 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988355 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:15:28.989197 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988358 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:15:28.989197 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988361 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:15:28.989197 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988364 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:15:28.989197 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988366 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:15:28.989197 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988369 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:15:28.989197 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988372 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:15:28.989197 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988374 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:15:28.989197 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988377 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:15:28.989197 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988380 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:15:28.989197 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988382 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:15:28.989197 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988385 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:15:28.989737 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988388 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:15:28.989737 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988390 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:15:28.989737 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988392 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:15:28.989737 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988395 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:15:28.989737 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988397 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:15:28.989737 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988400 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:15:28.989737 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988402 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:15:28.989737 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988405 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:15:28.989737 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988407 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:15:28.989737 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988410 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:15:28.989737 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988412 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:15:28.989737 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988415 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:15:28.989737 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988417 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:15:28.989737 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988421 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:15:28.989737 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988423 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:15:28.989737 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988425 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:15:28.989737 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988428 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:15:28.989737 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988430 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:15:28.989737 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988432 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:15:28.989737 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988435 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:15:28.990188 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988437 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:15:28.990188 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988440 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:15:28.990188 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988442 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:15:28.990188 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988445 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:15:28.990188 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988448 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:15:28.990188 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988837 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:15:28.990188 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988844 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:15:28.990188 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988847 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:15:28.990188 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988849 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:15:28.990188 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988852 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:15:28.990188 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988855 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:15:28.990188 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988857 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:15:28.990188 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988860 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:15:28.990188 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988863 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:15:28.990188 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988865 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:15:28.990188 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988867 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:15:28.990188 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988870 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:15:28.990188 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988872 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:15:28.990188 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988875 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:15:28.990188 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988877 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:15:28.990648 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988880 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:15:28.990648 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988882 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:15:28.990648 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988885 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:15:28.990648 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988887 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:15:28.990648 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988890 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:15:28.990648 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988892 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:15:28.990648 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988895 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:15:28.990648 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988897 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:15:28.990648 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988900 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:15:28.990648 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988903 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:15:28.990648 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988906 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:15:28.990648 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988908 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:15:28.990648 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988911 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:15:28.990648 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988915 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:15:28.990648 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988920 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:15:28.990648 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988928 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:15:28.990648 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988931 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:15:28.990648 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988934 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:15:28.990648 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988938 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:15:28.990648 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988941 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:15:28.991127 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988943 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:15:28.991127 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988946 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:15:28.991127 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988949 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:15:28.991127 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988951 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:15:28.991127 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988954 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:15:28.991127 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988957 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:15:28.991127 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988960 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:15:28.991127 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988962 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:15:28.991127 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988965 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:15:28.991127 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988967 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:15:28.991127 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988970 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:15:28.991127 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988972 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:15:28.991127 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988975 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:15:28.991127 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988978 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:15:28.991127 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988980 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:15:28.991127 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988982 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:15:28.991127 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988985 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:15:28.991127 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988987 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:15:28.991127 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988990 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:15:28.991127 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988992 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:15:28.991611 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988995 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:15:28.991611 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988997 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:15:28.991611 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.988999 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:15:28.991611 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.989002 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:15:28.991611 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.989005 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:15:28.991611 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.989007 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:15:28.991611 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.989010 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:15:28.991611 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.989013 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:15:28.991611 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.989015 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:15:28.991611 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.989018 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:15:28.991611 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.989021 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:15:28.991611 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.989023 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:15:28.991611 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.989026 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:15:28.991611 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.989028 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:15:28.991611 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.989031 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:15:28.991611 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.989033 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:15:28.991611 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.989036 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:15:28.991611 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.989040 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:15:28.991611 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.989043 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:15:28.992052 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.989047 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:15:28.992052 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.989049 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:15:28.992052 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.989052 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:15:28.992052 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.989054 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:15:28.992052 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.989056 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:15:28.992052 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.989059 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:15:28.992052 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.989061 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:15:28.992052 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.989064 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:15:28.992052 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.989066 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:15:28.992052 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.989068 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:15:28.992052 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.989071 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:15:28.992052 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.989074 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:15:28.992052 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990252 2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:15:28.992052 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990262 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:15:28.992052 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990268 2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:15:28.992052 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990273 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:15:28.992052 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990277 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:15:28.992052 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990280 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:15:28.992052 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990285 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:15:28.992052 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990289 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:15:28.992052 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990292 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:15:28.992052 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990295 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:15:28.992578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990299 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:15:28.992578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990303 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:15:28.992578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990306 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:15:28.992578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990309 2573 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:15:28.992578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990312 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:15:28.992578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990315 2573 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:15:28.992578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990318 2573 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:15:28.992578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990320 2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:15:28.992578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990323 2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:15:28.992578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990327 2573 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:15:28.992578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990330 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:15:28.992578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990333 2573 flags.go:64] FLAG: --config-dir=""
Apr 16 18:15:28.992578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990335 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:15:28.992578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990339 2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:15:28.992578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990343 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:15:28.992578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990346 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:15:28.992578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990349 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:15:28.992578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990352 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:15:28.992578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990355 2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:15:28.992578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990358 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:15:28.992578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990375 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:15:28.992578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990379 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:15:28.992578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990382 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:15:28.992578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990387 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:15:28.992578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990390 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:15:28.993158 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990393 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:15:28.993158 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990395 2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:15:28.993158 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990398 2573 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:15:28.993158 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990401 2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:15:28.993158 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990406 2573 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:15:28.993158 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990409 2573 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:15:28.993158 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990411 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:15:28.993158 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990415 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:15:28.993158 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990419 2573 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:15:28.993158 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990423 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 18:15:28.993158 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990426 2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 18:15:28.993158 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990429 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 18:15:28.993158 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990432 2573 flags.go:64] FLAG: --eviction-soft=""
Apr 16 18:15:28.993158 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990435 2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 18:15:28.993158 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990438 2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 18:15:28.993158 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990441 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 18:15:28.993158 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990444 2573 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 18:15:28.993158 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990447 2573 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 18:15:28.993158 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990449 2573 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 18:15:28.993158 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990452 2573 flags.go:64] FLAG: --feature-gates=""
Apr 16 18:15:28.993158 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990456 2573 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 18:15:28.993158 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990459 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 18:15:28.993158 ip-10-0-130-205 
kubenswrapper[2573]: I0416 18:15:28.990462 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 18:15:28.993158 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990465 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 18:15:28.993158 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990469 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 16 18:15:28.993158 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990471 2573 flags.go:64] FLAG: --help="false" Apr 16 18:15:28.993767 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990476 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-130-205.ec2.internal" Apr 16 18:15:28.993767 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990479 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 18:15:28.993767 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990481 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 18:15:28.993767 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990484 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 18:15:28.993767 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990487 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 18:15:28.993767 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990490 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 18:15:28.993767 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990493 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 18:15:28.993767 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990496 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 18:15:28.993767 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990499 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 18:15:28.993767 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990502 2573 flags.go:64] FLAG: 
--kube-api-burst="100" Apr 16 18:15:28.993767 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990504 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 18:15:28.993767 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990507 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 18:15:28.993767 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990510 2573 flags.go:64] FLAG: --kube-reserved="" Apr 16 18:15:28.993767 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990526 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 18:15:28.993767 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990529 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 18:15:28.993767 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990532 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 18:15:28.993767 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990535 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 18:15:28.993767 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990538 2573 flags.go:64] FLAG: --lock-file="" Apr 16 18:15:28.993767 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990541 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 18:15:28.993767 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990544 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 18:15:28.993767 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990547 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 18:15:28.993767 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990556 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 18:15:28.993767 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990559 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 18:15:28.994316 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990561 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 18:15:28.994316 ip-10-0-130-205 kubenswrapper[2573]: 
I0416 18:15:28.990564 2573 flags.go:64] FLAG: --logging-format="text" Apr 16 18:15:28.994316 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990567 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 18:15:28.994316 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990570 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 18:15:28.994316 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990573 2573 flags.go:64] FLAG: --manifest-url="" Apr 16 18:15:28.994316 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990575 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 16 18:15:28.994316 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990580 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 18:15:28.994316 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990583 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 18:15:28.994316 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990587 2573 flags.go:64] FLAG: --max-pods="110" Apr 16 18:15:28.994316 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990591 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 18:15:28.994316 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990594 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 18:15:28.994316 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990596 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 18:15:28.994316 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990599 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 18:15:28.994316 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990602 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 18:15:28.994316 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990605 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 18:15:28.994316 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990608 2573 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 18:15:28.994316 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990615 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 18:15:28.994316 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990618 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 18:15:28.994316 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990621 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 18:15:28.994316 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990625 2573 flags.go:64] FLAG: --pod-cidr="" Apr 16 18:15:28.994316 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990628 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 18:15:28.994316 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990634 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 18:15:28.994316 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990636 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 18:15:28.994316 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990639 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 16 18:15:28.994869 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990642 2573 flags.go:64] FLAG: --port="10250" Apr 16 18:15:28.994869 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990646 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 18:15:28.994869 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990649 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-008e344ec8b17bd15" Apr 16 18:15:28.994869 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990652 2573 flags.go:64] FLAG: --qos-reserved="" Apr 16 18:15:28.994869 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990655 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 16 18:15:28.994869 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990658 
2573 flags.go:64] FLAG: --register-node="true" Apr 16 18:15:28.994869 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990661 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 16 18:15:28.994869 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990664 2573 flags.go:64] FLAG: --register-with-taints="" Apr 16 18:15:28.994869 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990667 2573 flags.go:64] FLAG: --registry-burst="10" Apr 16 18:15:28.994869 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990670 2573 flags.go:64] FLAG: --registry-qps="5" Apr 16 18:15:28.994869 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990673 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 16 18:15:28.994869 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990676 2573 flags.go:64] FLAG: --reserved-memory="" Apr 16 18:15:28.994869 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990679 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 18:15:28.994869 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990682 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 18:15:28.994869 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990685 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 18:15:28.994869 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990689 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 18:15:28.994869 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990691 2573 flags.go:64] FLAG: --runonce="false" Apr 16 18:15:28.994869 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990694 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 18:15:28.994869 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990697 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 18:15:28.994869 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990700 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 16 18:15:28.994869 ip-10-0-130-205 kubenswrapper[2573]: I0416 
18:15:28.990703 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 18:15:28.994869 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990706 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 18:15:28.994869 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990709 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 18:15:28.994869 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990712 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 18:15:28.994869 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990715 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 18:15:28.994869 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990718 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 18:15:28.995496 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990720 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 18:15:28.995496 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990723 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 18:15:28.995496 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990726 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 18:15:28.995496 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990729 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 18:15:28.995496 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990732 2573 flags.go:64] FLAG: --system-cgroups="" Apr 16 18:15:28.995496 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990734 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 18:15:28.995496 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990740 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 18:15:28.995496 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990742 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 16 18:15:28.995496 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990747 2573 flags.go:64] FLAG: 
--tls-cipher-suites="[]" Apr 16 18:15:28.995496 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990751 2573 flags.go:64] FLAG: --tls-min-version="" Apr 16 18:15:28.995496 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990753 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 18:15:28.995496 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990756 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 18:15:28.995496 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990759 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 18:15:28.995496 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990762 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 18:15:28.995496 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990764 2573 flags.go:64] FLAG: --v="2" Apr 16 18:15:28.995496 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990769 2573 flags.go:64] FLAG: --version="false" Apr 16 18:15:28.995496 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990773 2573 flags.go:64] FLAG: --vmodule="" Apr 16 18:15:28.995496 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990777 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 18:15:28.995496 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.990780 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 18:15:28.995496 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990889 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:15:28.995496 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990893 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:15:28.995496 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990896 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:15:28.995496 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990899 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup 
Apr 16 18:15:28.995496 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990902 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:15:28.996061 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990904 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:15:28.996061 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990907 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:15:28.996061 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990910 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:15:28.996061 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990913 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:15:28.996061 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990915 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:15:28.996061 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990918 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:15:28.996061 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990921 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:15:28.996061 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990923 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:15:28.996061 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990926 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:15:28.996061 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990928 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:15:28.996061 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990931 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:15:28.996061 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990933 2573 feature_gate.go:328] 
unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:15:28.996061 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990936 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:15:28.996061 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990938 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:15:28.996061 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990941 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:15:28.996061 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990943 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:15:28.996061 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990946 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:15:28.996061 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990949 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:15:28.996061 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990951 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:15:28.996061 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990953 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:15:28.996584 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990956 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:15:28.996584 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990958 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:15:28.996584 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990960 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:15:28.996584 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990963 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 
18:15:28.996584 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990965 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:15:28.996584 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990967 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:15:28.996584 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990970 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:15:28.996584 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990972 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:15:28.996584 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990975 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:15:28.996584 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990979 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:15:28.996584 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990982 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:15:28.996584 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990984 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:15:28.996584 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990988 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:15:28.996584 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990991 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:15:28.996584 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990993 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:15:28.996584 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990996 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:15:28.996584 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.990998 2573 feature_gate.go:328] 
unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:15:28.996584 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991001 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:15:28.996584 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991003 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:15:28.996584 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991006 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:15:28.997097 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991009 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 18:15:28.997097 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991013 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:15:28.997097 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991016 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:15:28.997097 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991018 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:15:28.997097 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991021 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:15:28.997097 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991024 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:15:28.997097 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991026 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:15:28.997097 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991030 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 18:15:28.997097 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991033 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:15:28.997097 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991036 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:15:28.997097 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991039 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:15:28.997097 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991042 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:15:28.997097 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991044 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:15:28.997097 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991047 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:15:28.997097 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991049 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:15:28.997097 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991051 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:15:28.997097 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991054 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:15:28.997097 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991056 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:15:28.997097 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991058 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:15:28.997820 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991061 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:15:28.997820 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991063 2573 feature_gate.go:328] unrecognized feature gate: 
GCPClusterHostedDNSInstall Apr 16 18:15:28.997820 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991067 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:15:28.997820 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991069 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:15:28.997820 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991072 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:15:28.997820 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991075 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:15:28.997820 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991078 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:15:28.997820 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991080 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:15:28.997820 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991083 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:15:28.997820 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991085 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:15:28.997820 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991087 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:15:28.997820 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991090 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:15:28.997820 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991092 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:15:28.997820 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991095 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:15:28.997820 ip-10-0-130-205 
kubenswrapper[2573]: W0416 18:15:28.991097 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:15:28.997820 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991100 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:15:28.997820 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991102 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:15:28.997820 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991105 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:15:28.997820 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991107 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:15:28.997820 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991110 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:15:28.998482 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991112 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:15:28.998482 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.991114 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:15:28.998482 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.991914 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:15:28.999931 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:28.999913 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 18:15:28.999970 ip-10-0-130-205 
kubenswrapper[2573]: I0416 18:15:28.999932 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 18:15:28.999998 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.999982 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:15:28.999998 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.999987 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:15:28.999998 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.999990 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:15:28.999998 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.999993 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:15:28.999998 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.999996 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:15:28.999998 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:28.999998 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:15:29.000176 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000002 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:15:29.000176 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000005 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:15:29.000176 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000008 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:15:29.000176 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000011 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:15:29.000176 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000013 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:15:29.000176 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000016 2573 
feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:15:29.000176 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000018 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:15:29.000176 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000021 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:15:29.000176 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000023 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:15:29.000176 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000026 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:15:29.000176 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000028 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:15:29.000176 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000030 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:15:29.000176 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000033 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:15:29.000176 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000036 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:15:29.000176 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000039 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:15:29.000176 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000041 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:15:29.000176 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000044 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:15:29.000176 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000047 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 
16 18:15:29.000176 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000049 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:15:29.000656 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000052 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:15:29.000656 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000054 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:15:29.000656 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000056 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:15:29.000656 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000060 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 18:15:29.000656 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000064 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:15:29.000656 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000067 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:15:29.000656 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000070 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:15:29.000656 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000073 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:15:29.000656 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000076 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:15:29.000656 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000079 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 18:15:29.000656 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000083 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:15:29.000656 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000085 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:15:29.000656 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000088 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:15:29.000656 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000090 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:15:29.000656 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000094 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:15:29.000656 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000097 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:15:29.000656 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000100 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:15:29.000656 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000102 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:15:29.000656 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000105 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:15:29.001116 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000107 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:15:29.001116 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000110 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:15:29.001116 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000112 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:15:29.001116 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000115 2573 feature_gate.go:328] 
unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:15:29.001116 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000117 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:15:29.001116 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000120 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:15:29.001116 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000122 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:15:29.001116 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000124 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:15:29.001116 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000127 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:15:29.001116 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000129 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:15:29.001116 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000132 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:15:29.001116 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000134 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:15:29.001116 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000137 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:15:29.001116 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000139 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:15:29.001116 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000141 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:15:29.001116 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000144 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:15:29.001116 ip-10-0-130-205 kubenswrapper[2573]: W0416 
18:15:29.000147 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:15:29.001116 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000149 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:15:29.001116 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000152 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:15:29.001116 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000154 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:15:29.001694 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000157 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:15:29.001694 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000159 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:15:29.001694 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000161 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:15:29.001694 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000164 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:15:29.001694 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000166 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:15:29.001694 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000168 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:15:29.001694 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000171 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:15:29.001694 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000174 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:15:29.001694 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000176 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 
16 18:15:29.001694 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000179 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:15:29.001694 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000182 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:15:29.001694 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000184 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:15:29.001694 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000188 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:15:29.001694 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000190 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:15:29.001694 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000193 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:15:29.001694 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000195 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:15:29.001694 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000198 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:15:29.001694 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000200 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:15:29.001694 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000202 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:15:29.001694 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000205 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:15:29.002194 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000207 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:15:29.002194 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000209 2573 feature_gate.go:328] 
unrecognized feature gate: SigstoreImageVerification Apr 16 18:15:29.002194 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.000214 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:15:29.002194 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000307 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:15:29.002194 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000312 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:15:29.002194 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000315 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:15:29.002194 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000317 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:15:29.002194 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000320 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:15:29.002194 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000323 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:15:29.002194 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000326 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:15:29.002194 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000328 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:15:29.002194 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000331 2573 
feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:15:29.002194 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000334 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:15:29.002194 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000337 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:15:29.002194 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000339 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:15:29.002194 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000342 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:15:29.002601 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000344 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:15:29.002601 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000347 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:15:29.002601 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000350 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:15:29.002601 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000353 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:15:29.002601 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000355 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:15:29.002601 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000357 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:15:29.002601 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000360 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:15:29.002601 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000362 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:15:29.002601 ip-10-0-130-205 
kubenswrapper[2573]: W0416 18:15:29.000365 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:15:29.002601 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000367 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:15:29.002601 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000370 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:15:29.002601 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000373 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:15:29.002601 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000375 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:15:29.002601 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000377 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:15:29.002601 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000380 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:15:29.002601 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000382 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:15:29.002601 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000385 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:15:29.002601 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000387 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:15:29.002601 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000389 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:15:29.002601 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000392 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:15:29.003073 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000394 2573 feature_gate.go:328] unrecognized 
feature gate: AlibabaPlatform Apr 16 18:15:29.003073 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000396 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:15:29.003073 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000399 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:15:29.003073 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000401 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:15:29.003073 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000404 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:15:29.003073 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000406 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:15:29.003073 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000410 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 18:15:29.003073 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000414 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:15:29.003073 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000417 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:15:29.003073 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000419 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:15:29.003073 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000421 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:15:29.003073 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000424 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:15:29.003073 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000426 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:15:29.003073 ip-10-0-130-205 
kubenswrapper[2573]: W0416 18:15:29.000429 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:15:29.003073 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000431 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:15:29.003073 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000434 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:15:29.003073 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000436 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:15:29.003073 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000439 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:15:29.003073 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000441 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:15:29.003073 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000443 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:15:29.003599 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000446 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:15:29.003599 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000448 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:15:29.003599 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000451 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:15:29.003599 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000453 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:15:29.003599 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000456 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:15:29.003599 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000458 2573 feature_gate.go:328] 
unrecognized feature gate: GatewayAPIController Apr 16 18:15:29.003599 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000461 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:15:29.003599 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000463 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:15:29.003599 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000466 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:15:29.003599 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000468 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:15:29.003599 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000471 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:15:29.003599 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000473 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:15:29.003599 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000475 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:15:29.003599 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000478 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:15:29.003599 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000480 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:15:29.003599 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000484 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 18:15:29.003599 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000488 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:15:29.003599 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000491 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:15:29.003599 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000493 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:15:29.004052 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000496 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:15:29.004052 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000499 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:15:29.004052 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000501 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:15:29.004052 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000503 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:15:29.004052 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000506 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:15:29.004052 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000508 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:15:29.004052 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000531 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:15:29.004052 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000535 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:15:29.004052 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000538 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:15:29.004052 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000541 2573 feature_gate.go:328] 
unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:15:29.004052 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000543 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:15:29.004052 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000546 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:15:29.004052 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000548 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:15:29.004052 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:29.000550 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:15:29.004052 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.000555 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:15:29.004052 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.000675 2573 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 18:15:29.004461 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.003335 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 18:15:29.004461 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.004304 2573 server.go:1019] "Starting client certificate rotation" Apr 16 18:15:29.004461 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.004415 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 
18:15:29.004613 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.004463 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 18:15:29.028894 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.028871 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 18:15:29.031559 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.031534 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 18:15:29.045867 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.045848 2573 log.go:25] "Validated CRI v1 runtime API" Apr 16 18:15:29.050938 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.050924 2573 log.go:25] "Validated CRI v1 image API" Apr 16 18:15:29.053711 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.053637 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 18:15:29.061886 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.061863 2573 fs.go:135] Filesystem UUIDs: map[179d6898-1a24-4510-b610-034e0b85eea1:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 8ca4f6da-0c4d-402c-a4b4-e01b528839ba:/dev/nvme0n1p3] Apr 16 18:15:29.061948 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.061886 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 18:15:29.062274 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.062258 2573 reflector.go:430] "Caches populated" 
logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 18:15:29.067932 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.067822 2573 manager.go:217] Machine: {Timestamp:2026-04-16 18:15:29.065824722 +0000 UTC m=+0.421966773 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3198828 MemoryCapacity:32812158976 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec20cf7845a1c03fca3fcf3943ef695f SystemUUID:ec20cf78-45a1-c03f-ca3f-cf3943ef695f BootID:d3102d3d-6341-4206-ad95-b2b02d8f9925 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406081536 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406077440 Type:vfs Inodes:4005390 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:59:4e:e9:2d:b1 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:59:4e:e9:2d:b1 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:aa:80:ba:15:f6:75 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812158976 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] 
SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 18:15:29.067932 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.067928 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 16 18:15:29.068040 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.068006 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 18:15:29.068389 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.068366 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 18:15:29.068532 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.068391 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-130-205.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 18:15:29.068572 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.068542 2573 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 18:15:29.068572 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.068549 2573 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 18:15:29.068572 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.068562 
2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:15:29.069284 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.069274 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:15:29.070010 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.070001 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:15:29.070117 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.070108 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 18:15:29.072467 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.072458 2573 kubelet.go:491] "Attempting to sync node with API server" Apr 16 18:15:29.072498 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.072471 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 18:15:29.072498 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.072483 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 18:15:29.072498 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.072492 2573 kubelet.go:397] "Adding apiserver pod source" Apr 16 18:15:29.072620 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.072501 2573 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 18:15:29.073611 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.073600 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:15:29.073646 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.073617 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:15:29.076755 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.076738 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 18:15:29.078286 ip-10-0-130-205 
kubenswrapper[2573]: I0416 18:15:29.078267 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 18:15:29.079501 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.079486 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 18:15:29.079586 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.079528 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 18:15:29.079586 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.079541 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 18:15:29.079586 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.079554 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 18:15:29.079586 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.079564 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 18:15:29.079586 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.079577 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 18:15:29.079586 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.079587 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 18:15:29.079838 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.079612 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 18:15:29.079838 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.079624 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 18:15:29.079838 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.079633 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 18:15:29.079838 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.079652 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 
18:15:29.079838 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.079666 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 18:15:29.080495 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.080484 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 18:15:29.080564 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.080499 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 18:15:29.084045 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.084030 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 18:15:29.084120 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.084073 2573 server.go:1295] "Started kubelet" Apr 16 18:15:29.084173 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.084132 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 18:15:29.084282 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.084234 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 18:15:29.084343 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.084330 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 18:15:29.084923 ip-10-0-130-205 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 18:15:29.085583 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.085534 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 18:15:29.085807 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.085784 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-205.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 18:15:29.085927 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:29.085815 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-205.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 18:15:29.086115 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:29.086021 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 18:15:29.086942 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.086926 2573 server.go:317] "Adding debug handlers to kubelet server" Apr 16 18:15:29.090469 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.090455 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 18:15:29.090570 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.090502 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 18:15:29.091249 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.091233 2573 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 18:15:29.091249 ip-10-0-130-205 
kubenswrapper[2573]: I0416 18:15:29.091250 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 18:15:29.091371 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.091276 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 18:15:29.091619 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:29.091588 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-205.ec2.internal\" not found" Apr 16 18:15:29.091933 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.091917 2573 reconstruct.go:97] "Volume reconstruction finished" Apr 16 18:15:29.091933 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.091929 2573 reconciler.go:26] "Reconciler: start to sync state" Apr 16 18:15:29.092697 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.092666 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 18:15:29.092697 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.092697 2573 factory.go:55] Registering systemd factory Apr 16 18:15:29.092815 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.092706 2573 factory.go:223] Registration of the systemd container factory successfully Apr 16 18:15:29.094053 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.094025 2573 factory.go:153] Registering CRI-O factory Apr 16 18:15:29.094053 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.094047 2573 factory.go:223] Registration of the crio container factory successfully Apr 16 18:15:29.094183 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.094074 2573 factory.go:103] Registering Raw factory Apr 16 18:15:29.094183 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.094088 2573 manager.go:1196] Started watching for new ooms in manager Apr 16 18:15:29.094391 ip-10-0-130-205 
kubenswrapper[2573]: E0416 18:15:29.094368 2573 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 18:15:29.094746 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.094730 2573 manager.go:319] Starting recovery of all containers Apr 16 18:15:29.099072 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:29.099048 2573 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-130-205.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 18:15:29.099072 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:29.099059 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 18:15:29.100211 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:29.099141 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-205.ec2.internal.18a6e9094615b108 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-205.ec2.internal,UID:ip-10-0-130-205.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-130-205.ec2.internal,},FirstTimestamp:2026-04-16 18:15:29.084043528 +0000 UTC m=+0.440185578,LastTimestamp:2026-04-16 18:15:29.084043528 +0000 UTC 
m=+0.440185578,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-205.ec2.internal,}" Apr 16 18:15:29.105381 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.105263 2573 manager.go:324] Recovery completed Apr 16 18:15:29.106572 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:29.106475 2573 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 16 18:15:29.109705 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.109692 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:15:29.111866 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.111839 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-205.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:15:29.111935 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.111877 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-205.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:15:29.111935 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.111888 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-205.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:15:29.112674 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.112657 2573 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 18:15:29.112674 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.112671 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 18:15:29.112802 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.112690 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:15:29.114392 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:29.114327 2573 event.go:359] "Server rejected event 
(will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-205.ec2.internal.18a6e90947be33ec default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-205.ec2.internal,UID:ip-10-0-130-205.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-130-205.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-130-205.ec2.internal,},FirstTimestamp:2026-04-16 18:15:29.1118643 +0000 UTC m=+0.468006352,LastTimestamp:2026-04-16 18:15:29.1118643 +0000 UTC m=+0.468006352,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-205.ec2.internal,}" Apr 16 18:15:29.115812 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.115800 2573 policy_none.go:49] "None policy: Start" Apr 16 18:15:29.115858 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.115817 2573 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 18:15:29.115858 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.115826 2573 state_mem.go:35] "Initializing new in-memory state store" Apr 16 18:15:29.125197 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.125180 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-45t99" Apr 16 18:15:29.127090 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:29.127025 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-205.ec2.internal.18a6e90947be7ba9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-205.ec2.internal,UID:ip-10-0-130-205.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-130-205.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-130-205.ec2.internal,},FirstTimestamp:2026-04-16 18:15:29.111882665 +0000 UTC m=+0.468024711,LastTimestamp:2026-04-16 18:15:29.111882665 +0000 UTC m=+0.468024711,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-205.ec2.internal,}" Apr 16 18:15:29.137544 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.137508 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-45t99" Apr 16 18:15:29.138009 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:29.137932 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-205.ec2.internal.18a6e90947be9ec0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-205.ec2.internal,UID:ip-10-0-130-205.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-130-205.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-130-205.ec2.internal,},FirstTimestamp:2026-04-16 18:15:29.111891648 +0000 UTC m=+0.468033695,LastTimestamp:2026-04-16 18:15:29.111891648 +0000 UTC m=+0.468033695,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-205.ec2.internal,}" Apr 16 18:15:29.157064 ip-10-0-130-205 kubenswrapper[2573]: I0416 
18:15:29.157041 2573 manager.go:341] "Starting Device Plugin manager" Apr 16 18:15:29.175078 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:29.157081 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 18:15:29.175078 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.157094 2573 server.go:85] "Starting device plugin registration server" Apr 16 18:15:29.175078 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.157312 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 18:15:29.175078 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.157321 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 18:15:29.175078 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.157903 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 18:15:29.175078 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.157990 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 18:15:29.175078 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.157998 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 18:15:29.175078 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:29.158075 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 18:15:29.175078 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:29.158109 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-205.ec2.internal\" not found" Apr 16 18:15:29.215354 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.215314 2573 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 16 18:15:29.216573 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.216557 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 18:15:29.216654 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.216590 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 18:15:29.216654 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.216614 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 18:15:29.216654 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.216632 2573 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 18:15:29.216753 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:29.216673 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 18:15:29.219901 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.219877 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:15:29.258162 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.258103 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:15:29.259812 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.259786 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-205.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:15:29.259895 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.259821 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-205.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:15:29.259895 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.259832 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-205.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:15:29.259895 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.259853 2573 
kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-205.ec2.internal" Apr 16 18:15:29.270812 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.270793 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-205.ec2.internal" Apr 16 18:15:29.270866 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:29.270820 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-205.ec2.internal\": node \"ip-10-0-130-205.ec2.internal\" not found" Apr 16 18:15:29.299740 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:29.299718 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-205.ec2.internal\" not found" Apr 16 18:15:29.317067 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.317044 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-205.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-205.ec2.internal"] Apr 16 18:15:29.317125 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.317106 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:15:29.317967 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.317951 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-205.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:15:29.318015 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.317980 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-205.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:15:29.318015 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.317990 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-205.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:15:29.320345 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.320333 2573 kubelet_node_status.go:413] "Setting 
node annotation to enable volume controller attach/detach" Apr 16 18:15:29.320479 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.320466 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-205.ec2.internal" Apr 16 18:15:29.320542 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.320493 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:15:29.320988 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.320974 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-205.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:15:29.321053 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.320999 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-205.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:15:29.321053 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.321008 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-205.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:15:29.321921 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.321905 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-205.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:15:29.321979 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.321933 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-205.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:15:29.321979 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.321942 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-205.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:15:29.323663 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.323647 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-205.ec2.internal" Apr 16 18:15:29.323747 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.323675 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:15:29.324306 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.324292 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-205.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:15:29.324384 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.324314 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-205.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:15:29.324384 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.324323 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-205.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:15:29.349087 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:29.349067 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-205.ec2.internal\" not found" node="ip-10-0-130-205.ec2.internal" Apr 16 18:15:29.353256 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:29.353242 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-205.ec2.internal\" not found" node="ip-10-0-130-205.ec2.internal" Apr 16 18:15:29.393861 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.393830 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7a34eecb8d6e5f391acffbd465bccedd-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-205.ec2.internal\" (UID: \"7a34eecb8d6e5f391acffbd465bccedd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-205.ec2.internal" Apr 16 18:15:29.393861 ip-10-0-130-205 
kubenswrapper[2573]: I0416 18:15:29.393866 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a34eecb8d6e5f391acffbd465bccedd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-205.ec2.internal\" (UID: \"7a34eecb8d6e5f391acffbd465bccedd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-205.ec2.internal" Apr 16 18:15:29.394009 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.393883 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f9f7c92f985c41749a276f7d4f6cddbd-config\") pod \"kube-apiserver-proxy-ip-10-0-130-205.ec2.internal\" (UID: \"f9f7c92f985c41749a276f7d4f6cddbd\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-205.ec2.internal" Apr 16 18:15:29.399935 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:29.399915 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-205.ec2.internal\" not found" Apr 16 18:15:29.494613 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.494587 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f9f7c92f985c41749a276f7d4f6cddbd-config\") pod \"kube-apiserver-proxy-ip-10-0-130-205.ec2.internal\" (UID: \"f9f7c92f985c41749a276f7d4f6cddbd\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-205.ec2.internal" Apr 16 18:15:29.494613 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.494615 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7a34eecb8d6e5f391acffbd465bccedd-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-205.ec2.internal\" (UID: \"7a34eecb8d6e5f391acffbd465bccedd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-205.ec2.internal" Apr 16 
18:15:29.494801 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.494635 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a34eecb8d6e5f391acffbd465bccedd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-205.ec2.internal\" (UID: \"7a34eecb8d6e5f391acffbd465bccedd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-205.ec2.internal" Apr 16 18:15:29.494801 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.494705 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7a34eecb8d6e5f391acffbd465bccedd-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-205.ec2.internal\" (UID: \"7a34eecb8d6e5f391acffbd465bccedd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-205.ec2.internal" Apr 16 18:15:29.494801 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.494772 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f9f7c92f985c41749a276f7d4f6cddbd-config\") pod \"kube-apiserver-proxy-ip-10-0-130-205.ec2.internal\" (UID: \"f9f7c92f985c41749a276f7d4f6cddbd\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-205.ec2.internal" Apr 16 18:15:29.494894 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.494826 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a34eecb8d6e5f391acffbd465bccedd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-205.ec2.internal\" (UID: \"7a34eecb8d6e5f391acffbd465bccedd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-205.ec2.internal" Apr 16 18:15:29.500212 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:29.500197 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-205.ec2.internal\" not 
found" Apr 16 18:15:29.600560 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:29.600480 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-205.ec2.internal\" not found" Apr 16 18:15:29.652445 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.652418 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-205.ec2.internal" Apr 16 18:15:29.655458 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:29.655441 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-205.ec2.internal" Apr 16 18:15:29.700891 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:29.700862 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-205.ec2.internal\" not found" Apr 16 18:15:29.801416 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:29.801384 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-205.ec2.internal\" not found" Apr 16 18:15:29.901921 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:29.901889 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-205.ec2.internal\" not found" Apr 16 18:15:30.002400 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:30.002373 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-205.ec2.internal\" not found" Apr 16 18:15:30.004534 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:30.004504 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 18:15:30.004668 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:30.004652 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: 
k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 18:15:30.090972 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:30.090945 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 18:15:30.102733 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:30.102715 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-205.ec2.internal\" not found" Apr 16 18:15:30.121355 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:30.121337 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 18:15:30.140095 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:30.140062 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 18:10:29 +0000 UTC" deadline="2027-10-12 10:00:07.125550145 +0000 UTC" Apr 16 18:15:30.140151 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:30.140094 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13047h44m36.985458864s" Apr 16 18:15:30.167759 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:30.167696 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:15:30.197926 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:30.197898 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-2cs2w" Apr 16 18:15:30.203067 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:30.203051 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-205.ec2.internal\" not found" Apr 16 18:15:30.209446 ip-10-0-130-205 
kubenswrapper[2573]: I0416 18:15:30.209430 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-2cs2w" Apr 16 18:15:30.232993 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:30.232974 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:15:30.291826 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:30.291796 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-205.ec2.internal" Apr 16 18:15:30.312318 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:30.312286 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:15:30.321749 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:30.321730 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:15:30.323375 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:30.323360 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-205.ec2.internal" Apr 16 18:15:30.330638 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:30.330611 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a34eecb8d6e5f391acffbd465bccedd.slice/crio-238186f8658f49937f95776314b9881ecd2cc956b4b01e8df81f9a0ef6703c32 WatchSource:0}: Error finding container 238186f8658f49937f95776314b9881ecd2cc956b4b01e8df81f9a0ef6703c32: Status 404 returned error can't find the container with id 238186f8658f49937f95776314b9881ecd2cc956b4b01e8df81f9a0ef6703c32 Apr 16 18:15:30.331074 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:30.331051 2573 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9f7c92f985c41749a276f7d4f6cddbd.slice/crio-86ee2785967ffb40e7a37be0f7bd9c3d133fd7fc7a98fdd550b7da71938572ee WatchSource:0}: Error finding container 86ee2785967ffb40e7a37be0f7bd9c3d133fd7fc7a98fdd550b7da71938572ee: Status 404 returned error can't find the container with id 86ee2785967ffb40e7a37be0f7bd9c3d133fd7fc7a98fdd550b7da71938572ee Apr 16 18:15:30.335752 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:30.335739 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:15:30.346660 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:30.346636 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:15:31.073349 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.073313 2573 apiserver.go:52] "Watching apiserver" Apr 16 18:15:31.089090 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.088735 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 18:15:31.090909 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.090880 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-130-205.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vhpkl","openshift-dns/node-resolver-jl2lc","openshift-multus/multus-additional-cni-plugins-5xfmp","openshift-network-operator/iptables-alerter-9rvl2","openshift-cluster-node-tuning-operator/tuned-69hrm","openshift-image-registry/node-ca-kd96x","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-205.ec2.internal","openshift-multus/multus-6sjm7","openshift-multus/network-metrics-daemon-j9gkk","openshift-network-diagnostics/network-check-target-4cbj8","openshift-ovn-kubernetes/ovnkube-node-slsjs","kube-system/konnectivity-agent-7v7nr"] Apr 16 18:15:31.094627 
ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.093898 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kd96x" Apr 16 18:15:31.096698 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.096109 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vhpkl" Apr 16 18:15:31.098602 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.098579 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 18:15:31.098698 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.098674 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 18:15:31.098698 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.098682 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-q9cpn\"" Apr 16 18:15:31.098698 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.098687 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-r6pjn\"" Apr 16 18:15:31.098843 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.098581 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 18:15:31.098843 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.098587 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 18:15:31.098843 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.098818 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 
18:15:31.098972 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.098926 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 18:15:31.100562 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.100543 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jl2lc" Apr 16 18:15:31.102709 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.102554 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4ae6d82-301f-44be-85ce-8d3b88e0d6e1-host\") pod \"node-ca-kd96x\" (UID: \"e4ae6d82-301f-44be-85ce-8d3b88e0d6e1\") " pod="openshift-image-registry/node-ca-kd96x" Apr 16 18:15:31.102709 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.102591 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e4ae6d82-301f-44be-85ce-8d3b88e0d6e1-serviceca\") pod \"node-ca-kd96x\" (UID: \"e4ae6d82-301f-44be-85ce-8d3b88e0d6e1\") " pod="openshift-image-registry/node-ca-kd96x" Apr 16 18:15:31.102709 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.102622 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5xfmp" Apr 16 18:15:31.102709 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.102651 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkg4f\" (UniqueName: \"kubernetes.io/projected/e4ae6d82-301f-44be-85ce-8d3b88e0d6e1-kube-api-access-mkg4f\") pod \"node-ca-kd96x\" (UID: \"e4ae6d82-301f-44be-85ce-8d3b88e0d6e1\") " pod="openshift-image-registry/node-ca-kd96x" Apr 16 18:15:31.103842 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.103824 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-9rvl2" Apr 16 18:15:31.105819 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.105791 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8n2dn\"" Apr 16 18:15:31.106351 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.106158 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 18:15:31.106351 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.106223 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 18:15:31.111529 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.111079 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.115174 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.114437 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-69hrm" Apr 16 18:15:31.115174 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.114597 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:15:31.115174 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.114895 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 18:15:31.115174 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.114935 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-72ndk\"" Apr 16 18:15:31.115174 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.115172 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 18:15:31.115448 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.115376 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 18:15:31.115763 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.115746 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 18:15:31.119324 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.118037 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cbj8" Apr 16 18:15:31.119324 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:31.118110 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4cbj8" podUID="619bc9de-2915-4bce-b443-702d489e89af" Apr 16 18:15:31.119324 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.118139 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 18:15:31.119324 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.118572 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-dpdl8\"" Apr 16 18:15:31.119324 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.118686 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 18:15:31.119324 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.118764 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 18:15:31.119324 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.118953 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 18:15:31.119324 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.118993 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 18:15:31.119324 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.119194 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-vcbcz\"" Apr 16 18:15:31.121109 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.121087 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.123957 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.123591 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9gkk" Apr 16 18:15:31.123957 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.123617 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7v7nr" Apr 16 18:15:31.123957 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:31.123670 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j9gkk" podUID="3fd8f3ce-1a67-4a38-99ec-e368aea03088" Apr 16 18:15:31.124323 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.124304 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:15:31.125529 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.125498 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 18:15:31.125881 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.125864 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 18:15:31.126055 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.125869 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-r9x9j\"" Apr 16 18:15:31.127031 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.126890 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 18:15:31.127031 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.126900 2573 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 18:15:31.127031 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.126930 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 18:15:31.127031 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.126946 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-8rdb2\"" Apr 16 18:15:31.127031 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.126955 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 18:15:31.127283 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.127261 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 18:15:31.127479 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.127463 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 18:15:31.128015 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.128000 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-slcpk\"" Apr 16 18:15:31.143471 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.142950 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-tphmr"] Apr 16 18:15:31.145931 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.145913 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-tphmr" Apr 16 18:15:31.146047 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:31.146018 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tphmr" podUID="5366b941-9d0a-4457-8229-086d574fc5ab" Apr 16 18:15:31.192694 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.192663 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 18:15:31.202929 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.202894 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3d37a2e5-3988-4400-95f4-1baaf11b42a8-tmp-dir\") pod \"node-resolver-jl2lc\" (UID: \"3d37a2e5-3988-4400-95f4-1baaf11b42a8\") " pod="openshift-dns/node-resolver-jl2lc" Apr 16 18:15:31.203064 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.202950 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-multus-conf-dir\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.203064 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.202979 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5e12a06f-d736-4229-bb5a-3066805a1732-host-slash\") pod \"iptables-alerter-9rvl2\" (UID: \"5e12a06f-d736-4229-bb5a-3066805a1732\") " pod="openshift-network-operator/iptables-alerter-9rvl2" Apr 16 18:15:31.203064 ip-10-0-130-205 
kubenswrapper[2573]: I0416 18:15:31.203008 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-sys\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm" Apr 16 18:15:31.203064 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.203048 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-var-lib-kubelet\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm" Apr 16 18:15:31.203247 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.203074 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-os-release\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.203247 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.203105 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knjrs\" (UniqueName: \"kubernetes.io/projected/3fd8f3ce-1a67-4a38-99ec-e368aea03088-kube-api-access-knjrs\") pod \"network-metrics-daemon-j9gkk\" (UID: \"3fd8f3ce-1a67-4a38-99ec-e368aea03088\") " pod="openshift-multus/network-metrics-daemon-j9gkk" Apr 16 18:15:31.203247 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.203136 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62mxk\" (UniqueName: \"kubernetes.io/projected/5e12a06f-d736-4229-bb5a-3066805a1732-kube-api-access-62mxk\") pod \"iptables-alerter-9rvl2\" (UID: 
\"5e12a06f-d736-4229-bb5a-3066805a1732\") " pod="openshift-network-operator/iptables-alerter-9rvl2" Apr 16 18:15:31.203247 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.203194 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-etc-sysconfig\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm" Apr 16 18:15:31.203454 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.203305 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5366b941-9d0a-4457-8229-086d574fc5ab-original-pull-secret\") pod \"global-pull-secret-syncer-tphmr\" (UID: \"5366b941-9d0a-4457-8229-086d574fc5ab\") " pod="kube-system/global-pull-secret-syncer-tphmr" Apr 16 18:15:31.203454 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.203339 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-ovnkube-config\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.203454 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.203369 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-lib-modules\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm" Apr 16 18:15:31.203454 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.203407 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" 
(UniqueName: \"kubernetes.io/empty-dir/bad1b589-c613-43a1-a8af-adf718b3865b-etc-tuned\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm" Apr 16 18:15:31.203454 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.203438 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5366b941-9d0a-4457-8229-086d574fc5ab-kubelet-config\") pod \"global-pull-secret-syncer-tphmr\" (UID: \"5366b941-9d0a-4457-8229-086d574fc5ab\") " pod="kube-system/global-pull-secret-syncer-tphmr" Apr 16 18:15:31.203680 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.203496 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-multus-cni-dir\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.203680 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.203602 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld489\" (UniqueName: \"kubernetes.io/projected/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-kube-api-access-ld489\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.203680 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.203653 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bad1b589-c613-43a1-a8af-adf718b3865b-tmp\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm" Apr 16 18:15:31.203810 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.203694 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e4ae6d82-301f-44be-85ce-8d3b88e0d6e1-serviceca\") pod \"node-ca-kd96x\" (UID: \"e4ae6d82-301f-44be-85ce-8d3b88e0d6e1\") " pod="openshift-image-registry/node-ca-kd96x" Apr 16 18:15:31.203810 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.203800 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/aec539b7-c282-46ac-8eff-3bb0c203088a-agent-certs\") pod \"konnectivity-agent-7v7nr\" (UID: \"aec539b7-c282-46ac-8eff-3bb0c203088a\") " pod="kube-system/konnectivity-agent-7v7nr" Apr 16 18:15:31.204707 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.204158 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6bd3acc3-3980-4bf0-8ce5-830f127ac8cd-etc-selinux\") pod \"aws-ebs-csi-driver-node-vhpkl\" (UID: \"6bd3acc3-3980-4bf0-8ce5-830f127ac8cd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vhpkl" Apr 16 18:15:31.204707 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.204196 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d3825245-f2b4-4372-9135-56f1b2145871-system-cni-dir\") pod \"multus-additional-cni-plugins-5xfmp\" (UID: \"d3825245-f2b4-4372-9135-56f1b2145871\") " pod="openshift-multus/multus-additional-cni-plugins-5xfmp" Apr 16 18:15:31.204707 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.204220 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d3825245-f2b4-4372-9135-56f1b2145871-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5xfmp\" (UID: \"d3825245-f2b4-4372-9135-56f1b2145871\") " 
pod="openshift-multus/multus-additional-cni-plugins-5xfmp" Apr 16 18:15:31.204707 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.204244 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-system-cni-dir\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.204707 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.204266 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-multus-socket-dir-parent\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.204707 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.204288 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-host-run-multus-certs\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.204707 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.204309 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-systemd-units\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.204707 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.204332 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-env-overrides\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.204707 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.204373 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6bd3acc3-3980-4bf0-8ce5-830f127ac8cd-registration-dir\") pod \"aws-ebs-csi-driver-node-vhpkl\" (UID: \"6bd3acc3-3980-4bf0-8ce5-830f127ac8cd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vhpkl" Apr 16 18:15:31.204707 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.204439 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-host-var-lib-kubelet\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.204707 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.204471 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-hostroot\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.204707 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.204484 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e4ae6d82-301f-44be-85ce-8d3b88e0d6e1-serviceca\") pod \"node-ca-kd96x\" (UID: \"e4ae6d82-301f-44be-85ce-8d3b88e0d6e1\") " pod="openshift-image-registry/node-ca-kd96x" Apr 16 18:15:31.204707 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.204492 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-host-slash\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.204707 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.204538 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-host-run-netns\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.204707 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.204559 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-node-log\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.204707 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.204580 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8n45\" (UniqueName: \"kubernetes.io/projected/6bd3acc3-3980-4bf0-8ce5-830f127ac8cd-kube-api-access-t8n45\") pod \"aws-ebs-csi-driver-node-vhpkl\" (UID: \"6bd3acc3-3980-4bf0-8ce5-830f127ac8cd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vhpkl" Apr 16 18:15:31.205334 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.204600 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-etc-kubernetes\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " 
pod="openshift-cluster-node-tuning-operator/tuned-69hrm" Apr 16 18:15:31.205334 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.204620 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-host-var-lib-cni-bin\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.205334 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.204650 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.205334 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.204708 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6bd3acc3-3980-4bf0-8ce5-830f127ac8cd-sys-fs\") pod \"aws-ebs-csi-driver-node-vhpkl\" (UID: \"6bd3acc3-3980-4bf0-8ce5-830f127ac8cd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vhpkl" Apr 16 18:15:31.205334 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.204730 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbr2g\" (UniqueName: \"kubernetes.io/projected/bad1b589-c613-43a1-a8af-adf718b3865b-kube-api-access-dbr2g\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm" Apr 16 18:15:31.205334 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.204782 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"os-release\" (UniqueName: \"kubernetes.io/host-path/d3825245-f2b4-4372-9135-56f1b2145871-os-release\") pod \"multus-additional-cni-plugins-5xfmp\" (UID: \"d3825245-f2b4-4372-9135-56f1b2145871\") " pod="openshift-multus/multus-additional-cni-plugins-5xfmp" Apr 16 18:15:31.205334 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.204813 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-host-run-netns\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.205334 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.204838 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnggx\" (UniqueName: \"kubernetes.io/projected/37634bfc-74ef-4ed7-916d-20e219934bbf-kube-api-access-cnggx\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.205334 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.204863 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-log-socket\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.205334 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.204886 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-ovn-node-metrics-cert\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.205334 ip-10-0-130-205 kubenswrapper[2573]: I0416 
18:15:31.204909 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/aec539b7-c282-46ac-8eff-3bb0c203088a-konnectivity-ca\") pod \"konnectivity-agent-7v7nr\" (UID: \"aec539b7-c282-46ac-8eff-3bb0c203088a\") " pod="kube-system/konnectivity-agent-7v7nr" Apr 16 18:15:31.205334 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.204931 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6bd3acc3-3980-4bf0-8ce5-830f127ac8cd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vhpkl\" (UID: \"6bd3acc3-3980-4bf0-8ce5-830f127ac8cd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vhpkl" Apr 16 18:15:31.205334 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.204953 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zl6m\" (UniqueName: \"kubernetes.io/projected/3d37a2e5-3988-4400-95f4-1baaf11b42a8-kube-api-access-8zl6m\") pod \"node-resolver-jl2lc\" (UID: \"3d37a2e5-3988-4400-95f4-1baaf11b42a8\") " pod="openshift-dns/node-resolver-jl2lc" Apr 16 18:15:31.205334 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.204980 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/37634bfc-74ef-4ed7-916d-20e219934bbf-cni-binary-copy\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.205334 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205011 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5e12a06f-d736-4229-bb5a-3066805a1732-iptables-alerter-script\") pod \"iptables-alerter-9rvl2\" (UID: 
\"5e12a06f-d736-4229-bb5a-3066805a1732\") " pod="openshift-network-operator/iptables-alerter-9rvl2" Apr 16 18:15:31.205334 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205033 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-etc-sysctl-d\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm" Apr 16 18:15:31.206008 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205057 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-var-lib-openvswitch\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.206008 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205083 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-run-ovn\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.206008 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205103 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5366b941-9d0a-4457-8229-086d574fc5ab-dbus\") pod \"global-pull-secret-syncer-tphmr\" (UID: \"5366b941-9d0a-4457-8229-086d574fc5ab\") " pod="kube-system/global-pull-secret-syncer-tphmr" Apr 16 18:15:31.206008 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205125 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/d3825245-f2b4-4372-9135-56f1b2145871-cnibin\") pod \"multus-additional-cni-plugins-5xfmp\" (UID: \"d3825245-f2b4-4372-9135-56f1b2145871\") " pod="openshift-multus/multus-additional-cni-plugins-5xfmp" Apr 16 18:15:31.206008 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205148 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/37634bfc-74ef-4ed7-916d-20e219934bbf-multus-daemon-config\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.206008 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205176 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mkg4f\" (UniqueName: \"kubernetes.io/projected/e4ae6d82-301f-44be-85ce-8d3b88e0d6e1-kube-api-access-mkg4f\") pod \"node-ca-kd96x\" (UID: \"e4ae6d82-301f-44be-85ce-8d3b88e0d6e1\") " pod="openshift-image-registry/node-ca-kd96x" Apr 16 18:15:31.206008 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205200 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-host-cni-netd\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.206008 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205223 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6bd3acc3-3980-4bf0-8ce5-830f127ac8cd-device-dir\") pod \"aws-ebs-csi-driver-node-vhpkl\" (UID: \"6bd3acc3-3980-4bf0-8ce5-830f127ac8cd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vhpkl" Apr 16 18:15:31.206008 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205248 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-etc-systemd\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm" Apr 16 18:15:31.206008 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205273 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwhqd\" (UniqueName: \"kubernetes.io/projected/619bc9de-2915-4bce-b443-702d489e89af-kube-api-access-jwhqd\") pod \"network-check-target-4cbj8\" (UID: \"619bc9de-2915-4bce-b443-702d489e89af\") " pod="openshift-network-diagnostics/network-check-target-4cbj8" Apr 16 18:15:31.206008 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205299 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3d37a2e5-3988-4400-95f4-1baaf11b42a8-hosts-file\") pod \"node-resolver-jl2lc\" (UID: \"3d37a2e5-3988-4400-95f4-1baaf11b42a8\") " pod="openshift-dns/node-resolver-jl2lc" Apr 16 18:15:31.206008 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205320 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-cnibin\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.206008 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205347 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-etc-kubernetes\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 
18:15:31.206008 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205429 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-etc-openvswitch\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.206008 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205454 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-host-cni-bin\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.206008 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205477 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-run\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm" Apr 16 18:15:31.206008 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205503 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-host-run-k8s-cni-cncf-io\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.206674 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205544 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-host-var-lib-cni-multus\") pod \"multus-6sjm7\" (UID: 
\"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.206674 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205567 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs\") pod \"network-metrics-daemon-j9gkk\" (UID: \"3fd8f3ce-1a67-4a38-99ec-e368aea03088\") " pod="openshift-multus/network-metrics-daemon-j9gkk" Apr 16 18:15:31.206674 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205623 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-host-kubelet\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.206674 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205647 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-run-openvswitch\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.206674 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205671 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-etc-sysctl-conf\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm" Apr 16 18:15:31.206674 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205693 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-host\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm" Apr 16 18:15:31.206674 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205717 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d3825245-f2b4-4372-9135-56f1b2145871-cni-binary-copy\") pod \"multus-additional-cni-plugins-5xfmp\" (UID: \"d3825245-f2b4-4372-9135-56f1b2145871\") " pod="openshift-multus/multus-additional-cni-plugins-5xfmp" Apr 16 18:15:31.206674 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205740 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d3825245-f2b4-4372-9135-56f1b2145871-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5xfmp\" (UID: \"d3825245-f2b4-4372-9135-56f1b2145871\") " pod="openshift-multus/multus-additional-cni-plugins-5xfmp" Apr 16 18:15:31.206674 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205777 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d3825245-f2b4-4372-9135-56f1b2145871-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5xfmp\" (UID: \"d3825245-f2b4-4372-9135-56f1b2145871\") " pod="openshift-multus/multus-additional-cni-plugins-5xfmp" Apr 16 18:15:31.206674 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205808 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7nbd\" (UniqueName: \"kubernetes.io/projected/d3825245-f2b4-4372-9135-56f1b2145871-kube-api-access-n7nbd\") pod \"multus-additional-cni-plugins-5xfmp\" (UID: \"d3825245-f2b4-4372-9135-56f1b2145871\") " 
pod="openshift-multus/multus-additional-cni-plugins-5xfmp" Apr 16 18:15:31.206674 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205838 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4ae6d82-301f-44be-85ce-8d3b88e0d6e1-host\") pod \"node-ca-kd96x\" (UID: \"e4ae6d82-301f-44be-85ce-8d3b88e0d6e1\") " pod="openshift-image-registry/node-ca-kd96x" Apr 16 18:15:31.206674 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205872 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-run-systemd\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.206674 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205897 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-host-run-ovn-kubernetes\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.206674 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205928 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4ae6d82-301f-44be-85ce-8d3b88e0d6e1-host\") pod \"node-ca-kd96x\" (UID: \"e4ae6d82-301f-44be-85ce-8d3b88e0d6e1\") " pod="openshift-image-registry/node-ca-kd96x" Apr 16 18:15:31.206674 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205954 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-ovnkube-script-lib\") pod \"ovnkube-node-slsjs\" (UID: 
\"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.206674 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.205969 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6bd3acc3-3980-4bf0-8ce5-830f127ac8cd-socket-dir\") pod \"aws-ebs-csi-driver-node-vhpkl\" (UID: \"6bd3acc3-3980-4bf0-8ce5-830f127ac8cd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vhpkl" Apr 16 18:15:31.207220 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.206006 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-etc-modprobe-d\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm" Apr 16 18:15:31.210244 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.210213 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:10:30 +0000 UTC" deadline="2027-11-02 04:12:48.91661711 +0000 UTC" Apr 16 18:15:31.210342 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.210245 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13545h57m17.706375468s" Apr 16 18:15:31.221802 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.221750 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-205.ec2.internal" event={"ID":"f9f7c92f985c41749a276f7d4f6cddbd","Type":"ContainerStarted","Data":"86ee2785967ffb40e7a37be0f7bd9c3d133fd7fc7a98fdd550b7da71938572ee"} Apr 16 18:15:31.222907 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.222886 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-205.ec2.internal" event={"ID":"7a34eecb8d6e5f391acffbd465bccedd","Type":"ContainerStarted","Data":"238186f8658f49937f95776314b9881ecd2cc956b4b01e8df81f9a0ef6703c32"} Apr 16 18:15:31.226534 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.226489 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 18:15:31.230345 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.230320 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkg4f\" (UniqueName: \"kubernetes.io/projected/e4ae6d82-301f-44be-85ce-8d3b88e0d6e1-kube-api-access-mkg4f\") pod \"node-ca-kd96x\" (UID: \"e4ae6d82-301f-44be-85ce-8d3b88e0d6e1\") " pod="openshift-image-registry/node-ca-kd96x" Apr 16 18:15:31.307197 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307162 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62mxk\" (UniqueName: \"kubernetes.io/projected/5e12a06f-d736-4229-bb5a-3066805a1732-kube-api-access-62mxk\") pod \"iptables-alerter-9rvl2\" (UID: \"5e12a06f-d736-4229-bb5a-3066805a1732\") " pod="openshift-network-operator/iptables-alerter-9rvl2" Apr 16 18:15:31.307356 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307203 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-etc-sysconfig\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm" Apr 16 18:15:31.307356 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307229 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/5366b941-9d0a-4457-8229-086d574fc5ab-original-pull-secret\") pod \"global-pull-secret-syncer-tphmr\" (UID: \"5366b941-9d0a-4457-8229-086d574fc5ab\") " pod="kube-system/global-pull-secret-syncer-tphmr" Apr 16 18:15:31.307356 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307250 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-ovnkube-config\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.307356 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307276 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-lib-modules\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm" Apr 16 18:15:31.307356 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307295 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bad1b589-c613-43a1-a8af-adf718b3865b-etc-tuned\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm" Apr 16 18:15:31.307356 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307320 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5366b941-9d0a-4457-8229-086d574fc5ab-kubelet-config\") pod \"global-pull-secret-syncer-tphmr\" (UID: \"5366b941-9d0a-4457-8229-086d574fc5ab\") " pod="kube-system/global-pull-secret-syncer-tphmr" Apr 16 18:15:31.307356 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307343 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-multus-cni-dir\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.307356 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307346 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-etc-sysconfig\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm" Apr 16 18:15:31.307765 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307365 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ld489\" (UniqueName: \"kubernetes.io/projected/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-kube-api-access-ld489\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.307765 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:31.307381 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:15:31.307765 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307406 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bad1b589-c613-43a1-a8af-adf718b3865b-tmp\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm" Apr 16 18:15:31.307765 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307437 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/aec539b7-c282-46ac-8eff-3bb0c203088a-agent-certs\") pod \"konnectivity-agent-7v7nr\" (UID: \"aec539b7-c282-46ac-8eff-3bb0c203088a\") " 
pod="kube-system/konnectivity-agent-7v7nr" Apr 16 18:15:31.307765 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307447 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-lib-modules\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm" Apr 16 18:15:31.307765 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307452 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5366b941-9d0a-4457-8229-086d574fc5ab-kubelet-config\") pod \"global-pull-secret-syncer-tphmr\" (UID: \"5366b941-9d0a-4457-8229-086d574fc5ab\") " pod="kube-system/global-pull-secret-syncer-tphmr" Apr 16 18:15:31.307765 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:31.307485 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5366b941-9d0a-4457-8229-086d574fc5ab-original-pull-secret podName:5366b941-9d0a-4457-8229-086d574fc5ab nodeName:}" failed. No retries permitted until 2026-04-16 18:15:31.807431231 +0000 UTC m=+3.163573284 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5366b941-9d0a-4457-8229-086d574fc5ab-original-pull-secret") pod "global-pull-secret-syncer-tphmr" (UID: "5366b941-9d0a-4457-8229-086d574fc5ab") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:15:31.307765 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307696 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6bd3acc3-3980-4bf0-8ce5-830f127ac8cd-etc-selinux\") pod \"aws-ebs-csi-driver-node-vhpkl\" (UID: \"6bd3acc3-3980-4bf0-8ce5-830f127ac8cd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vhpkl" Apr 16 18:15:31.307765 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307708 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-multus-cni-dir\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.307765 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307729 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d3825245-f2b4-4372-9135-56f1b2145871-system-cni-dir\") pod \"multus-additional-cni-plugins-5xfmp\" (UID: \"d3825245-f2b4-4372-9135-56f1b2145871\") " pod="openshift-multus/multus-additional-cni-plugins-5xfmp" Apr 16 18:15:31.307765 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307756 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d3825245-f2b4-4372-9135-56f1b2145871-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5xfmp\" (UID: \"d3825245-f2b4-4372-9135-56f1b2145871\") " pod="openshift-multus/multus-additional-cni-plugins-5xfmp" Apr 16 18:15:31.308242 
ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307781 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-system-cni-dir\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.308242 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307806 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6bd3acc3-3980-4bf0-8ce5-830f127ac8cd-etc-selinux\") pod \"aws-ebs-csi-driver-node-vhpkl\" (UID: \"6bd3acc3-3980-4bf0-8ce5-830f127ac8cd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vhpkl" Apr 16 18:15:31.308242 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307828 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d3825245-f2b4-4372-9135-56f1b2145871-system-cni-dir\") pod \"multus-additional-cni-plugins-5xfmp\" (UID: \"d3825245-f2b4-4372-9135-56f1b2145871\") " pod="openshift-multus/multus-additional-cni-plugins-5xfmp" Apr 16 18:15:31.308242 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307850 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-multus-socket-dir-parent\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.308242 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307807 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-multus-socket-dir-parent\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 
16 18:15:31.308242 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307858 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-system-cni-dir\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.308242 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307893 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-host-run-multus-certs\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.308242 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307919 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-systemd-units\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.308242 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307929 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-ovnkube-config\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.308242 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307945 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-env-overrides\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.308242 
ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307970 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6bd3acc3-3980-4bf0-8ce5-830f127ac8cd-registration-dir\") pod \"aws-ebs-csi-driver-node-vhpkl\" (UID: \"6bd3acc3-3980-4bf0-8ce5-830f127ac8cd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vhpkl" Apr 16 18:15:31.308242 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307979 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-host-run-multus-certs\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.308242 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.307993 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-host-var-lib-kubelet\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.308242 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308017 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-hostroot\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.308242 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308021 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-systemd-units\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.308242 ip-10-0-130-205 
kubenswrapper[2573]: I0416 18:15:31.308041 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-host-slash\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.308242 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308063 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-host-run-netns\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.308242 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308074 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-host-var-lib-kubelet\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.309030 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308086 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-node-log\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.309030 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308110 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t8n45\" (UniqueName: \"kubernetes.io/projected/6bd3acc3-3980-4bf0-8ce5-830f127ac8cd-kube-api-access-t8n45\") pod \"aws-ebs-csi-driver-node-vhpkl\" (UID: \"6bd3acc3-3980-4bf0-8ce5-830f127ac8cd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vhpkl" Apr 16 
18:15:31.309030 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308130 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6bd3acc3-3980-4bf0-8ce5-830f127ac8cd-registration-dir\") pod \"aws-ebs-csi-driver-node-vhpkl\" (UID: \"6bd3acc3-3980-4bf0-8ce5-830f127ac8cd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vhpkl" Apr 16 18:15:31.309030 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308136 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-etc-kubernetes\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm" Apr 16 18:15:31.309030 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308161 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-host-var-lib-cni-bin\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.309030 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308174 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-host-run-netns\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.309030 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308186 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-slsjs\" (UID: 
\"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.309030 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308215 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-hostroot\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.309030 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308215 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6bd3acc3-3980-4bf0-8ce5-830f127ac8cd-sys-fs\") pod \"aws-ebs-csi-driver-node-vhpkl\" (UID: \"6bd3acc3-3980-4bf0-8ce5-830f127ac8cd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vhpkl" Apr 16 18:15:31.309030 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308251 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbr2g\" (UniqueName: \"kubernetes.io/projected/bad1b589-c613-43a1-a8af-adf718b3865b-kube-api-access-dbr2g\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm" Apr 16 18:15:31.309030 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308262 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6bd3acc3-3980-4bf0-8ce5-830f127ac8cd-sys-fs\") pod \"aws-ebs-csi-driver-node-vhpkl\" (UID: \"6bd3acc3-3980-4bf0-8ce5-830f127ac8cd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vhpkl" Apr 16 18:15:31.309030 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308276 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d3825245-f2b4-4372-9135-56f1b2145871-os-release\") pod 
\"multus-additional-cni-plugins-5xfmp\" (UID: \"d3825245-f2b4-4372-9135-56f1b2145871\") " pod="openshift-multus/multus-additional-cni-plugins-5xfmp" Apr 16 18:15:31.309030 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308279 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d3825245-f2b4-4372-9135-56f1b2145871-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5xfmp\" (UID: \"d3825245-f2b4-4372-9135-56f1b2145871\") " pod="openshift-multus/multus-additional-cni-plugins-5xfmp" Apr 16 18:15:31.309030 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308306 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-host-run-netns\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.309030 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308307 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-node-log\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.309030 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308360 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-host-run-netns\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.309030 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308380 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d3825245-f2b4-4372-9135-56f1b2145871-os-release\") pod 
\"multus-additional-cni-plugins-5xfmp\" (UID: \"d3825245-f2b4-4372-9135-56f1b2145871\") " pod="openshift-multus/multus-additional-cni-plugins-5xfmp" Apr 16 18:15:31.309545 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308306 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-env-overrides\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.309545 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308394 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-host-slash\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.309545 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308427 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-etc-kubernetes\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm" Apr 16 18:15:31.309545 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308430 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnggx\" (UniqueName: \"kubernetes.io/projected/37634bfc-74ef-4ed7-916d-20e219934bbf-kube-api-access-cnggx\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.309545 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308445 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.309545 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308456 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-log-socket\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.309545 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308462 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-host-var-lib-cni-bin\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.309545 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308482 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-ovn-node-metrics-cert\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.309545 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308498 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-log-socket\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.309545 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308508 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" 
(UniqueName: \"kubernetes.io/configmap/aec539b7-c282-46ac-8eff-3bb0c203088a-konnectivity-ca\") pod \"konnectivity-agent-7v7nr\" (UID: \"aec539b7-c282-46ac-8eff-3bb0c203088a\") " pod="kube-system/konnectivity-agent-7v7nr" Apr 16 18:15:31.309545 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308555 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6bd3acc3-3980-4bf0-8ce5-830f127ac8cd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vhpkl\" (UID: \"6bd3acc3-3980-4bf0-8ce5-830f127ac8cd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vhpkl" Apr 16 18:15:31.309545 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308578 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8zl6m\" (UniqueName: \"kubernetes.io/projected/3d37a2e5-3988-4400-95f4-1baaf11b42a8-kube-api-access-8zl6m\") pod \"node-resolver-jl2lc\" (UID: \"3d37a2e5-3988-4400-95f4-1baaf11b42a8\") " pod="openshift-dns/node-resolver-jl2lc" Apr 16 18:15:31.309545 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308605 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/37634bfc-74ef-4ed7-916d-20e219934bbf-cni-binary-copy\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.309545 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308641 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5e12a06f-d736-4229-bb5a-3066805a1732-iptables-alerter-script\") pod \"iptables-alerter-9rvl2\" (UID: \"5e12a06f-d736-4229-bb5a-3066805a1732\") " pod="openshift-network-operator/iptables-alerter-9rvl2" Apr 16 18:15:31.309545 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308665 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-etc-sysctl-d\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm" Apr 16 18:15:31.309545 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308689 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-var-lib-openvswitch\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.309545 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308714 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-run-ovn\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.310337 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308735 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5366b941-9d0a-4457-8229-086d574fc5ab-dbus\") pod \"global-pull-secret-syncer-tphmr\" (UID: \"5366b941-9d0a-4457-8229-086d574fc5ab\") " pod="kube-system/global-pull-secret-syncer-tphmr" Apr 16 18:15:31.310337 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308759 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d3825245-f2b4-4372-9135-56f1b2145871-cnibin\") pod \"multus-additional-cni-plugins-5xfmp\" (UID: \"d3825245-f2b4-4372-9135-56f1b2145871\") " pod="openshift-multus/multus-additional-cni-plugins-5xfmp" Apr 16 18:15:31.310337 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308785 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/37634bfc-74ef-4ed7-916d-20e219934bbf-multus-daemon-config\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.310337 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308813 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-host-cni-netd\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.310337 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308823 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6bd3acc3-3980-4bf0-8ce5-830f127ac8cd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vhpkl\" (UID: \"6bd3acc3-3980-4bf0-8ce5-830f127ac8cd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vhpkl" Apr 16 18:15:31.310337 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308837 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6bd3acc3-3980-4bf0-8ce5-830f127ac8cd-device-dir\") pod \"aws-ebs-csi-driver-node-vhpkl\" (UID: \"6bd3acc3-3980-4bf0-8ce5-830f127ac8cd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vhpkl" Apr 16 18:15:31.310337 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308861 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-etc-systemd\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm" Apr 16 18:15:31.310337 ip-10-0-130-205 
kubenswrapper[2573]: I0416 18:15:31.308869 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-run-ovn\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.310337 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308885 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwhqd\" (UniqueName: \"kubernetes.io/projected/619bc9de-2915-4bce-b443-702d489e89af-kube-api-access-jwhqd\") pod \"network-check-target-4cbj8\" (UID: \"619bc9de-2915-4bce-b443-702d489e89af\") " pod="openshift-network-diagnostics/network-check-target-4cbj8" Apr 16 18:15:31.310337 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308910 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3d37a2e5-3988-4400-95f4-1baaf11b42a8-hosts-file\") pod \"node-resolver-jl2lc\" (UID: \"3d37a2e5-3988-4400-95f4-1baaf11b42a8\") " pod="openshift-dns/node-resolver-jl2lc" Apr 16 18:15:31.310337 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308934 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-cnibin\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.310337 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308957 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-etc-kubernetes\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.310337 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.308981 
2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-etc-openvswitch\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.310337 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.309003 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-host-cni-bin\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.310337 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.309028 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-run\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm" Apr 16 18:15:31.310337 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.309048 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/aec539b7-c282-46ac-8eff-3bb0c203088a-konnectivity-ca\") pod \"konnectivity-agent-7v7nr\" (UID: \"aec539b7-c282-46ac-8eff-3bb0c203088a\") " pod="kube-system/konnectivity-agent-7v7nr" Apr 16 18:15:31.310337 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.309053 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-host-run-k8s-cni-cncf-io\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.310337 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.309109 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-host-var-lib-cni-multus\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7" Apr 16 18:15:31.311180 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.309102 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d3825245-f2b4-4372-9135-56f1b2145871-cnibin\") pod \"multus-additional-cni-plugins-5xfmp\" (UID: \"d3825245-f2b4-4372-9135-56f1b2145871\") " pod="openshift-multus/multus-additional-cni-plugins-5xfmp" Apr 16 18:15:31.311180 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.309137 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs\") pod \"network-metrics-daemon-j9gkk\" (UID: \"3fd8f3ce-1a67-4a38-99ec-e368aea03088\") " pod="openshift-multus/network-metrics-daemon-j9gkk" Apr 16 18:15:31.311180 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.309161 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-host-kubelet\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.311180 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.309185 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-run-openvswitch\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:31.311180 ip-10-0-130-205 kubenswrapper[2573]: I0416 
18:15:31.309209 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-etc-sysctl-conf\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm"
Apr 16 18:15:31.311180 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.309232 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-host\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm"
Apr 16 18:15:31.311180 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.309237 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5366b941-9d0a-4457-8229-086d574fc5ab-dbus\") pod \"global-pull-secret-syncer-tphmr\" (UID: \"5366b941-9d0a-4457-8229-086d574fc5ab\") " pod="kube-system/global-pull-secret-syncer-tphmr"
Apr 16 18:15:31.311180 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.309262 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d3825245-f2b4-4372-9135-56f1b2145871-cni-binary-copy\") pod \"multus-additional-cni-plugins-5xfmp\" (UID: \"d3825245-f2b4-4372-9135-56f1b2145871\") " pod="openshift-multus/multus-additional-cni-plugins-5xfmp"
Apr 16 18:15:31.311180 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.309353 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/37634bfc-74ef-4ed7-916d-20e219934bbf-cni-binary-copy\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7"
Apr 16 18:15:31.311180 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.309697 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/37634bfc-74ef-4ed7-916d-20e219934bbf-multus-daemon-config\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7"
Apr 16 18:15:31.311180 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.309737 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d3825245-f2b4-4372-9135-56f1b2145871-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5xfmp\" (UID: \"d3825245-f2b4-4372-9135-56f1b2145871\") " pod="openshift-multus/multus-additional-cni-plugins-5xfmp"
Apr 16 18:15:31.311180 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.309765 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d3825245-f2b4-4372-9135-56f1b2145871-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5xfmp\" (UID: \"d3825245-f2b4-4372-9135-56f1b2145871\") " pod="openshift-multus/multus-additional-cni-plugins-5xfmp"
Apr 16 18:15:31.311180 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.309793 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7nbd\" (UniqueName: \"kubernetes.io/projected/d3825245-f2b4-4372-9135-56f1b2145871-kube-api-access-n7nbd\") pod \"multus-additional-cni-plugins-5xfmp\" (UID: \"d3825245-f2b4-4372-9135-56f1b2145871\") " pod="openshift-multus/multus-additional-cni-plugins-5xfmp"
Apr 16 18:15:31.311180 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.309824 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-run-systemd\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs"
Apr 16 18:15:31.311180 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.309840 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5e12a06f-d736-4229-bb5a-3066805a1732-iptables-alerter-script\") pod \"iptables-alerter-9rvl2\" (UID: \"5e12a06f-d736-4229-bb5a-3066805a1732\") " pod="openshift-network-operator/iptables-alerter-9rvl2"
Apr 16 18:15:31.311180 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.309851 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-host-run-ovn-kubernetes\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs"
Apr 16 18:15:31.311180 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.309895 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-ovnkube-script-lib\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs"
Apr 16 18:15:31.311846 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.309948 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6bd3acc3-3980-4bf0-8ce5-830f127ac8cd-socket-dir\") pod \"aws-ebs-csi-driver-node-vhpkl\" (UID: \"6bd3acc3-3980-4bf0-8ce5-830f127ac8cd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vhpkl"
Apr 16 18:15:31.311846 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.309966 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-etc-sysctl-d\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm"
Apr 16 18:15:31.311846 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.309975 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-etc-modprobe-d\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm"
Apr 16 18:15:31.311846 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310001 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3d37a2e5-3988-4400-95f4-1baaf11b42a8-tmp-dir\") pod \"node-resolver-jl2lc\" (UID: \"3d37a2e5-3988-4400-95f4-1baaf11b42a8\") " pod="openshift-dns/node-resolver-jl2lc"
Apr 16 18:15:31.311846 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310003 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d3825245-f2b4-4372-9135-56f1b2145871-cni-binary-copy\") pod \"multus-additional-cni-plugins-5xfmp\" (UID: \"d3825245-f2b4-4372-9135-56f1b2145871\") " pod="openshift-multus/multus-additional-cni-plugins-5xfmp"
Apr 16 18:15:31.311846 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310026 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-multus-conf-dir\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7"
Apr 16 18:15:31.311846 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310054 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5e12a06f-d736-4229-bb5a-3066805a1732-host-slash\") pod \"iptables-alerter-9rvl2\" (UID: \"5e12a06f-d736-4229-bb5a-3066805a1732\") " pod="openshift-network-operator/iptables-alerter-9rvl2"
Apr 16 18:15:31.311846 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310062 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-host-run-k8s-cni-cncf-io\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7"
Apr 16 18:15:31.311846 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310094 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5e12a06f-d736-4229-bb5a-3066805a1732-host-slash\") pod \"iptables-alerter-9rvl2\" (UID: \"5e12a06f-d736-4229-bb5a-3066805a1732\") " pod="openshift-network-operator/iptables-alerter-9rvl2"
Apr 16 18:15:31.311846 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310103 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-host-var-lib-cni-multus\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7"
Apr 16 18:15:31.311846 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310148 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-run-openvswitch\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs"
Apr 16 18:15:31.311846 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310152 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-run-systemd\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs"
Apr 16 18:15:31.311846 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310182 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-host-kubelet\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs"
Apr 16 18:15:31.311846 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310192 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-etc-kubernetes\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7"
Apr 16 18:15:31.311846 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310193 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-etc-openvswitch\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs"
Apr 16 18:15:31.311846 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:31.310264 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:15:31.311846 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310272 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-etc-sysctl-conf\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm"
Apr 16 18:15:31.311846 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310314 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-etc-modprobe-d\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm"
Apr 16 18:15:31.312661 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310318 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-host\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm"
Apr 16 18:15:31.312661 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310320 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6bd3acc3-3980-4bf0-8ce5-830f127ac8cd-socket-dir\") pod \"aws-ebs-csi-driver-node-vhpkl\" (UID: \"6bd3acc3-3980-4bf0-8ce5-830f127ac8cd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vhpkl"
Apr 16 18:15:31.312661 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:31.310322 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs podName:3fd8f3ce-1a67-4a38-99ec-e368aea03088 nodeName:}" failed. No retries permitted until 2026-04-16 18:15:31.810305818 +0000 UTC m=+3.166447855 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs") pod "network-metrics-daemon-j9gkk" (UID: "3fd8f3ce-1a67-4a38-99ec-e368aea03088") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:15:31.312661 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310010 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-var-lib-openvswitch\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs"
Apr 16 18:15:31.312661 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310358 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-host-cni-netd\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs"
Apr 16 18:15:31.312661 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310380 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-cnibin\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7"
Apr 16 18:15:31.312661 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310152 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3d37a2e5-3988-4400-95f4-1baaf11b42a8-hosts-file\") pod \"node-resolver-jl2lc\" (UID: \"3d37a2e5-3988-4400-95f4-1baaf11b42a8\") " pod="openshift-dns/node-resolver-jl2lc"
Apr 16 18:15:31.312661 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310433 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-etc-systemd\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm"
Apr 16 18:15:31.312661 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310474 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6bd3acc3-3980-4bf0-8ce5-830f127ac8cd-device-dir\") pod \"aws-ebs-csi-driver-node-vhpkl\" (UID: \"6bd3acc3-3980-4bf0-8ce5-830f127ac8cd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vhpkl"
Apr 16 18:15:31.312661 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310602 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-multus-conf-dir\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7"
Apr 16 18:15:31.312661 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310620 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3d37a2e5-3988-4400-95f4-1baaf11b42a8-tmp-dir\") pod \"node-resolver-jl2lc\" (UID: \"3d37a2e5-3988-4400-95f4-1baaf11b42a8\") " pod="openshift-dns/node-resolver-jl2lc"
Apr 16 18:15:31.312661 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310645 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-host-run-ovn-kubernetes\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs"
Apr 16 18:15:31.312661 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310688 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-host-cni-bin\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs"
Apr 16 18:15:31.312661 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310719 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-run\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm"
Apr 16 18:15:31.312661 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310750 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-sys\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm"
Apr 16 18:15:31.312661 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310777 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-var-lib-kubelet\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm"
Apr 16 18:15:31.312661 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310800 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-os-release\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7"
Apr 16 18:15:31.313446 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310829 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-knjrs\" (UniqueName: \"kubernetes.io/projected/3fd8f3ce-1a67-4a38-99ec-e368aea03088-kube-api-access-knjrs\") pod \"network-metrics-daemon-j9gkk\" (UID: \"3fd8f3ce-1a67-4a38-99ec-e368aea03088\") " pod="openshift-multus/network-metrics-daemon-j9gkk"
Apr 16 18:15:31.313446 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310841 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d3825245-f2b4-4372-9135-56f1b2145871-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5xfmp\" (UID: \"d3825245-f2b4-4372-9135-56f1b2145871\") " pod="openshift-multus/multus-additional-cni-plugins-5xfmp"
Apr 16 18:15:31.313446 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310901 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-var-lib-kubelet\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm"
Apr 16 18:15:31.313446 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310910 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bad1b589-c613-43a1-a8af-adf718b3865b-sys\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm"
Apr 16 18:15:31.313446 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310911 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/37634bfc-74ef-4ed7-916d-20e219934bbf-os-release\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7"
Apr 16 18:15:31.313446 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.310972 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-ovnkube-script-lib\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs"
Apr 16 18:15:31.313446 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.311153 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-ovn-node-metrics-cert\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs"
Apr 16 18:15:31.313446 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.311220 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d3825245-f2b4-4372-9135-56f1b2145871-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5xfmp\" (UID: \"d3825245-f2b4-4372-9135-56f1b2145871\") " pod="openshift-multus/multus-additional-cni-plugins-5xfmp"
Apr 16 18:15:31.313446 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.311499 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/aec539b7-c282-46ac-8eff-3bb0c203088a-agent-certs\") pod \"konnectivity-agent-7v7nr\" (UID: \"aec539b7-c282-46ac-8eff-3bb0c203088a\") " pod="kube-system/konnectivity-agent-7v7nr"
Apr 16 18:15:31.313446 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.312002 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bad1b589-c613-43a1-a8af-adf718b3865b-tmp\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm"
Apr 16 18:15:31.313446 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.313384 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bad1b589-c613-43a1-a8af-adf718b3865b-etc-tuned\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm"
Apr 16 18:15:31.314034 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.313848 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:15:31.323914 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.323840 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62mxk\" (UniqueName: \"kubernetes.io/projected/5e12a06f-d736-4229-bb5a-3066805a1732-kube-api-access-62mxk\") pod \"iptables-alerter-9rvl2\" (UID: \"5e12a06f-d736-4229-bb5a-3066805a1732\") " pod="openshift-network-operator/iptables-alerter-9rvl2"
Apr 16 18:15:31.326963 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:31.326942 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:15:31.327077 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:31.326970 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:15:31.327077 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:31.326984 2573 projected.go:194] Error preparing data for projected volume kube-api-access-jwhqd for pod openshift-network-diagnostics/network-check-target-4cbj8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:15:31.327077 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:31.327052 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/619bc9de-2915-4bce-b443-702d489e89af-kube-api-access-jwhqd podName:619bc9de-2915-4bce-b443-702d489e89af nodeName:}" failed. No retries permitted until 2026-04-16 18:15:31.827034514 +0000 UTC m=+3.183176568 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-jwhqd" (UniqueName: "kubernetes.io/projected/619bc9de-2915-4bce-b443-702d489e89af-kube-api-access-jwhqd") pod "network-check-target-4cbj8" (UID: "619bc9de-2915-4bce-b443-702d489e89af") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:15:31.327423 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.327404 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbr2g\" (UniqueName: \"kubernetes.io/projected/bad1b589-c613-43a1-a8af-adf718b3865b-kube-api-access-dbr2g\") pod \"tuned-69hrm\" (UID: \"bad1b589-c613-43a1-a8af-adf718b3865b\") " pod="openshift-cluster-node-tuning-operator/tuned-69hrm"
Apr 16 18:15:31.328905 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.328881 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zl6m\" (UniqueName: \"kubernetes.io/projected/3d37a2e5-3988-4400-95f4-1baaf11b42a8-kube-api-access-8zl6m\") pod \"node-resolver-jl2lc\" (UID: \"3d37a2e5-3988-4400-95f4-1baaf11b42a8\") " pod="openshift-dns/node-resolver-jl2lc"
Apr 16 18:15:31.329078 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.329055 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7nbd\" (UniqueName: \"kubernetes.io/projected/d3825245-f2b4-4372-9135-56f1b2145871-kube-api-access-n7nbd\") pod \"multus-additional-cni-plugins-5xfmp\" (UID: \"d3825245-f2b4-4372-9135-56f1b2145871\") " pod="openshift-multus/multus-additional-cni-plugins-5xfmp"
Apr 16 18:15:31.329590 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.329560 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8n45\" (UniqueName: \"kubernetes.io/projected/6bd3acc3-3980-4bf0-8ce5-830f127ac8cd-kube-api-access-t8n45\") pod \"aws-ebs-csi-driver-node-vhpkl\" (UID: \"6bd3acc3-3980-4bf0-8ce5-830f127ac8cd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vhpkl"
Apr 16 18:15:31.330579 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.330557 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld489\" (UniqueName: \"kubernetes.io/projected/2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7-kube-api-access-ld489\") pod \"ovnkube-node-slsjs\" (UID: \"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slsjs"
Apr 16 18:15:31.331059 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.331038 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnggx\" (UniqueName: \"kubernetes.io/projected/37634bfc-74ef-4ed7-916d-20e219934bbf-kube-api-access-cnggx\") pod \"multus-6sjm7\" (UID: \"37634bfc-74ef-4ed7-916d-20e219934bbf\") " pod="openshift-multus/multus-6sjm7"
Apr 16 18:15:31.331596 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.331578 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-knjrs\" (UniqueName: \"kubernetes.io/projected/3fd8f3ce-1a67-4a38-99ec-e368aea03088-kube-api-access-knjrs\") pod \"network-metrics-daemon-j9gkk\" (UID: \"3fd8f3ce-1a67-4a38-99ec-e368aea03088\") " pod="openshift-multus/network-metrics-daemon-j9gkk"
Apr 16 18:15:31.407205 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.407168 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kd96x"
Apr 16 18:15:31.414498 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.414475 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vhpkl"
Apr 16 18:15:31.429903 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.429882 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jl2lc"
Apr 16 18:15:31.434530 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.434496 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5xfmp"
Apr 16 18:15:31.442071 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.442046 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-9rvl2"
Apr 16 18:15:31.448680 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.448658 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6sjm7"
Apr 16 18:15:31.455202 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.455183 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-69hrm"
Apr 16 18:15:31.462719 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.462701 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-slsjs"
Apr 16 18:15:31.468264 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.468243 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7v7nr"
Apr 16 18:15:31.814286 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.814245 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5366b941-9d0a-4457-8229-086d574fc5ab-original-pull-secret\") pod \"global-pull-secret-syncer-tphmr\" (UID: \"5366b941-9d0a-4457-8229-086d574fc5ab\") " pod="kube-system/global-pull-secret-syncer-tphmr"
Apr 16 18:15:31.814466 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.814347 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs\") pod \"network-metrics-daemon-j9gkk\" (UID: \"3fd8f3ce-1a67-4a38-99ec-e368aea03088\") " pod="openshift-multus/network-metrics-daemon-j9gkk"
Apr 16 18:15:31.814466 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:31.814396 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:15:31.814466 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:31.814448 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:15:31.814692 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:31.814469 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5366b941-9d0a-4457-8229-086d574fc5ab-original-pull-secret podName:5366b941-9d0a-4457-8229-086d574fc5ab nodeName:}" failed. No retries permitted until 2026-04-16 18:15:32.814454689 +0000 UTC m=+4.170596726 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5366b941-9d0a-4457-8229-086d574fc5ab-original-pull-secret") pod "global-pull-secret-syncer-tphmr" (UID: "5366b941-9d0a-4457-8229-086d574fc5ab") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:15:31.814692 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:31.814499 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs podName:3fd8f3ce-1a67-4a38-99ec-e368aea03088 nodeName:}" failed. No retries permitted until 2026-04-16 18:15:32.814481936 +0000 UTC m=+4.170623973 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs") pod "network-metrics-daemon-j9gkk" (UID: "3fd8f3ce-1a67-4a38-99ec-e368aea03088") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:15:31.914945 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:31.914912 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwhqd\" (UniqueName: \"kubernetes.io/projected/619bc9de-2915-4bce-b443-702d489e89af-kube-api-access-jwhqd\") pod \"network-check-target-4cbj8\" (UID: \"619bc9de-2915-4bce-b443-702d489e89af\") " pod="openshift-network-diagnostics/network-check-target-4cbj8"
Apr 16 18:15:31.915142 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:31.915067 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:15:31.915142 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:31.915086 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:15:31.915142 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:31.915099 2573 projected.go:194] Error preparing data for projected volume kube-api-access-jwhqd for pod openshift-network-diagnostics/network-check-target-4cbj8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:15:31.915290 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:31.915151 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/619bc9de-2915-4bce-b443-702d489e89af-kube-api-access-jwhqd podName:619bc9de-2915-4bce-b443-702d489e89af nodeName:}" failed. No retries permitted until 2026-04-16 18:15:32.915137759 +0000 UTC m=+4.271279812 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-jwhqd" (UniqueName: "kubernetes.io/projected/619bc9de-2915-4bce-b443-702d489e89af-kube-api-access-jwhqd") pod "network-check-target-4cbj8" (UID: "619bc9de-2915-4bce-b443-702d489e89af") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:15:31.989336 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:31.989311 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3825245_f2b4_4372_9135_56f1b2145871.slice/crio-ee1b5e139483d003d7fbfb5c37a34ef182e380103648ef9256e4bb57dacb2049 WatchSource:0}: Error finding container ee1b5e139483d003d7fbfb5c37a34ef182e380103648ef9256e4bb57dacb2049: Status 404 returned error can't find the container with id ee1b5e139483d003d7fbfb5c37a34ef182e380103648ef9256e4bb57dacb2049
Apr 16 18:15:32.027110 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:32.027083 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37634bfc_74ef_4ed7_916d_20e219934bbf.slice/crio-f08b64491b1acf5f3b1f1a11d3ce3d0062abe39c264461e4048cd8fcc0d38d87 WatchSource:0}: Error finding container f08b64491b1acf5f3b1f1a11d3ce3d0062abe39c264461e4048cd8fcc0d38d87: Status 404 returned error can't find the container with id f08b64491b1acf5f3b1f1a11d3ce3d0062abe39c264461e4048cd8fcc0d38d87
Apr 16 18:15:32.048673 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:32.048644 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bd3acc3_3980_4bf0_8ce5_830f127ac8cd.slice/crio-5dffe69807f763d23eb3f84005d8e6bf56fb956adaf3b04c3d39ab1d27343ac8 WatchSource:0}: Error finding container 5dffe69807f763d23eb3f84005d8e6bf56fb956adaf3b04c3d39ab1d27343ac8: Status 404 returned error can't find the container with id 5dffe69807f763d23eb3f84005d8e6bf56fb956adaf3b04c3d39ab1d27343ac8
Apr 16 18:15:32.049223 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:32.049201 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbad1b589_c613_43a1_a8af_adf718b3865b.slice/crio-ebfd05402e8c1011f7286143bb96380439656d1ba5d28104bcfd2df5910fed33 WatchSource:0}: Error finding container ebfd05402e8c1011f7286143bb96380439656d1ba5d28104bcfd2df5910fed33: Status 404 returned error can't find the container with id ebfd05402e8c1011f7286143bb96380439656d1ba5d28104bcfd2df5910fed33
Apr 16 18:15:32.052212 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:32.052190 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4ae6d82_301f_44be_85ce_8d3b88e0d6e1.slice/crio-0bfd0fc94638645e8066894923d05994ab2e538416e70d1d6e4b2e781e6b0fc5 WatchSource:0}: Error finding container 0bfd0fc94638645e8066894923d05994ab2e538416e70d1d6e4b2e781e6b0fc5: Status 404 returned error can't find
the container with id 0bfd0fc94638645e8066894923d05994ab2e538416e70d1d6e4b2e781e6b0fc5 Apr 16 18:15:32.056354 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:32.056327 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d37a2e5_3988_4400_95f4_1baaf11b42a8.slice/crio-7a7dd2c6e6702fe757336eff21cc78e12e5b75741f8b61cda29e06da55a9eec1 WatchSource:0}: Error finding container 7a7dd2c6e6702fe757336eff21cc78e12e5b75741f8b61cda29e06da55a9eec1: Status 404 returned error can't find the container with id 7a7dd2c6e6702fe757336eff21cc78e12e5b75741f8b61cda29e06da55a9eec1 Apr 16 18:15:32.057555 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:15:32.057491 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e12a06f_d736_4229_bb5a_3066805a1732.slice/crio-608da079446b1a458d095d129c9e15a947b75cb819c673a8df90c0c960576fb4 WatchSource:0}: Error finding container 608da079446b1a458d095d129c9e15a947b75cb819c673a8df90c0c960576fb4: Status 404 returned error can't find the container with id 608da079446b1a458d095d129c9e15a947b75cb819c673a8df90c0c960576fb4 Apr 16 18:15:32.211168 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:32.210997 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:10:30 +0000 UTC" deadline="2028-01-20 16:17:36.119489334 +0000 UTC" Apr 16 18:15:32.211168 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:32.211162 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15454h2m3.908329733s" Apr 16 18:15:32.225211 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:32.225181 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5xfmp" 
event={"ID":"d3825245-f2b4-4372-9135-56f1b2145871","Type":"ContainerStarted","Data":"ee1b5e139483d003d7fbfb5c37a34ef182e380103648ef9256e4bb57dacb2049"} Apr 16 18:15:32.226240 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:32.226209 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" event={"ID":"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7","Type":"ContainerStarted","Data":"fde0d848868ac841285404dea3295d58f05f3c6e9be02e6afca0e3dd3e8415cb"} Apr 16 18:15:32.227104 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:32.227083 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7v7nr" event={"ID":"aec539b7-c282-46ac-8eff-3bb0c203088a","Type":"ContainerStarted","Data":"c495ff5d06b5000a76005de56209d1ab631bcff14ac6fc658b83bff8db0af94e"} Apr 16 18:15:32.227890 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:32.227870 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-9rvl2" event={"ID":"5e12a06f-d736-4229-bb5a-3066805a1732","Type":"ContainerStarted","Data":"608da079446b1a458d095d129c9e15a947b75cb819c673a8df90c0c960576fb4"} Apr 16 18:15:32.228790 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:32.228773 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kd96x" event={"ID":"e4ae6d82-301f-44be-85ce-8d3b88e0d6e1","Type":"ContainerStarted","Data":"0bfd0fc94638645e8066894923d05994ab2e538416e70d1d6e4b2e781e6b0fc5"} Apr 16 18:15:32.229589 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:32.229569 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-69hrm" event={"ID":"bad1b589-c613-43a1-a8af-adf718b3865b","Type":"ContainerStarted","Data":"ebfd05402e8c1011f7286143bb96380439656d1ba5d28104bcfd2df5910fed33"} Apr 16 18:15:32.230481 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:32.230461 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-6sjm7" event={"ID":"37634bfc-74ef-4ed7-916d-20e219934bbf","Type":"ContainerStarted","Data":"f08b64491b1acf5f3b1f1a11d3ce3d0062abe39c264461e4048cd8fcc0d38d87"} Apr 16 18:15:32.231854 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:32.231833 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-205.ec2.internal" event={"ID":"f9f7c92f985c41749a276f7d4f6cddbd","Type":"ContainerStarted","Data":"3b1730d5a50911207171d033f119abe58cd3461ea736f335b9f9d3526484fa68"} Apr 16 18:15:32.233005 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:32.232974 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jl2lc" event={"ID":"3d37a2e5-3988-4400-95f4-1baaf11b42a8","Type":"ContainerStarted","Data":"7a7dd2c6e6702fe757336eff21cc78e12e5b75741f8b61cda29e06da55a9eec1"} Apr 16 18:15:32.233869 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:32.233849 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vhpkl" event={"ID":"6bd3acc3-3980-4bf0-8ce5-830f127ac8cd","Type":"ContainerStarted","Data":"5dffe69807f763d23eb3f84005d8e6bf56fb956adaf3b04c3d39ab1d27343ac8"} Apr 16 18:15:32.249489 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:32.249453 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-205.ec2.internal" podStartSLOduration=2.249441269 podStartE2EDuration="2.249441269s" podCreationTimestamp="2026-04-16 18:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:15:32.249320733 +0000 UTC m=+3.605462790" watchObservedRunningTime="2026-04-16 18:15:32.249441269 +0000 UTC m=+3.605583325" Apr 16 18:15:32.820804 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:32.820763 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs\") pod \"network-metrics-daemon-j9gkk\" (UID: \"3fd8f3ce-1a67-4a38-99ec-e368aea03088\") " pod="openshift-multus/network-metrics-daemon-j9gkk" Apr 16 18:15:32.821055 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:32.820833 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5366b941-9d0a-4457-8229-086d574fc5ab-original-pull-secret\") pod \"global-pull-secret-syncer-tphmr\" (UID: \"5366b941-9d0a-4457-8229-086d574fc5ab\") " pod="kube-system/global-pull-secret-syncer-tphmr" Apr 16 18:15:32.821055 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:32.820974 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:15:32.821055 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:32.821035 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5366b941-9d0a-4457-8229-086d574fc5ab-original-pull-secret podName:5366b941-9d0a-4457-8229-086d574fc5ab nodeName:}" failed. No retries permitted until 2026-04-16 18:15:34.821018018 +0000 UTC m=+6.177160068 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5366b941-9d0a-4457-8229-086d574fc5ab-original-pull-secret") pod "global-pull-secret-syncer-tphmr" (UID: "5366b941-9d0a-4457-8229-086d574fc5ab") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:15:32.821426 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:32.821405 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:15:32.821496 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:32.821460 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs podName:3fd8f3ce-1a67-4a38-99ec-e368aea03088 nodeName:}" failed. No retries permitted until 2026-04-16 18:15:34.821444951 +0000 UTC m=+6.177586987 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs") pod "network-metrics-daemon-j9gkk" (UID: "3fd8f3ce-1a67-4a38-99ec-e368aea03088") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:15:32.921274 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:32.921242 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwhqd\" (UniqueName: \"kubernetes.io/projected/619bc9de-2915-4bce-b443-702d489e89af-kube-api-access-jwhqd\") pod \"network-check-target-4cbj8\" (UID: \"619bc9de-2915-4bce-b443-702d489e89af\") " pod="openshift-network-diagnostics/network-check-target-4cbj8" Apr 16 18:15:32.921447 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:32.921410 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:15:32.921447 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:32.921426 2573 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:15:32.921447 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:32.921438 2573 projected.go:194] Error preparing data for projected volume kube-api-access-jwhqd for pod openshift-network-diagnostics/network-check-target-4cbj8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:15:32.921690 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:32.921488 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/619bc9de-2915-4bce-b443-702d489e89af-kube-api-access-jwhqd podName:619bc9de-2915-4bce-b443-702d489e89af nodeName:}" failed. No retries permitted until 2026-04-16 18:15:34.921474879 +0000 UTC m=+6.277616927 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-jwhqd" (UniqueName: "kubernetes.io/projected/619bc9de-2915-4bce-b443-702d489e89af-kube-api-access-jwhqd") pod "network-check-target-4cbj8" (UID: "619bc9de-2915-4bce-b443-702d489e89af") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:15:33.217748 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:33.217663 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tphmr" Apr 16 18:15:33.218172 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:33.217798 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-tphmr" podUID="5366b941-9d0a-4457-8229-086d574fc5ab" Apr 16 18:15:33.218172 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:33.217896 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9gkk" Apr 16 18:15:33.218172 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:33.217988 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j9gkk" podUID="3fd8f3ce-1a67-4a38-99ec-e368aea03088" Apr 16 18:15:33.218172 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:33.217663 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cbj8" Apr 16 18:15:33.218172 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:33.218086 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4cbj8" podUID="619bc9de-2915-4bce-b443-702d489e89af" Apr 16 18:15:33.255998 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:33.255965 2573 generic.go:358] "Generic (PLEG): container finished" podID="7a34eecb8d6e5f391acffbd465bccedd" containerID="1f2a90034f6960be712ccaa9a97813e882613f0da57dd7fff6dbb9c0a81ae9f7" exitCode=0 Apr 16 18:15:33.256596 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:33.256571 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-205.ec2.internal" event={"ID":"7a34eecb8d6e5f391acffbd465bccedd","Type":"ContainerDied","Data":"1f2a90034f6960be712ccaa9a97813e882613f0da57dd7fff6dbb9c0a81ae9f7"} Apr 16 18:15:34.271480 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:34.270821 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-205.ec2.internal" event={"ID":"7a34eecb8d6e5f391acffbd465bccedd","Type":"ContainerStarted","Data":"27a8e46509e41e8a81b64b92770f21ec86a2794842a46e1b94e53c2ff6adcba8"} Apr 16 18:15:34.842373 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:34.841528 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs\") pod \"network-metrics-daemon-j9gkk\" (UID: \"3fd8f3ce-1a67-4a38-99ec-e368aea03088\") " pod="openshift-multus/network-metrics-daemon-j9gkk" Apr 16 18:15:34.842373 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:34.841595 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5366b941-9d0a-4457-8229-086d574fc5ab-original-pull-secret\") pod \"global-pull-secret-syncer-tphmr\" (UID: \"5366b941-9d0a-4457-8229-086d574fc5ab\") " pod="kube-system/global-pull-secret-syncer-tphmr" Apr 16 18:15:34.842373 
ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:34.841786 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:15:34.842373 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:34.841851 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5366b941-9d0a-4457-8229-086d574fc5ab-original-pull-secret podName:5366b941-9d0a-4457-8229-086d574fc5ab nodeName:}" failed. No retries permitted until 2026-04-16 18:15:38.841834124 +0000 UTC m=+10.197976162 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5366b941-9d0a-4457-8229-086d574fc5ab-original-pull-secret") pod "global-pull-secret-syncer-tphmr" (UID: "5366b941-9d0a-4457-8229-086d574fc5ab") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:15:34.842373 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:34.842271 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:15:34.842373 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:34.842325 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs podName:3fd8f3ce-1a67-4a38-99ec-e368aea03088 nodeName:}" failed. No retries permitted until 2026-04-16 18:15:38.842309005 +0000 UTC m=+10.198451055 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs") pod "network-metrics-daemon-j9gkk" (UID: "3fd8f3ce-1a67-4a38-99ec-e368aea03088") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:15:34.942983 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:34.942855 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwhqd\" (UniqueName: \"kubernetes.io/projected/619bc9de-2915-4bce-b443-702d489e89af-kube-api-access-jwhqd\") pod \"network-check-target-4cbj8\" (UID: \"619bc9de-2915-4bce-b443-702d489e89af\") " pod="openshift-network-diagnostics/network-check-target-4cbj8" Apr 16 18:15:34.943160 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:34.943044 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:15:34.943160 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:34.943069 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:15:34.943160 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:34.943082 2573 projected.go:194] Error preparing data for projected volume kube-api-access-jwhqd for pod openshift-network-diagnostics/network-check-target-4cbj8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:15:34.943160 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:34.943146 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/619bc9de-2915-4bce-b443-702d489e89af-kube-api-access-jwhqd podName:619bc9de-2915-4bce-b443-702d489e89af nodeName:}" failed. 
No retries permitted until 2026-04-16 18:15:38.943127271 +0000 UTC m=+10.299269317 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-jwhqd" (UniqueName: "kubernetes.io/projected/619bc9de-2915-4bce-b443-702d489e89af-kube-api-access-jwhqd") pod "network-check-target-4cbj8" (UID: "619bc9de-2915-4bce-b443-702d489e89af") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:15:35.217666 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:35.217631 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cbj8" Apr 16 18:15:35.217851 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:35.217640 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tphmr" Apr 16 18:15:35.217851 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:35.217749 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4cbj8" podUID="619bc9de-2915-4bce-b443-702d489e89af" Apr 16 18:15:35.217851 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:35.217814 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9gkk" Apr 16 18:15:35.217999 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:35.217913 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tphmr" podUID="5366b941-9d0a-4457-8229-086d574fc5ab" Apr 16 18:15:35.218078 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:35.218034 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j9gkk" podUID="3fd8f3ce-1a67-4a38-99ec-e368aea03088" Apr 16 18:15:37.219218 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:37.219185 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tphmr" Apr 16 18:15:37.219694 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:37.219315 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tphmr" podUID="5366b941-9d0a-4457-8229-086d574fc5ab" Apr 16 18:15:37.219848 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:37.219827 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9gkk" Apr 16 18:15:37.219950 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:37.219932 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j9gkk" podUID="3fd8f3ce-1a67-4a38-99ec-e368aea03088" Apr 16 18:15:37.220131 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:37.220010 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cbj8" Apr 16 18:15:37.220131 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:37.220082 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4cbj8" podUID="619bc9de-2915-4bce-b443-702d489e89af" Apr 16 18:15:38.877496 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:38.877451 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs\") pod \"network-metrics-daemon-j9gkk\" (UID: \"3fd8f3ce-1a67-4a38-99ec-e368aea03088\") " pod="openshift-multus/network-metrics-daemon-j9gkk" Apr 16 18:15:38.877940 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:38.877537 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5366b941-9d0a-4457-8229-086d574fc5ab-original-pull-secret\") pod \"global-pull-secret-syncer-tphmr\" (UID: \"5366b941-9d0a-4457-8229-086d574fc5ab\") " pod="kube-system/global-pull-secret-syncer-tphmr" Apr 16 18:15:38.877940 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:38.877656 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:15:38.877940 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:38.877714 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5366b941-9d0a-4457-8229-086d574fc5ab-original-pull-secret podName:5366b941-9d0a-4457-8229-086d574fc5ab nodeName:}" failed. No retries permitted until 2026-04-16 18:15:46.877700983 +0000 UTC m=+18.233843022 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5366b941-9d0a-4457-8229-086d574fc5ab-original-pull-secret") pod "global-pull-secret-syncer-tphmr" (UID: "5366b941-9d0a-4457-8229-086d574fc5ab") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:15:38.878075 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:38.878064 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:15:38.878114 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:38.878104 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs podName:3fd8f3ce-1a67-4a38-99ec-e368aea03088 nodeName:}" failed. No retries permitted until 2026-04-16 18:15:46.878092584 +0000 UTC m=+18.234234623 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs") pod "network-metrics-daemon-j9gkk" (UID: "3fd8f3ce-1a67-4a38-99ec-e368aea03088") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:15:38.978588 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:38.978012 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwhqd\" (UniqueName: \"kubernetes.io/projected/619bc9de-2915-4bce-b443-702d489e89af-kube-api-access-jwhqd\") pod \"network-check-target-4cbj8\" (UID: \"619bc9de-2915-4bce-b443-702d489e89af\") " pod="openshift-network-diagnostics/network-check-target-4cbj8"
Apr 16 18:15:38.978588 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:38.978204 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:15:38.978588 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:38.978219 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:15:38.978588 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:38.978229 2573 projected.go:194] Error preparing data for projected volume kube-api-access-jwhqd for pod openshift-network-diagnostics/network-check-target-4cbj8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:15:38.978588 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:38.978283 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/619bc9de-2915-4bce-b443-702d489e89af-kube-api-access-jwhqd podName:619bc9de-2915-4bce-b443-702d489e89af nodeName:}" failed. No retries permitted until 2026-04-16 18:15:46.978265635 +0000 UTC m=+18.334407670 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-jwhqd" (UniqueName: "kubernetes.io/projected/619bc9de-2915-4bce-b443-702d489e89af-kube-api-access-jwhqd") pod "network-check-target-4cbj8" (UID: "619bc9de-2915-4bce-b443-702d489e89af") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:15:39.218801 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:39.218718 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9gkk"
Apr 16 18:15:39.218962 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:39.218846 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j9gkk" podUID="3fd8f3ce-1a67-4a38-99ec-e368aea03088"
Apr 16 18:15:39.219050 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:39.219027 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cbj8"
Apr 16 18:15:39.219104 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:39.219087 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tphmr"
Apr 16 18:15:39.219229 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:39.219175 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tphmr" podUID="5366b941-9d0a-4457-8229-086d574fc5ab"
Apr 16 18:15:39.219291 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:39.219233 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4cbj8" podUID="619bc9de-2915-4bce-b443-702d489e89af"
Apr 16 18:15:41.217360 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:41.217320 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cbj8"
Apr 16 18:15:41.217360 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:41.217349 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9gkk"
Apr 16 18:15:41.217931 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:41.217453 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4cbj8" podUID="619bc9de-2915-4bce-b443-702d489e89af"
Apr 16 18:15:41.217931 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:41.217476 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tphmr"
Apr 16 18:15:41.217931 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:41.217664 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tphmr" podUID="5366b941-9d0a-4457-8229-086d574fc5ab"
Apr 16 18:15:41.217931 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:41.217756 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j9gkk" podUID="3fd8f3ce-1a67-4a38-99ec-e368aea03088"
Apr 16 18:15:43.217531 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:43.217485 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tphmr"
Apr 16 18:15:43.217531 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:43.217532 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9gkk"
Apr 16 18:15:43.218033 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:43.217621 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tphmr" podUID="5366b941-9d0a-4457-8229-086d574fc5ab"
Apr 16 18:15:43.218033 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:43.217673 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j9gkk" podUID="3fd8f3ce-1a67-4a38-99ec-e368aea03088"
Apr 16 18:15:43.218033 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:43.217736 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cbj8"
Apr 16 18:15:43.218033 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:43.217839 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4cbj8" podUID="619bc9de-2915-4bce-b443-702d489e89af"
Apr 16 18:15:45.217565 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:45.217460 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cbj8"
Apr 16 18:15:45.218028 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:45.217600 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4cbj8" podUID="619bc9de-2915-4bce-b443-702d489e89af"
Apr 16 18:15:45.218028 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:45.217653 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9gkk"
Apr 16 18:15:45.218028 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:45.217770 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j9gkk" podUID="3fd8f3ce-1a67-4a38-99ec-e368aea03088"
Apr 16 18:15:45.218028 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:45.217801 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tphmr"
Apr 16 18:15:45.218028 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:45.217873 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tphmr" podUID="5366b941-9d0a-4457-8229-086d574fc5ab"
Apr 16 18:15:46.937754 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:46.937710 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs\") pod \"network-metrics-daemon-j9gkk\" (UID: \"3fd8f3ce-1a67-4a38-99ec-e368aea03088\") " pod="openshift-multus/network-metrics-daemon-j9gkk"
Apr 16 18:15:46.938234 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:46.937777 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5366b941-9d0a-4457-8229-086d574fc5ab-original-pull-secret\") pod \"global-pull-secret-syncer-tphmr\" (UID: \"5366b941-9d0a-4457-8229-086d574fc5ab\") " pod="kube-system/global-pull-secret-syncer-tphmr"
Apr 16 18:15:46.938234 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:46.937845 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:15:46.938234 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:46.937922 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:15:46.938234 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:46.937929 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs podName:3fd8f3ce-1a67-4a38-99ec-e368aea03088 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:02.937906395 +0000 UTC m=+34.294048433 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs") pod "network-metrics-daemon-j9gkk" (UID: "3fd8f3ce-1a67-4a38-99ec-e368aea03088") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:15:46.938234 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:46.937988 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5366b941-9d0a-4457-8229-086d574fc5ab-original-pull-secret podName:5366b941-9d0a-4457-8229-086d574fc5ab nodeName:}" failed. No retries permitted until 2026-04-16 18:16:02.937971027 +0000 UTC m=+34.294113061 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5366b941-9d0a-4457-8229-086d574fc5ab-original-pull-secret") pod "global-pull-secret-syncer-tphmr" (UID: "5366b941-9d0a-4457-8229-086d574fc5ab") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:15:47.039119 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:47.039082 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwhqd\" (UniqueName: \"kubernetes.io/projected/619bc9de-2915-4bce-b443-702d489e89af-kube-api-access-jwhqd\") pod \"network-check-target-4cbj8\" (UID: \"619bc9de-2915-4bce-b443-702d489e89af\") " pod="openshift-network-diagnostics/network-check-target-4cbj8"
Apr 16 18:15:47.039271 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:47.039253 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:15:47.039335 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:47.039277 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:15:47.039335 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:47.039290 2573 projected.go:194] Error preparing data for projected volume kube-api-access-jwhqd for pod openshift-network-diagnostics/network-check-target-4cbj8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:15:47.039403 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:47.039349 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/619bc9de-2915-4bce-b443-702d489e89af-kube-api-access-jwhqd podName:619bc9de-2915-4bce-b443-702d489e89af nodeName:}" failed. No retries permitted until 2026-04-16 18:16:03.039332651 +0000 UTC m=+34.395474707 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-jwhqd" (UniqueName: "kubernetes.io/projected/619bc9de-2915-4bce-b443-702d489e89af-kube-api-access-jwhqd") pod "network-check-target-4cbj8" (UID: "619bc9de-2915-4bce-b443-702d489e89af") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:15:47.217886 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:47.217806 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9gkk"
Apr 16 18:15:47.218043 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:47.217806 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tphmr"
Apr 16 18:15:47.218043 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:47.217919 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j9gkk" podUID="3fd8f3ce-1a67-4a38-99ec-e368aea03088"
Apr 16 18:15:47.218147 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:47.218052 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tphmr" podUID="5366b941-9d0a-4457-8229-086d574fc5ab"
Apr 16 18:15:47.218147 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:47.217806 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cbj8"
Apr 16 18:15:47.218242 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:47.218147 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4cbj8" podUID="619bc9de-2915-4bce-b443-702d489e89af"
Apr 16 18:15:49.219678 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:49.218757 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cbj8"
Apr 16 18:15:49.219678 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:49.218834 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9gkk"
Apr 16 18:15:49.219678 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:49.218869 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tphmr"
Apr 16 18:15:49.219678 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:49.218863 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4cbj8" podUID="619bc9de-2915-4bce-b443-702d489e89af"
Apr 16 18:15:49.219678 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:49.218946 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tphmr" podUID="5366b941-9d0a-4457-8229-086d574fc5ab"
Apr 16 18:15:49.219678 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:49.219043 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j9gkk" podUID="3fd8f3ce-1a67-4a38-99ec-e368aea03088"
Apr 16 18:15:50.290599 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:50.290384 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 18:15:50.302239 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:50.302211 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jl2lc" event={"ID":"3d37a2e5-3988-4400-95f4-1baaf11b42a8","Type":"ContainerStarted","Data":"c2d73a078a199b0d6fbe1713fcbf032e004eccb59f608ff9d2a1f3042c8a2de3"}
Apr 16 18:15:50.303652 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:50.303633 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vhpkl" event={"ID":"6bd3acc3-3980-4bf0-8ce5-830f127ac8cd","Type":"ContainerStarted","Data":"5ee453722895eb44f9a2a621b896905fd140b5480e6d99a12add47531b795503"}
Apr 16 18:15:50.303734 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:50.303658 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vhpkl" event={"ID":"6bd3acc3-3980-4bf0-8ce5-830f127ac8cd","Type":"ContainerStarted","Data":"ebd3200b3c93c01d27e5dbae267f0e2baf7aed53408f4ddce616c5d7a5d8f86c"}
Apr 16 18:15:50.304812 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:50.304789 2573 generic.go:358] "Generic (PLEG): container finished" podID="d3825245-f2b4-4372-9135-56f1b2145871" containerID="3b86bfa874d16095f84b5301cdd1480310e85d0dcb5dca9e9be9ad89092421ef" exitCode=0
Apr 16 18:15:50.304877 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:50.304849 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5xfmp" event={"ID":"d3825245-f2b4-4372-9135-56f1b2145871","Type":"ContainerDied","Data":"3b86bfa874d16095f84b5301cdd1480310e85d0dcb5dca9e9be9ad89092421ef"}
Apr 16 18:15:50.307556 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:50.307510 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" event={"ID":"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7","Type":"ContainerStarted","Data":"dbfc75a65bb57a4678122e26cfff60900d018524f298601d9293da2b3a1f1a19"}
Apr 16 18:15:50.307626 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:50.307560 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" event={"ID":"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7","Type":"ContainerStarted","Data":"67c864132ed20513f15a74b1c317686fb7a3e86d504251452a6d848430dda69c"}
Apr 16 18:15:50.307626 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:50.307573 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" event={"ID":"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7","Type":"ContainerStarted","Data":"4d676c7f6e54887ff6051189bd5ca3fe7a3402e6c23b36376fbf1d142a1bc760"}
Apr 16 18:15:50.307626 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:50.307585 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" event={"ID":"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7","Type":"ContainerStarted","Data":"2b09dc0997d072a70229bad0d93c4473c705f07f8431b913e97ebe9755c74d40"}
Apr 16 18:15:50.307626 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:50.307596 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" event={"ID":"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7","Type":"ContainerStarted","Data":"1d1becf92a2477ce842d21f79fc29b76945fb64e609eca46d5a7df5029da7302"}
Apr 16 18:15:50.307626 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:50.307608 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" event={"ID":"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7","Type":"ContainerStarted","Data":"188b4fe12c1b7e2ace57843e86519ef2f188ead740acae6be6e4b984a3ee488c"}
Apr 16 18:15:50.308693 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:50.308670 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7v7nr" event={"ID":"aec539b7-c282-46ac-8eff-3bb0c203088a","Type":"ContainerStarted","Data":"b32ffa2b0935fdba0abfb6eb24d87a9377354c6c0e7b8e9cd8d73bae288d0b9b"}
Apr 16 18:15:50.309753 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:50.309732 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kd96x" event={"ID":"e4ae6d82-301f-44be-85ce-8d3b88e0d6e1","Type":"ContainerStarted","Data":"db013e746016f36d6cee4641fa76350fcc97d9aa9eac457cbad443d6589a3778"}
Apr 16 18:15:50.310948 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:50.310933 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-69hrm" event={"ID":"bad1b589-c613-43a1-a8af-adf718b3865b","Type":"ContainerStarted","Data":"ad1a30be669839ec194b0bda5cd074a32dac25cca03472328ea2ace0c72610d1"}
Apr 16 18:15:50.312109 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:50.312090 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6sjm7" event={"ID":"37634bfc-74ef-4ed7-916d-20e219934bbf","Type":"ContainerStarted","Data":"1031402717d5d40a7e1ee58c2ac2a54b9ea4cec5db0fc53867039d3cbfe68a44"}
Apr 16 18:15:50.315048 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:50.315014 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-205.ec2.internal" podStartSLOduration=20.31500534 podStartE2EDuration="20.31500534s" podCreationTimestamp="2026-04-16 18:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:15:34.287338601 +0000 UTC m=+5.643480657" watchObservedRunningTime="2026-04-16 18:15:50.31500534 +0000 UTC m=+21.671147395"
Apr 16 18:15:50.315403 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:50.315385 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-jl2lc" podStartSLOduration=4.202389544 podStartE2EDuration="21.315381008s" podCreationTimestamp="2026-04-16 18:15:29 +0000 UTC" firstStartedPulling="2026-04-16 18:15:32.05809275 +0000 UTC m=+3.414234790" lastFinishedPulling="2026-04-16 18:15:49.171084216 +0000 UTC m=+20.527226254" observedRunningTime="2026-04-16 18:15:50.314925782 +0000 UTC m=+21.671067838" watchObservedRunningTime="2026-04-16 18:15:50.315381008 +0000 UTC m=+21.671523063"
Apr 16 18:15:50.328228 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:50.328197 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-6sjm7" podStartSLOduration=3.806026319 podStartE2EDuration="21.328187164s" podCreationTimestamp="2026-04-16 18:15:29 +0000 UTC" firstStartedPulling="2026-04-16 18:15:32.047995919 +0000 UTC m=+3.404137966" lastFinishedPulling="2026-04-16 18:15:49.570156771 +0000 UTC m=+20.926298811" observedRunningTime="2026-04-16 18:15:50.328080482 +0000 UTC m=+21.684222537" watchObservedRunningTime="2026-04-16 18:15:50.328187164 +0000 UTC m=+21.684329220"
Apr 16 18:15:50.340801 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:50.340766 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kd96x" podStartSLOduration=8.489851937 podStartE2EDuration="21.340755869s" podCreationTimestamp="2026-04-16 18:15:29 +0000 UTC" firstStartedPulling="2026-04-16 18:15:32.054675063 +0000 UTC m=+3.410817097" lastFinishedPulling="2026-04-16 18:15:44.905578993 +0000 UTC m=+16.261721029" observedRunningTime="2026-04-16 18:15:50.340431528 +0000 UTC m=+21.696573607" watchObservedRunningTime="2026-04-16 18:15:50.340755869 +0000 UTC m=+21.696897925"
Apr 16 18:15:50.358735 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:50.358702 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-69hrm" podStartSLOduration=4.2422221669999995 podStartE2EDuration="21.35869294s" podCreationTimestamp="2026-04-16 18:15:29 +0000 UTC" firstStartedPulling="2026-04-16 18:15:32.051105343 +0000 UTC m=+3.407247377" lastFinishedPulling="2026-04-16 18:15:49.167576114 +0000 UTC m=+20.523718150" observedRunningTime="2026-04-16 18:15:50.358086723 +0000 UTC m=+21.714228780" watchObservedRunningTime="2026-04-16 18:15:50.35869294 +0000 UTC m=+21.714834996"
Apr 16 18:15:50.393603 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:50.393553 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-7v7nr" podStartSLOduration=4.289354976 podStartE2EDuration="21.393538427s" podCreationTimestamp="2026-04-16 18:15:29 +0000 UTC" firstStartedPulling="2026-04-16 18:15:32.061601797 +0000 UTC m=+3.417743831" lastFinishedPulling="2026-04-16 18:15:49.165785238 +0000 UTC m=+20.521927282" observedRunningTime="2026-04-16 18:15:50.371115742 +0000 UTC m=+21.727257797" watchObservedRunningTime="2026-04-16 18:15:50.393538427 +0000 UTC m=+21.749680473"
Apr 16 18:15:51.167001 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:51.166908 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:15:50.290595933Z","UUID":"ecb2409a-4895-4f5a-a5ee-fa87d17a6d02","Handler":null,"Name":"","Endpoint":""}
Apr 16 18:15:51.169232 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:51.169210 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 18:15:51.169331 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:51.169241 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 18:15:51.217177 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:51.217143 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9gkk"
Apr 16 18:15:51.217346 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:51.217272 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j9gkk" podUID="3fd8f3ce-1a67-4a38-99ec-e368aea03088"
Apr 16 18:15:51.217689 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:51.217668 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cbj8"
Apr 16 18:15:51.217781 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:51.217762 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4cbj8" podUID="619bc9de-2915-4bce-b443-702d489e89af"
Apr 16 18:15:51.217862 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:51.217847 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tphmr"
Apr 16 18:15:51.217943 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:51.217926 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tphmr" podUID="5366b941-9d0a-4457-8229-086d574fc5ab"
Apr 16 18:15:51.315992 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:51.315953 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vhpkl" event={"ID":"6bd3acc3-3980-4bf0-8ce5-830f127ac8cd","Type":"ContainerStarted","Data":"13a4543d207df26b3d372bbf86eb0594f35b05a833d10737a1ea6198dfd435b6"}
Apr 16 18:15:51.317235 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:51.317207 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-9rvl2" event={"ID":"5e12a06f-d736-4229-bb5a-3066805a1732","Type":"ContainerStarted","Data":"a59c232081772444d82c726feac52e9d719d10dfa60820847c0b109e728f6229"}
Apr 16 18:15:51.330157 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:51.330108 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vhpkl" podStartSLOduration=3.23802015 podStartE2EDuration="22.330093117s" podCreationTimestamp="2026-04-16 18:15:29 +0000 UTC" firstStartedPulling="2026-04-16 18:15:32.051189107 +0000 UTC m=+3.407331147" lastFinishedPulling="2026-04-16 18:15:51.143262066 +0000 UTC m=+22.499404114" observedRunningTime="2026-04-16 18:15:51.329922268 +0000 UTC m=+22.686064324" watchObservedRunningTime="2026-04-16 18:15:51.330093117 +0000 UTC m=+22.686235173"
Apr 16 18:15:51.342719 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:51.342674 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-9rvl2" podStartSLOduration=5.236482367 podStartE2EDuration="22.342660838s" podCreationTimestamp="2026-04-16 18:15:29 +0000 UTC" firstStartedPulling="2026-04-16 18:15:32.059819198 +0000 UTC m=+3.415961233" lastFinishedPulling="2026-04-16 18:15:49.165997655 +0000 UTC m=+20.522139704" observedRunningTime="2026-04-16 18:15:51.342483316 +0000 UTC m=+22.698625375" watchObservedRunningTime="2026-04-16 18:15:51.342660838 +0000 UTC m=+22.698802893"
Apr 16 18:15:52.322829 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:52.322733 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" event={"ID":"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7","Type":"ContainerStarted","Data":"6c5e972ce3e6ba2eb71acd81abb020603814f61cf0a00510e848576b65268e42"}
Apr 16 18:15:52.423037 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:52.423002 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-7v7nr"
Apr 16 18:15:52.423697 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:52.423673 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-7v7nr"
Apr 16 18:15:53.217970 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:53.217698 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tphmr"
Apr 16 18:15:53.218160 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:53.217742 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cbj8"
Apr 16 18:15:53.218160 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:53.218074 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tphmr" podUID="5366b941-9d0a-4457-8229-086d574fc5ab"
Apr 16 18:15:53.218160 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:53.218150 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4cbj8" podUID="619bc9de-2915-4bce-b443-702d489e89af"
Apr 16 18:15:53.218325 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:53.217793 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9gkk"
Apr 16 18:15:53.218325 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:53.218238 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j9gkk" podUID="3fd8f3ce-1a67-4a38-99ec-e368aea03088"
Apr 16 18:15:53.325324 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:53.325297 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-7v7nr"
Apr 16 18:15:53.325878 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:53.325856 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-7v7nr"
Apr 16 18:15:55.217686 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:55.217658 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9gkk"
Apr 16 18:15:55.218548 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:55.217657 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cbj8"
Apr 16 18:15:55.218548 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:55.217790 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j9gkk" podUID="3fd8f3ce-1a67-4a38-99ec-e368aea03088"
Apr 16 18:15:55.218548 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:55.217838 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4cbj8" podUID="619bc9de-2915-4bce-b443-702d489e89af"
Apr 16 18:15:55.218548 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:55.217665 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tphmr"
Apr 16 18:15:55.218548 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:55.217896 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tphmr" podUID="5366b941-9d0a-4457-8229-086d574fc5ab"
Apr 16 18:15:55.330188 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:55.330154 2573 generic.go:358] "Generic (PLEG): container finished" podID="d3825245-f2b4-4372-9135-56f1b2145871" containerID="0c8a22458fc221d8b14806443e551fdbf7c430862794fa3dca0534d474be72c3" exitCode=0
Apr 16 18:15:55.330370 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:55.330233 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5xfmp" event={"ID":"d3825245-f2b4-4372-9135-56f1b2145871","Type":"ContainerDied","Data":"0c8a22458fc221d8b14806443e551fdbf7c430862794fa3dca0534d474be72c3"}
Apr 16 18:15:55.333324 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:55.333299 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" event={"ID":"2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7","Type":"ContainerStarted","Data":"fe6cfafae01db51fffc861118c997e945ff5a37c66f0b3525a47467f67023380"}
Apr 16 18:15:55.333762 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:55.333741 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-slsjs"
Apr 16 18:15:55.333861
ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:55.333767 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:55.333861 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:55.333777 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:55.349368 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:55.349345 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:55.349477 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:55.349409 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:15:55.391145 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:55.391098 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" podStartSLOduration=9.095628363 podStartE2EDuration="26.391084053s" podCreationTimestamp="2026-04-16 18:15:29 +0000 UTC" firstStartedPulling="2026-04-16 18:15:32.061902619 +0000 UTC m=+3.418044656" lastFinishedPulling="2026-04-16 18:15:49.357358311 +0000 UTC m=+20.713500346" observedRunningTime="2026-04-16 18:15:55.390302331 +0000 UTC m=+26.746444411" watchObservedRunningTime="2026-04-16 18:15:55.391084053 +0000 UTC m=+26.747226107" Apr 16 18:15:56.305585 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:56.305495 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4cbj8"] Apr 16 18:15:56.305918 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:56.305653 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cbj8" Apr 16 18:15:56.305918 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:56.305767 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4cbj8" podUID="619bc9de-2915-4bce-b443-702d489e89af" Apr 16 18:15:56.309913 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:56.309887 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-tphmr"] Apr 16 18:15:56.310036 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:56.310003 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tphmr" Apr 16 18:15:56.310124 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:56.310105 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tphmr" podUID="5366b941-9d0a-4457-8229-086d574fc5ab" Apr 16 18:15:56.310505 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:56.310483 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-j9gkk"] Apr 16 18:15:56.310632 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:56.310613 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9gkk" Apr 16 18:15:56.310740 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:56.310721 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j9gkk" podUID="3fd8f3ce-1a67-4a38-99ec-e368aea03088" Apr 16 18:15:56.337771 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:56.337743 2573 generic.go:358] "Generic (PLEG): container finished" podID="d3825245-f2b4-4372-9135-56f1b2145871" containerID="86c15872c79c882f7fe3aaa5c52b65fdcc1bdf71e6a8bbb3246e4412b6b2a42e" exitCode=0 Apr 16 18:15:56.337916 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:56.337830 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5xfmp" event={"ID":"d3825245-f2b4-4372-9135-56f1b2145871","Type":"ContainerDied","Data":"86c15872c79c882f7fe3aaa5c52b65fdcc1bdf71e6a8bbb3246e4412b6b2a42e"} Apr 16 18:15:57.341981 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:57.341950 2573 generic.go:358] "Generic (PLEG): container finished" podID="d3825245-f2b4-4372-9135-56f1b2145871" containerID="6f56a370682a7d73114b53416ad3aab48ba8cff61c22b6966d1d66f0fcc6089b" exitCode=0 Apr 16 18:15:57.342407 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:57.342030 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5xfmp" event={"ID":"d3825245-f2b4-4372-9135-56f1b2145871","Type":"ContainerDied","Data":"6f56a370682a7d73114b53416ad3aab48ba8cff61c22b6966d1d66f0fcc6089b"} Apr 16 18:15:58.217065 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:58.217039 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cbj8" Apr 16 18:15:58.217172 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:58.217071 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9gkk" Apr 16 18:15:58.217172 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:15:58.217071 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tphmr" Apr 16 18:15:58.217273 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:58.217161 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4cbj8" podUID="619bc9de-2915-4bce-b443-702d489e89af" Apr 16 18:15:58.217273 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:58.217224 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j9gkk" podUID="3fd8f3ce-1a67-4a38-99ec-e368aea03088" Apr 16 18:15:58.217357 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:15:58.217288 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-tphmr" podUID="5366b941-9d0a-4457-8229-086d574fc5ab" Apr 16 18:16:00.216944 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:00.216915 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tphmr" Apr 16 18:16:00.217391 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:00.216915 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9gkk" Apr 16 18:16:00.217391 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:00.217034 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tphmr" podUID="5366b941-9d0a-4457-8229-086d574fc5ab" Apr 16 18:16:00.217391 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:00.217056 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cbj8" Apr 16 18:16:00.217391 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:00.217204 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j9gkk" podUID="3fd8f3ce-1a67-4a38-99ec-e368aea03088" Apr 16 18:16:00.217391 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:00.217266 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4cbj8" podUID="619bc9de-2915-4bce-b443-702d489e89af" Apr 16 18:16:02.217529 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.217316 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9gkk" Apr 16 18:16:02.217955 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.217314 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cbj8" Apr 16 18:16:02.217955 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:02.217630 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j9gkk" podUID="3fd8f3ce-1a67-4a38-99ec-e368aea03088" Apr 16 18:16:02.217955 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.217342 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-tphmr" Apr 16 18:16:02.217955 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:02.217691 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4cbj8" podUID="619bc9de-2915-4bce-b443-702d489e89af" Apr 16 18:16:02.217955 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:02.217769 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tphmr" podUID="5366b941-9d0a-4457-8229-086d574fc5ab" Apr 16 18:16:02.418806 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.418775 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-205.ec2.internal" event="NodeReady" Apr 16 18:16:02.418978 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.418911 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 18:16:02.462870 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.462835 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-h9xxg"] Apr 16 18:16:02.489572 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.489361 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7fff8cc5c4-dk27h"] Apr 16 18:16:02.489817 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.489728 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-h9xxg" Apr 16 18:16:02.492281 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.492261 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 18:16:02.492527 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.492497 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-97dhh\"" Apr 16 18:16:02.492787 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.492757 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 18:16:02.501669 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.501650 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-h9xxg"] Apr 16 18:16:02.501669 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.501672 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-bndvz"] Apr 16 18:16:02.501823 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.501782 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:16:02.504360 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.504341 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 18:16:02.504452 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.504369 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 18:16:02.504775 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.504755 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 18:16:02.504852 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.504825 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-846cr\"" Apr 16 18:16:02.523696 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.523670 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7fff8cc5c4-dk27h"] Apr 16 18:16:02.523830 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.523702 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7vbzm"] Apr 16 18:16:02.524221 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.524200 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-bndvz" Apr 16 18:16:02.527092 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.527072 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fxg25\"" Apr 16 18:16:02.528223 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.528197 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 18:16:02.529919 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.529902 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 18:16:02.541770 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.541753 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 18:16:02.542007 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.541992 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bndvz"] Apr 16 18:16:02.542051 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.542012 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7vbzm"] Apr 16 18:16:02.542101 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.542091 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7vbzm" Apr 16 18:16:02.544749 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.544731 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 18:16:02.544846 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.544787 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 18:16:02.545012 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.544998 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4scgb\"" Apr 16 18:16:02.546465 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.546446 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 18:16:02.659736 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.659702 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-h9xxg\" (UID: \"71df2187-c914-4b58-8d61-6fcaacaefd11\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-h9xxg" Apr 16 18:16:02.659736 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.659736 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh7xt\" (UniqueName: \"kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-kube-api-access-qh7xt\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:16:02.659933 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.659757 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0913fe98-7bbc-41d3-9144-086892d07104-config-volume\") pod \"dns-default-bndvz\" (UID: \"0913fe98-7bbc-41d3-9144-086892d07104\") " pod="openshift-dns/dns-default-bndvz" Apr 16 18:16:02.659933 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.659791 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/efc53255-caaf-4b68-9cd3-c6118907f500-image-registry-private-configuration\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:16:02.659933 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.659815 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:16:02.659933 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.659831 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert\") pod \"ingress-canary-7vbzm\" (UID: \"8d155535-fa59-4777-80cf-fdba34134958\") " pod="openshift-ingress-canary/ingress-canary-7vbzm" Apr 16 18:16:02.659933 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.659845 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjg62\" (UniqueName: \"kubernetes.io/projected/8d155535-fa59-4777-80cf-fdba34134958-kube-api-access-gjg62\") pod \"ingress-canary-7vbzm\" (UID: 
\"8d155535-fa59-4777-80cf-fdba34134958\") " pod="openshift-ingress-canary/ingress-canary-7vbzm" Apr 16 18:16:02.659933 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.659863 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26ct8\" (UniqueName: \"kubernetes.io/projected/0913fe98-7bbc-41d3-9144-086892d07104-kube-api-access-26ct8\") pod \"dns-default-bndvz\" (UID: \"0913fe98-7bbc-41d3-9144-086892d07104\") " pod="openshift-dns/dns-default-bndvz" Apr 16 18:16:02.659933 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.659907 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/efc53255-caaf-4b68-9cd3-c6118907f500-installation-pull-secrets\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:16:02.660124 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.659988 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0913fe98-7bbc-41d3-9144-086892d07104-metrics-tls\") pod \"dns-default-bndvz\" (UID: \"0913fe98-7bbc-41d3-9144-086892d07104\") " pod="openshift-dns/dns-default-bndvz" Apr 16 18:16:02.660124 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.660012 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/efc53255-caaf-4b68-9cd3-c6118907f500-trusted-ca\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:16:02.660124 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.660035 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0913fe98-7bbc-41d3-9144-086892d07104-tmp-dir\") pod \"dns-default-bndvz\" (UID: \"0913fe98-7bbc-41d3-9144-086892d07104\") " pod="openshift-dns/dns-default-bndvz" Apr 16 18:16:02.660124 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.660083 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/71df2187-c914-4b58-8d61-6fcaacaefd11-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-h9xxg\" (UID: \"71df2187-c914-4b58-8d61-6fcaacaefd11\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-h9xxg" Apr 16 18:16:02.660124 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.660105 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-bound-sa-token\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:16:02.660262 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.660150 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/efc53255-caaf-4b68-9cd3-c6118907f500-ca-trust-extracted\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:16:02.660262 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.660166 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/efc53255-caaf-4b68-9cd3-c6118907f500-registry-certificates\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " 
pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:16:02.761113 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.761037 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/efc53255-caaf-4b68-9cd3-c6118907f500-image-registry-private-configuration\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:16:02.761113 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.761084 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:16:02.761329 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:02.761180 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:16:02.761329 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:02.761191 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7fff8cc5c4-dk27h: secret "image-registry-tls" not found Apr 16 18:16:02.761329 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.761191 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert\") pod \"ingress-canary-7vbzm\" (UID: \"8d155535-fa59-4777-80cf-fdba34134958\") " pod="openshift-ingress-canary/ingress-canary-7vbzm" Apr 16 18:16:02.761329 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.761226 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gjg62\" (UniqueName: \"kubernetes.io/projected/8d155535-fa59-4777-80cf-fdba34134958-kube-api-access-gjg62\") pod \"ingress-canary-7vbzm\" (UID: \"8d155535-fa59-4777-80cf-fdba34134958\") " pod="openshift-ingress-canary/ingress-canary-7vbzm" Apr 16 18:16:02.761329 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:02.761253 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls podName:efc53255-caaf-4b68-9cd3-c6118907f500 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:03.261236073 +0000 UTC m=+34.617378125 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls") pod "image-registry-7fff8cc5c4-dk27h" (UID: "efc53255-caaf-4b68-9cd3-c6118907f500") : secret "image-registry-tls" not found Apr 16 18:16:02.761329 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.761274 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26ct8\" (UniqueName: \"kubernetes.io/projected/0913fe98-7bbc-41d3-9144-086892d07104-kube-api-access-26ct8\") pod \"dns-default-bndvz\" (UID: \"0913fe98-7bbc-41d3-9144-086892d07104\") " pod="openshift-dns/dns-default-bndvz" Apr 16 18:16:02.761329 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.761293 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/efc53255-caaf-4b68-9cd3-c6118907f500-installation-pull-secrets\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:16:02.761329 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.761326 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/0913fe98-7bbc-41d3-9144-086892d07104-metrics-tls\") pod \"dns-default-bndvz\" (UID: \"0913fe98-7bbc-41d3-9144-086892d07104\") " pod="openshift-dns/dns-default-bndvz" Apr 16 18:16:02.761807 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:02.761333 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:16:02.761807 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.761346 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/efc53255-caaf-4b68-9cd3-c6118907f500-trusted-ca\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:16:02.761807 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.761361 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0913fe98-7bbc-41d3-9144-086892d07104-tmp-dir\") pod \"dns-default-bndvz\" (UID: \"0913fe98-7bbc-41d3-9144-086892d07104\") " pod="openshift-dns/dns-default-bndvz" Apr 16 18:16:02.761807 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:02.761386 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert podName:8d155535-fa59-4777-80cf-fdba34134958 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:03.261370376 +0000 UTC m=+34.617512443 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert") pod "ingress-canary-7vbzm" (UID: "8d155535-fa59-4777-80cf-fdba34134958") : secret "canary-serving-cert" not found Apr 16 18:16:02.761807 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.761424 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/71df2187-c914-4b58-8d61-6fcaacaefd11-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-h9xxg\" (UID: \"71df2187-c914-4b58-8d61-6fcaacaefd11\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-h9xxg" Apr 16 18:16:02.761807 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.761451 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-bound-sa-token\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:16:02.761807 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:02.761461 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:16:02.761807 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:02.761534 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0913fe98-7bbc-41d3-9144-086892d07104-metrics-tls podName:0913fe98-7bbc-41d3-9144-086892d07104 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:03.261500042 +0000 UTC m=+34.617642079 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0913fe98-7bbc-41d3-9144-086892d07104-metrics-tls") pod "dns-default-bndvz" (UID: "0913fe98-7bbc-41d3-9144-086892d07104") : secret "dns-default-metrics-tls" not found Apr 16 18:16:02.761807 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.761557 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/efc53255-caaf-4b68-9cd3-c6118907f500-ca-trust-extracted\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:16:02.761807 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.761586 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/efc53255-caaf-4b68-9cd3-c6118907f500-registry-certificates\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:16:02.761807 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.761633 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-h9xxg\" (UID: \"71df2187-c914-4b58-8d61-6fcaacaefd11\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-h9xxg" Apr 16 18:16:02.761807 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.761658 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qh7xt\" (UniqueName: \"kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-kube-api-access-qh7xt\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " 
pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:16:02.761807 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.761675 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0913fe98-7bbc-41d3-9144-086892d07104-tmp-dir\") pod \"dns-default-bndvz\" (UID: \"0913fe98-7bbc-41d3-9144-086892d07104\") " pod="openshift-dns/dns-default-bndvz" Apr 16 18:16:02.761807 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.761682 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0913fe98-7bbc-41d3-9144-086892d07104-config-volume\") pod \"dns-default-bndvz\" (UID: \"0913fe98-7bbc-41d3-9144-086892d07104\") " pod="openshift-dns/dns-default-bndvz" Apr 16 18:16:02.761807 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:02.761789 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:16:02.762483 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:02.761854 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert podName:71df2187-c914-4b58-8d61-6fcaacaefd11 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:03.261830803 +0000 UTC m=+34.617972838 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-h9xxg" (UID: "71df2187-c914-4b58-8d61-6fcaacaefd11") : secret "networking-console-plugin-cert" not found Apr 16 18:16:02.762483 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.762066 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/efc53255-caaf-4b68-9cd3-c6118907f500-ca-trust-extracted\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:16:02.762483 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.762342 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0913fe98-7bbc-41d3-9144-086892d07104-config-volume\") pod \"dns-default-bndvz\" (UID: \"0913fe98-7bbc-41d3-9144-086892d07104\") " pod="openshift-dns/dns-default-bndvz" Apr 16 18:16:02.762483 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.762374 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/71df2187-c914-4b58-8d61-6fcaacaefd11-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-h9xxg\" (UID: \"71df2187-c914-4b58-8d61-6fcaacaefd11\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-h9xxg" Apr 16 18:16:02.762483 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.762387 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/efc53255-caaf-4b68-9cd3-c6118907f500-trusted-ca\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 
18:16:02.762483 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.762412 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/efc53255-caaf-4b68-9cd3-c6118907f500-registry-certificates\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:16:02.766332 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.766298 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/efc53255-caaf-4b68-9cd3-c6118907f500-image-registry-private-configuration\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:16:02.766458 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.766341 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/efc53255-caaf-4b68-9cd3-c6118907f500-installation-pull-secrets\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:16:02.770848 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.770822 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-26ct8\" (UniqueName: \"kubernetes.io/projected/0913fe98-7bbc-41d3-9144-086892d07104-kube-api-access-26ct8\") pod \"dns-default-bndvz\" (UID: \"0913fe98-7bbc-41d3-9144-086892d07104\") " pod="openshift-dns/dns-default-bndvz" Apr 16 18:16:02.770975 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.770936 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh7xt\" (UniqueName: 
\"kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-kube-api-access-qh7xt\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:16:02.771172 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.771117 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-bound-sa-token\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:16:02.771402 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.771382 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjg62\" (UniqueName: \"kubernetes.io/projected/8d155535-fa59-4777-80cf-fdba34134958-kube-api-access-gjg62\") pod \"ingress-canary-7vbzm\" (UID: \"8d155535-fa59-4777-80cf-fdba34134958\") " pod="openshift-ingress-canary/ingress-canary-7vbzm" Apr 16 18:16:02.963848 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.963812 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5366b941-9d0a-4457-8229-086d574fc5ab-original-pull-secret\") pod \"global-pull-secret-syncer-tphmr\" (UID: \"5366b941-9d0a-4457-8229-086d574fc5ab\") " pod="kube-system/global-pull-secret-syncer-tphmr" Apr 16 18:16:02.964075 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:02.963940 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs\") pod \"network-metrics-daemon-j9gkk\" (UID: \"3fd8f3ce-1a67-4a38-99ec-e368aea03088\") " pod="openshift-multus/network-metrics-daemon-j9gkk" Apr 16 18:16:02.964075 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:02.963979 2573 
secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:16:02.964075 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:02.964047 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:16:02.964075 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:02.964072 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5366b941-9d0a-4457-8229-086d574fc5ab-original-pull-secret podName:5366b941-9d0a-4457-8229-086d574fc5ab nodeName:}" failed. No retries permitted until 2026-04-16 18:16:34.964050973 +0000 UTC m=+66.320193008 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5366b941-9d0a-4457-8229-086d574fc5ab-original-pull-secret") pod "global-pull-secret-syncer-tphmr" (UID: "5366b941-9d0a-4457-8229-086d574fc5ab") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:16:02.964283 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:02.964099 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs podName:3fd8f3ce-1a67-4a38-99ec-e368aea03088 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:34.964081111 +0000 UTC m=+66.320223145 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs") pod "network-metrics-daemon-j9gkk" (UID: "3fd8f3ce-1a67-4a38-99ec-e368aea03088") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:16:03.064342 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:03.064255 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwhqd\" (UniqueName: \"kubernetes.io/projected/619bc9de-2915-4bce-b443-702d489e89af-kube-api-access-jwhqd\") pod \"network-check-target-4cbj8\" (UID: \"619bc9de-2915-4bce-b443-702d489e89af\") " pod="openshift-network-diagnostics/network-check-target-4cbj8" Apr 16 18:16:03.064502 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:03.064443 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:16:03.064502 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:03.064466 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:16:03.064502 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:03.064478 2573 projected.go:194] Error preparing data for projected volume kube-api-access-jwhqd for pod openshift-network-diagnostics/network-check-target-4cbj8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:16:03.064702 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:03.064558 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/619bc9de-2915-4bce-b443-702d489e89af-kube-api-access-jwhqd podName:619bc9de-2915-4bce-b443-702d489e89af nodeName:}" failed. 
No retries permitted until 2026-04-16 18:16:35.064539226 +0000 UTC m=+66.420681276 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-jwhqd" (UniqueName: "kubernetes.io/projected/619bc9de-2915-4bce-b443-702d489e89af-kube-api-access-jwhqd") pod "network-check-target-4cbj8" (UID: "619bc9de-2915-4bce-b443-702d489e89af") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:16:03.266183 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:03.266149 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-h9xxg\" (UID: \"71df2187-c914-4b58-8d61-6fcaacaefd11\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-h9xxg" Apr 16 18:16:03.266570 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:03.266205 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:16:03.266570 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:03.266279 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:16:03.266570 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:03.266289 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7fff8cc5c4-dk27h: secret "image-registry-tls" not found Apr 16 18:16:03.266570 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:03.266287 2573 secret.go:189] 
Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:16:03.266570 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:03.266332 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls podName:efc53255-caaf-4b68-9cd3-c6118907f500 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:04.26631794 +0000 UTC m=+35.622459974 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls") pod "image-registry-7fff8cc5c4-dk27h" (UID: "efc53255-caaf-4b68-9cd3-c6118907f500") : secret "image-registry-tls" not found Apr 16 18:16:03.266570 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:03.266358 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert podName:71df2187-c914-4b58-8d61-6fcaacaefd11 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:04.266341534 +0000 UTC m=+35.622483590 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-h9xxg" (UID: "71df2187-c914-4b58-8d61-6fcaacaefd11") : secret "networking-console-plugin-cert" not found Apr 16 18:16:03.266570 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:03.266383 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert\") pod \"ingress-canary-7vbzm\" (UID: \"8d155535-fa59-4777-80cf-fdba34134958\") " pod="openshift-ingress-canary/ingress-canary-7vbzm" Apr 16 18:16:03.266570 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:03.266420 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0913fe98-7bbc-41d3-9144-086892d07104-metrics-tls\") pod \"dns-default-bndvz\" (UID: \"0913fe98-7bbc-41d3-9144-086892d07104\") " pod="openshift-dns/dns-default-bndvz" Apr 16 18:16:03.266570 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:03.266497 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:16:03.266570 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:03.266529 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:16:03.266570 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:03.266554 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert podName:8d155535-fa59-4777-80cf-fdba34134958 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:04.266543519 +0000 UTC m=+35.622685558 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert") pod "ingress-canary-7vbzm" (UID: "8d155535-fa59-4777-80cf-fdba34134958") : secret "canary-serving-cert" not found Apr 16 18:16:03.266570 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:03.266566 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0913fe98-7bbc-41d3-9144-086892d07104-metrics-tls podName:0913fe98-7bbc-41d3-9144-086892d07104 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:04.266560728 +0000 UTC m=+35.622702762 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0913fe98-7bbc-41d3-9144-086892d07104-metrics-tls") pod "dns-default-bndvz" (UID: "0913fe98-7bbc-41d3-9144-086892d07104") : secret "dns-default-metrics-tls" not found Apr 16 18:16:04.217818 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:04.217773 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cbj8" Apr 16 18:16:04.217818 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:04.217802 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tphmr" Apr 16 18:16:04.218084 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:04.217834 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9gkk" Apr 16 18:16:04.221132 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:04.221108 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 18:16:04.221267 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:04.221106 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:16:04.221267 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:04.221149 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:16:04.221267 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:04.221113 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xl8bl\"" Apr 16 18:16:04.221267 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:04.221113 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:16:04.221267 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:04.221106 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-4plx6\"" Apr 16 18:16:04.274368 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:04.274339 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-h9xxg\" (UID: \"71df2187-c914-4b58-8d61-6fcaacaefd11\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-h9xxg" Apr 16 18:16:04.274755 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:04.274399 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:16:04.274755 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:04.274426 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert\") pod \"ingress-canary-7vbzm\" (UID: \"8d155535-fa59-4777-80cf-fdba34134958\") " pod="openshift-ingress-canary/ingress-canary-7vbzm" Apr 16 18:16:04.274755 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:04.274470 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0913fe98-7bbc-41d3-9144-086892d07104-metrics-tls\") pod \"dns-default-bndvz\" (UID: \"0913fe98-7bbc-41d3-9144-086892d07104\") " pod="openshift-dns/dns-default-bndvz" Apr 16 18:16:04.274755 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:04.274474 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:16:04.274755 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:04.274482 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:16:04.274755 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:04.274498 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7fff8cc5c4-dk27h: secret "image-registry-tls" not found Apr 16 18:16:04.274755 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:04.274551 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert podName:71df2187-c914-4b58-8d61-6fcaacaefd11 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:06.274532718 +0000 UTC m=+37.630674753 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-h9xxg" (UID: "71df2187-c914-4b58-8d61-6fcaacaefd11") : secret "networking-console-plugin-cert" not found Apr 16 18:16:04.274755 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:04.274572 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:16:04.274755 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:04.274573 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls podName:efc53255-caaf-4b68-9cd3-c6118907f500 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:06.274562731 +0000 UTC m=+37.630704768 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls") pod "image-registry-7fff8cc5c4-dk27h" (UID: "efc53255-caaf-4b68-9cd3-c6118907f500") : secret "image-registry-tls" not found Apr 16 18:16:04.274755 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:04.274617 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:16:04.274755 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:04.274625 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert podName:8d155535-fa59-4777-80cf-fdba34134958 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:16:06.274609603 +0000 UTC m=+37.630751642 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert") pod "ingress-canary-7vbzm" (UID: "8d155535-fa59-4777-80cf-fdba34134958") : secret "canary-serving-cert" not found Apr 16 18:16:04.274755 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:04.274659 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0913fe98-7bbc-41d3-9144-086892d07104-metrics-tls podName:0913fe98-7bbc-41d3-9144-086892d07104 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:06.27464673 +0000 UTC m=+37.630788795 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0913fe98-7bbc-41d3-9144-086892d07104-metrics-tls") pod "dns-default-bndvz" (UID: "0913fe98-7bbc-41d3-9144-086892d07104") : secret "dns-default-metrics-tls" not found Apr 16 18:16:06.290891 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:06.290835 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:16:06.290891 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:06.290894 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert\") pod \"ingress-canary-7vbzm\" (UID: \"8d155535-fa59-4777-80cf-fdba34134958\") " pod="openshift-ingress-canary/ingress-canary-7vbzm" Apr 16 18:16:06.291410 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:06.290936 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/0913fe98-7bbc-41d3-9144-086892d07104-metrics-tls\") pod \"dns-default-bndvz\" (UID: \"0913fe98-7bbc-41d3-9144-086892d07104\") " pod="openshift-dns/dns-default-bndvz" Apr 16 18:16:06.291410 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:06.290998 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:16:06.291410 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:06.291016 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7fff8cc5c4-dk27h: secret "image-registry-tls" not found Apr 16 18:16:06.291410 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:06.291018 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-h9xxg\" (UID: \"71df2187-c914-4b58-8d61-6fcaacaefd11\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-h9xxg" Apr 16 18:16:06.291410 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:06.291074 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls podName:efc53255-caaf-4b68-9cd3-c6118907f500 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:10.291055337 +0000 UTC m=+41.647197374 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls") pod "image-registry-7fff8cc5c4-dk27h" (UID: "efc53255-caaf-4b68-9cd3-c6118907f500") : secret "image-registry-tls" not found Apr 16 18:16:06.291410 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:06.291130 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:16:06.291410 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:06.291166 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert podName:71df2187-c914-4b58-8d61-6fcaacaefd11 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:10.291155186 +0000 UTC m=+41.647297223 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-h9xxg" (UID: "71df2187-c914-4b58-8d61-6fcaacaefd11") : secret "networking-console-plugin-cert" not found Apr 16 18:16:06.291410 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:06.291212 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:16:06.291410 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:06.291237 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert podName:8d155535-fa59-4777-80cf-fdba34134958 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:10.291228604 +0000 UTC m=+41.647370638 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert") pod "ingress-canary-7vbzm" (UID: "8d155535-fa59-4777-80cf-fdba34134958") : secret "canary-serving-cert" not found Apr 16 18:16:06.291410 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:06.291283 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:16:06.291410 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:06.291306 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0913fe98-7bbc-41d3-9144-086892d07104-metrics-tls podName:0913fe98-7bbc-41d3-9144-086892d07104 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:10.291298385 +0000 UTC m=+41.647440424 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0913fe98-7bbc-41d3-9144-086892d07104-metrics-tls") pod "dns-default-bndvz" (UID: "0913fe98-7bbc-41d3-9144-086892d07104") : secret "dns-default-metrics-tls" not found Apr 16 18:16:07.364044 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:07.364015 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5xfmp" event={"ID":"d3825245-f2b4-4372-9135-56f1b2145871","Type":"ContainerStarted","Data":"1b793258bec5c9ca3d66e22a9485365c9dc6db3626582af133720b778118a819"} Apr 16 18:16:08.368982 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:08.368952 2573 generic.go:358] "Generic (PLEG): container finished" podID="d3825245-f2b4-4372-9135-56f1b2145871" containerID="1b793258bec5c9ca3d66e22a9485365c9dc6db3626582af133720b778118a819" exitCode=0 Apr 16 18:16:08.369374 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:08.369003 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5xfmp" 
event={"ID":"d3825245-f2b4-4372-9135-56f1b2145871","Type":"ContainerDied","Data":"1b793258bec5c9ca3d66e22a9485365c9dc6db3626582af133720b778118a819"} Apr 16 18:16:09.373561 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:09.373509 2573 generic.go:358] "Generic (PLEG): container finished" podID="d3825245-f2b4-4372-9135-56f1b2145871" containerID="bb4537b911ddd445603af3fbb3f4e9dba11acefac01957d7d12c5beaa682aca4" exitCode=0 Apr 16 18:16:09.373561 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:09.373551 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5xfmp" event={"ID":"d3825245-f2b4-4372-9135-56f1b2145871","Type":"ContainerDied","Data":"bb4537b911ddd445603af3fbb3f4e9dba11acefac01957d7d12c5beaa682aca4"} Apr 16 18:16:10.319106 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:10.319069 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:16:10.319106 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:10.319105 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert\") pod \"ingress-canary-7vbzm\" (UID: \"8d155535-fa59-4777-80cf-fdba34134958\") " pod="openshift-ingress-canary/ingress-canary-7vbzm" Apr 16 18:16:10.319336 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:10.319134 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0913fe98-7bbc-41d3-9144-086892d07104-metrics-tls\") pod \"dns-default-bndvz\" (UID: \"0913fe98-7bbc-41d3-9144-086892d07104\") " pod="openshift-dns/dns-default-bndvz" Apr 16 18:16:10.319336 ip-10-0-130-205 
kubenswrapper[2573]: I0416 18:16:10.319193 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-h9xxg\" (UID: \"71df2187-c914-4b58-8d61-6fcaacaefd11\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-h9xxg" Apr 16 18:16:10.319336 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:10.319207 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:16:10.319336 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:10.319223 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7fff8cc5c4-dk27h: secret "image-registry-tls" not found Apr 16 18:16:10.319336 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:10.319224 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:16:10.319336 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:10.319269 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls podName:efc53255-caaf-4b68-9cd3-c6118907f500 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:18.319254355 +0000 UTC m=+49.675396393 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls") pod "image-registry-7fff8cc5c4-dk27h" (UID: "efc53255-caaf-4b68-9cd3-c6118907f500") : secret "image-registry-tls" not found Apr 16 18:16:10.319336 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:10.319282 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert podName:8d155535-fa59-4777-80cf-fdba34134958 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:18.319276312 +0000 UTC m=+49.675418347 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert") pod "ingress-canary-7vbzm" (UID: "8d155535-fa59-4777-80cf-fdba34134958") : secret "canary-serving-cert" not found Apr 16 18:16:10.319336 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:10.319293 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:16:10.319336 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:10.319306 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:16:10.319336 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:10.319339 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert podName:71df2187-c914-4b58-8d61-6fcaacaefd11 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:18.319322646 +0000 UTC m=+49.675464681 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-h9xxg" (UID: "71df2187-c914-4b58-8d61-6fcaacaefd11") : secret "networking-console-plugin-cert" not found Apr 16 18:16:10.319705 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:10.319357 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0913fe98-7bbc-41d3-9144-086892d07104-metrics-tls podName:0913fe98-7bbc-41d3-9144-086892d07104 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:18.319344831 +0000 UTC m=+49.675486866 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0913fe98-7bbc-41d3-9144-086892d07104-metrics-tls") pod "dns-default-bndvz" (UID: "0913fe98-7bbc-41d3-9144-086892d07104") : secret "dns-default-metrics-tls" not found Apr 16 18:16:10.378835 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:10.378798 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5xfmp" event={"ID":"d3825245-f2b4-4372-9135-56f1b2145871","Type":"ContainerStarted","Data":"e373c99b207cb165d2ba967ba989f2fc26f3f3ace8f659d24acc134dcc003460"} Apr 16 18:16:10.401661 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:10.401613 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5xfmp" podStartSLOduration=6.19244516 podStartE2EDuration="41.401562865s" podCreationTimestamp="2026-04-16 18:15:29 +0000 UTC" firstStartedPulling="2026-04-16 18:15:31.991436865 +0000 UTC m=+3.347578899" lastFinishedPulling="2026-04-16 18:16:07.200554567 +0000 UTC m=+38.556696604" observedRunningTime="2026-04-16 18:16:10.400492463 +0000 UTC m=+41.756634518" watchObservedRunningTime="2026-04-16 18:16:10.401562865 +0000 UTC m=+41.757704919" Apr 16 18:16:18.375939 
ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:18.375897 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-h9xxg\" (UID: \"71df2187-c914-4b58-8d61-6fcaacaefd11\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-h9xxg" Apr 16 18:16:18.376373 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:18.375957 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:16:18.376373 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:18.375977 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert\") pod \"ingress-canary-7vbzm\" (UID: \"8d155535-fa59-4777-80cf-fdba34134958\") " pod="openshift-ingress-canary/ingress-canary-7vbzm" Apr 16 18:16:18.376373 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:18.376004 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0913fe98-7bbc-41d3-9144-086892d07104-metrics-tls\") pod \"dns-default-bndvz\" (UID: \"0913fe98-7bbc-41d3-9144-086892d07104\") " pod="openshift-dns/dns-default-bndvz" Apr 16 18:16:18.376373 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:18.376052 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:16:18.376373 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:18.376070 2573 projected.go:194] Error preparing data for projected 
volume registry-tls for pod openshift-image-registry/image-registry-7fff8cc5c4-dk27h: secret "image-registry-tls" not found Apr 16 18:16:18.376373 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:18.376105 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:16:18.376373 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:18.376123 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls podName:efc53255-caaf-4b68-9cd3-c6118907f500 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:34.37610776 +0000 UTC m=+65.732249794 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls") pod "image-registry-7fff8cc5c4-dk27h" (UID: "efc53255-caaf-4b68-9cd3-c6118907f500") : secret "image-registry-tls" not found Apr 16 18:16:18.376373 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:18.376138 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:16:18.376373 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:18.376050 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:16:18.376373 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:18.376144 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0913fe98-7bbc-41d3-9144-086892d07104-metrics-tls podName:0913fe98-7bbc-41d3-9144-086892d07104 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:34.376132728 +0000 UTC m=+65.732274762 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0913fe98-7bbc-41d3-9144-086892d07104-metrics-tls") pod "dns-default-bndvz" (UID: "0913fe98-7bbc-41d3-9144-086892d07104") : secret "dns-default-metrics-tls" not found Apr 16 18:16:18.376373 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:18.376219 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert podName:8d155535-fa59-4777-80cf-fdba34134958 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:34.376204794 +0000 UTC m=+65.732346828 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert") pod "ingress-canary-7vbzm" (UID: "8d155535-fa59-4777-80cf-fdba34134958") : secret "canary-serving-cert" not found Apr 16 18:16:18.376373 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:18.376231 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert podName:71df2187-c914-4b58-8d61-6fcaacaefd11 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:34.376223511 +0000 UTC m=+65.732365545 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-h9xxg" (UID: "71df2187-c914-4b58-8d61-6fcaacaefd11") : secret "networking-console-plugin-cert" not found Apr 16 18:16:27.355106 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:27.355077 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-slsjs" Apr 16 18:16:34.394219 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:34.394179 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:16:34.394219 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:34.394219 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert\") pod \"ingress-canary-7vbzm\" (UID: \"8d155535-fa59-4777-80cf-fdba34134958\") " pod="openshift-ingress-canary/ingress-canary-7vbzm" Apr 16 18:16:34.394754 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:34.394247 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0913fe98-7bbc-41d3-9144-086892d07104-metrics-tls\") pod \"dns-default-bndvz\" (UID: \"0913fe98-7bbc-41d3-9144-086892d07104\") " pod="openshift-dns/dns-default-bndvz" Apr 16 18:16:34.394754 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:34.394303 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-h9xxg\" (UID: \"71df2187-c914-4b58-8d61-6fcaacaefd11\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-h9xxg" Apr 16 18:16:34.394754 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:34.394321 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:16:34.394754 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:34.394344 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7fff8cc5c4-dk27h: secret "image-registry-tls" not found Apr 16 18:16:34.394754 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:34.394392 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:16:34.394754 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:34.394398 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:16:34.394754 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:34.394408 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls podName:efc53255-caaf-4b68-9cd3-c6118907f500 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:06.394385843 +0000 UTC m=+97.750527897 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls") pod "image-registry-7fff8cc5c4-dk27h" (UID: "efc53255-caaf-4b68-9cd3-c6118907f500") : secret "image-registry-tls" not found Apr 16 18:16:34.394754 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:34.394414 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:16:34.394754 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:34.394448 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert podName:8d155535-fa59-4777-80cf-fdba34134958 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:06.394431528 +0000 UTC m=+97.750573568 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert") pod "ingress-canary-7vbzm" (UID: "8d155535-fa59-4777-80cf-fdba34134958") : secret "canary-serving-cert" not found Apr 16 18:16:34.394754 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:34.394469 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert podName:71df2187-c914-4b58-8d61-6fcaacaefd11 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:06.394458823 +0000 UTC m=+97.750600865 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-h9xxg" (UID: "71df2187-c914-4b58-8d61-6fcaacaefd11") : secret "networking-console-plugin-cert" not found Apr 16 18:16:34.394754 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:34.394492 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0913fe98-7bbc-41d3-9144-086892d07104-metrics-tls podName:0913fe98-7bbc-41d3-9144-086892d07104 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:06.394483514 +0000 UTC m=+97.750625548 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0913fe98-7bbc-41d3-9144-086892d07104-metrics-tls") pod "dns-default-bndvz" (UID: "0913fe98-7bbc-41d3-9144-086892d07104") : secret "dns-default-metrics-tls" not found Apr 16 18:16:34.998933 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:34.998892 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs\") pod \"network-metrics-daemon-j9gkk\" (UID: \"3fd8f3ce-1a67-4a38-99ec-e368aea03088\") " pod="openshift-multus/network-metrics-daemon-j9gkk" Apr 16 18:16:34.999137 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:34.998960 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5366b941-9d0a-4457-8229-086d574fc5ab-original-pull-secret\") pod \"global-pull-secret-syncer-tphmr\" (UID: \"5366b941-9d0a-4457-8229-086d574fc5ab\") " pod="kube-system/global-pull-secret-syncer-tphmr" Apr 16 18:16:35.001574 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:35.001558 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:16:35.001633 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:35.001595 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 18:16:35.010058 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:35.010039 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:16:35.010146 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:16:35.010095 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs podName:3fd8f3ce-1a67-4a38-99ec-e368aea03088 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:39.010080912 +0000 UTC m=+130.366222947 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs") pod "network-metrics-daemon-j9gkk" (UID: "3fd8f3ce-1a67-4a38-99ec-e368aea03088") : secret "metrics-daemon-secret" not found Apr 16 18:16:35.012406 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:35.012386 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5366b941-9d0a-4457-8229-086d574fc5ab-original-pull-secret\") pod \"global-pull-secret-syncer-tphmr\" (UID: \"5366b941-9d0a-4457-8229-086d574fc5ab\") " pod="kube-system/global-pull-secret-syncer-tphmr" Apr 16 18:16:35.099395 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:35.099359 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwhqd\" (UniqueName: \"kubernetes.io/projected/619bc9de-2915-4bce-b443-702d489e89af-kube-api-access-jwhqd\") pod \"network-check-target-4cbj8\" (UID: \"619bc9de-2915-4bce-b443-702d489e89af\") " pod="openshift-network-diagnostics/network-check-target-4cbj8" Apr 16 
18:16:35.101868 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:35.101848 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:16:35.111277 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:35.111259 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:16:35.122272 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:35.122249 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwhqd\" (UniqueName: \"kubernetes.io/projected/619bc9de-2915-4bce-b443-702d489e89af-kube-api-access-jwhqd\") pod \"network-check-target-4cbj8\" (UID: \"619bc9de-2915-4bce-b443-702d489e89af\") " pod="openshift-network-diagnostics/network-check-target-4cbj8" Apr 16 18:16:35.129941 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:35.129923 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xl8bl\"" Apr 16 18:16:35.133561 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:35.133549 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tphmr" Apr 16 18:16:35.138172 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:35.138140 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4cbj8" Apr 16 18:16:35.289211 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:35.289158 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-tphmr"] Apr 16 18:16:35.294095 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:16:35.294069 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5366b941_9d0a_4457_8229_086d574fc5ab.slice/crio-fdac1a6fc90011a757c8536aba3bbc405c286d551c214d989cd1091d48c4d430 WatchSource:0}: Error finding container fdac1a6fc90011a757c8536aba3bbc405c286d551c214d989cd1091d48c4d430: Status 404 returned error can't find the container with id fdac1a6fc90011a757c8536aba3bbc405c286d551c214d989cd1091d48c4d430 Apr 16 18:16:35.308133 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:35.308110 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4cbj8"] Apr 16 18:16:35.311258 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:16:35.311231 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod619bc9de_2915_4bce_b443_702d489e89af.slice/crio-2787525a9d4f300b2fd5ac33e202c2e23dc7a18e61481a8aaded1ed253ef4ba5 WatchSource:0}: Error finding container 2787525a9d4f300b2fd5ac33e202c2e23dc7a18e61481a8aaded1ed253ef4ba5: Status 404 returned error can't find the container with id 2787525a9d4f300b2fd5ac33e202c2e23dc7a18e61481a8aaded1ed253ef4ba5 Apr 16 18:16:35.432729 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:35.432689 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-tphmr" event={"ID":"5366b941-9d0a-4457-8229-086d574fc5ab","Type":"ContainerStarted","Data":"fdac1a6fc90011a757c8536aba3bbc405c286d551c214d989cd1091d48c4d430"} Apr 16 18:16:35.433809 ip-10-0-130-205 kubenswrapper[2573]: I0416 
18:16:35.433786 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4cbj8" event={"ID":"619bc9de-2915-4bce-b443-702d489e89af","Type":"ContainerStarted","Data":"2787525a9d4f300b2fd5ac33e202c2e23dc7a18e61481a8aaded1ed253ef4ba5"} Apr 16 18:16:40.444715 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:40.444671 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-tphmr" event={"ID":"5366b941-9d0a-4457-8229-086d574fc5ab","Type":"ContainerStarted","Data":"f5f395874d10803d2594a57889a398f651af227e01d8e85883bb921e6f768d7c"} Apr 16 18:16:40.445914 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:40.445891 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4cbj8" event={"ID":"619bc9de-2915-4bce-b443-702d489e89af","Type":"ContainerStarted","Data":"bc8e7c6c7d0303ae947dddb626818349ed0724e6917a17cf24277bc3b956f132"} Apr 16 18:16:40.446056 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:40.446007 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-4cbj8" Apr 16 18:16:40.461720 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:40.461675 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-tphmr" podStartSLOduration=64.758794819 podStartE2EDuration="1m9.461660243s" podCreationTimestamp="2026-04-16 18:15:31 +0000 UTC" firstStartedPulling="2026-04-16 18:16:35.295587066 +0000 UTC m=+66.651729100" lastFinishedPulling="2026-04-16 18:16:39.99845247 +0000 UTC m=+71.354594524" observedRunningTime="2026-04-16 18:16:40.460880039 +0000 UTC m=+71.817022094" watchObservedRunningTime="2026-04-16 18:16:40.461660243 +0000 UTC m=+71.817802300" Apr 16 18:16:40.475002 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:16:40.474942 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-diagnostics/network-check-target-4cbj8" podStartSLOduration=66.799065068 podStartE2EDuration="1m11.474923472s" podCreationTimestamp="2026-04-16 18:15:29 +0000 UTC" firstStartedPulling="2026-04-16 18:16:35.312966057 +0000 UTC m=+66.669108091" lastFinishedPulling="2026-04-16 18:16:39.98882446 +0000 UTC m=+71.344966495" observedRunningTime="2026-04-16 18:16:40.474475183 +0000 UTC m=+71.830617240" watchObservedRunningTime="2026-04-16 18:16:40.474923472 +0000 UTC m=+71.831065529"
Apr 16 18:17:06.445688 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:06.445630 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-h9xxg\" (UID: \"71df2187-c914-4b58-8d61-6fcaacaefd11\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-h9xxg"
Apr 16 18:17:06.445688 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:06.445697 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h"
Apr 16 18:17:06.446093 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:06.445716 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert\") pod \"ingress-canary-7vbzm\" (UID: \"8d155535-fa59-4777-80cf-fdba34134958\") " pod="openshift-ingress-canary/ingress-canary-7vbzm"
Apr 16 18:17:06.446093 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:06.445741 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0913fe98-7bbc-41d3-9144-086892d07104-metrics-tls\") pod \"dns-default-bndvz\" (UID: \"0913fe98-7bbc-41d3-9144-086892d07104\") " pod="openshift-dns/dns-default-bndvz"
Apr 16 18:17:06.446093 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:06.445776 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 18:17:06.446093 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:06.445806 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:17:06.446093 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:06.445804 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:17:06.446093 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:06.445826 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7fff8cc5c4-dk27h: secret "image-registry-tls" not found
Apr 16 18:17:06.446093 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:06.445858 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0913fe98-7bbc-41d3-9144-086892d07104-metrics-tls podName:0913fe98-7bbc-41d3-9144-086892d07104 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:10.445845954 +0000 UTC m=+161.801987988 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0913fe98-7bbc-41d3-9144-086892d07104-metrics-tls") pod "dns-default-bndvz" (UID: "0913fe98-7bbc-41d3-9144-086892d07104") : secret "dns-default-metrics-tls" not found
Apr 16 18:17:06.446093 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:06.445866 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:17:06.446093 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:06.445870 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert podName:71df2187-c914-4b58-8d61-6fcaacaefd11 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:10.445865346 +0000 UTC m=+161.802007380 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-h9xxg" (UID: "71df2187-c914-4b58-8d61-6fcaacaefd11") : secret "networking-console-plugin-cert" not found
Apr 16 18:17:06.446093 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:06.445958 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls podName:efc53255-caaf-4b68-9cd3-c6118907f500 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:10.445915719 +0000 UTC m=+161.802057758 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls") pod "image-registry-7fff8cc5c4-dk27h" (UID: "efc53255-caaf-4b68-9cd3-c6118907f500") : secret "image-registry-tls" not found
Apr 16 18:17:06.446093 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:06.445976 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert podName:8d155535-fa59-4777-80cf-fdba34134958 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:10.445968494 +0000 UTC m=+161.802110528 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert") pod "ingress-canary-7vbzm" (UID: "8d155535-fa59-4777-80cf-fdba34134958") : secret "canary-serving-cert" not found
Apr 16 18:17:11.449791 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:11.449763 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-4cbj8"
Apr 16 18:17:39.079061 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:39.079024 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs\") pod \"network-metrics-daemon-j9gkk\" (UID: \"3fd8f3ce-1a67-4a38-99ec-e368aea03088\") " pod="openshift-multus/network-metrics-daemon-j9gkk"
Apr 16 18:17:39.079460 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:39.079140 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 18:17:39.079460 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:39.079201 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs podName:3fd8f3ce-1a67-4a38-99ec-e368aea03088 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:41.079186807 +0000 UTC m=+252.435328841 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs") pod "network-metrics-daemon-j9gkk" (UID: "3fd8f3ce-1a67-4a38-99ec-e368aea03088") : secret "metrics-daemon-secret" not found
Apr 16 18:17:46.156282 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.156245 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-4szbw"]
Apr 16 18:17:46.160077 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.160047 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4szbw"
Apr 16 18:17:46.163535 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.163492 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 18:17:46.163797 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.163776 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 16 18:17:46.163797 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.163791 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 16 18:17:46.163967 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.163835 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-rrz9g\""
Apr 16 18:17:46.163967 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.163835 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 18:17:46.167246 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.167228 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-4szbw"]
Apr 16 18:17:46.234767 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.234730 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/eff34aa9-7480-480d-b76a-e58afdd3fc46-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4szbw\" (UID: \"eff34aa9-7480-480d-b76a-e58afdd3fc46\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4szbw"
Apr 16 18:17:46.234941 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.234788 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/eff34aa9-7480-480d-b76a-e58afdd3fc46-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-4szbw\" (UID: \"eff34aa9-7480-480d-b76a-e58afdd3fc46\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4szbw"
Apr 16 18:17:46.234941 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.234839 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6snm\" (UniqueName: \"kubernetes.io/projected/eff34aa9-7480-480d-b76a-e58afdd3fc46-kube-api-access-r6snm\") pod \"cluster-monitoring-operator-6667474d89-4szbw\" (UID: \"eff34aa9-7480-480d-b76a-e58afdd3fc46\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4szbw"
Apr 16 18:17:46.259164 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.259139 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-kjrt4"]
Apr 16 18:17:46.261681 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.261664 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-596bbb7b7d-nk24r"]
Apr 16 18:17:46.261831 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.261816 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-kjrt4"
Apr 16 18:17:46.264090 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.264057 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 16 18:17:46.264205 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.264153 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-596bbb7b7d-nk24r"
Apr 16 18:17:46.264293 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.264275 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-9smt9\""
Apr 16 18:17:46.264551 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.264533 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 18:17:46.264653 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.264636 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 16 18:17:46.264732 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.264667 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 18:17:46.266256 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.266239 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 16 18:17:46.266356 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.266339 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 16 18:17:46.266423 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.266369 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 16 18:17:46.266423 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.266377 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 16 18:17:46.266537 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.266451 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 16 18:17:46.266680 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.266664 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 16 18:17:46.266996 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.266978 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-l8wz9\""
Apr 16 18:17:46.270935 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.270915 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 16 18:17:46.274284 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.274263 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-kjrt4"]
Apr 16 18:17:46.277279 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.277255 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-596bbb7b7d-nk24r"]
Apr 16 18:17:46.335716 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.335690 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3901ff32-acaf-4296-9b6e-811ec88ce688-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-kjrt4\" (UID: \"3901ff32-acaf-4296-9b6e-811ec88ce688\") " pod="openshift-insights/insights-operator-5785d4fcdd-kjrt4"
Apr 16 18:17:46.335879 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.335721 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23253796-d542-44e0-93b4-6b1d65c09948-service-ca-bundle\") pod \"router-default-596bbb7b7d-nk24r\" (UID: \"23253796-d542-44e0-93b4-6b1d65c09948\") " pod="openshift-ingress/router-default-596bbb7b7d-nk24r"
Apr 16 18:17:46.335879 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.335753 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn2fb\" (UniqueName: \"kubernetes.io/projected/23253796-d542-44e0-93b4-6b1d65c09948-kube-api-access-sn2fb\") pod \"router-default-596bbb7b7d-nk24r\" (UID: \"23253796-d542-44e0-93b4-6b1d65c09948\") " pod="openshift-ingress/router-default-596bbb7b7d-nk24r"
Apr 16 18:17:46.335879 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.335834 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/23253796-d542-44e0-93b4-6b1d65c09948-stats-auth\") pod \"router-default-596bbb7b7d-nk24r\" (UID: \"23253796-d542-44e0-93b4-6b1d65c09948\") " pod="openshift-ingress/router-default-596bbb7b7d-nk24r"
Apr 16 18:17:46.335879 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.335868 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/eff34aa9-7480-480d-b76a-e58afdd3fc46-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4szbw\" (UID: \"eff34aa9-7480-480d-b76a-e58afdd3fc46\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4szbw"
Apr 16 18:17:46.336037 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.335901 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23253796-d542-44e0-93b4-6b1d65c09948-metrics-certs\") pod \"router-default-596bbb7b7d-nk24r\" (UID: \"23253796-d542-44e0-93b4-6b1d65c09948\") " pod="openshift-ingress/router-default-596bbb7b7d-nk24r"
Apr 16 18:17:46.336037 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:46.335979 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:17:46.336037 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.335993 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r6snm\" (UniqueName: \"kubernetes.io/projected/eff34aa9-7480-480d-b76a-e58afdd3fc46-kube-api-access-r6snm\") pod \"cluster-monitoring-operator-6667474d89-4szbw\" (UID: \"eff34aa9-7480-480d-b76a-e58afdd3fc46\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4szbw"
Apr 16 18:17:46.336037 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.336018 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/23253796-d542-44e0-93b4-6b1d65c09948-default-certificate\") pod \"router-default-596bbb7b7d-nk24r\" (UID: \"23253796-d542-44e0-93b4-6b1d65c09948\") " pod="openshift-ingress/router-default-596bbb7b7d-nk24r"
Apr 16 18:17:46.336037 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:46.336036 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eff34aa9-7480-480d-b76a-e58afdd3fc46-cluster-monitoring-operator-tls podName:eff34aa9-7480-480d-b76a-e58afdd3fc46 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:46.836020796 +0000 UTC m=+138.192162831 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/eff34aa9-7480-480d-b76a-e58afdd3fc46-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-4szbw" (UID: "eff34aa9-7480-480d-b76a-e58afdd3fc46") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:17:46.336267 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.336102 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75nbr\" (UniqueName: \"kubernetes.io/projected/3901ff32-acaf-4296-9b6e-811ec88ce688-kube-api-access-75nbr\") pod \"insights-operator-5785d4fcdd-kjrt4\" (UID: \"3901ff32-acaf-4296-9b6e-811ec88ce688\") " pod="openshift-insights/insights-operator-5785d4fcdd-kjrt4"
Apr 16 18:17:46.336267 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.336139 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3901ff32-acaf-4296-9b6e-811ec88ce688-serving-cert\") pod \"insights-operator-5785d4fcdd-kjrt4\" (UID: \"3901ff32-acaf-4296-9b6e-811ec88ce688\") " pod="openshift-insights/insights-operator-5785d4fcdd-kjrt4"
Apr 16 18:17:46.336267 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.336167 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/3901ff32-acaf-4296-9b6e-811ec88ce688-snapshots\") pod \"insights-operator-5785d4fcdd-kjrt4\" (UID: \"3901ff32-acaf-4296-9b6e-811ec88ce688\") " pod="openshift-insights/insights-operator-5785d4fcdd-kjrt4"
Apr 16 18:17:46.336267 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.336194 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3901ff32-acaf-4296-9b6e-811ec88ce688-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-kjrt4\" (UID: \"3901ff32-acaf-4296-9b6e-811ec88ce688\") " pod="openshift-insights/insights-operator-5785d4fcdd-kjrt4"
Apr 16 18:17:46.336267 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.336250 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3901ff32-acaf-4296-9b6e-811ec88ce688-tmp\") pod \"insights-operator-5785d4fcdd-kjrt4\" (UID: \"3901ff32-acaf-4296-9b6e-811ec88ce688\") " pod="openshift-insights/insights-operator-5785d4fcdd-kjrt4"
Apr 16 18:17:46.336494 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.336289 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/eff34aa9-7480-480d-b76a-e58afdd3fc46-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-4szbw\" (UID: \"eff34aa9-7480-480d-b76a-e58afdd3fc46\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4szbw"
Apr 16 18:17:46.336967 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.336944 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/eff34aa9-7480-480d-b76a-e58afdd3fc46-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-4szbw\" (UID: \"eff34aa9-7480-480d-b76a-e58afdd3fc46\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4szbw"
Apr 16 18:17:46.345983 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.345964 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6snm\" (UniqueName: \"kubernetes.io/projected/eff34aa9-7480-480d-b76a-e58afdd3fc46-kube-api-access-r6snm\") pod \"cluster-monitoring-operator-6667474d89-4szbw\" (UID: \"eff34aa9-7480-480d-b76a-e58afdd3fc46\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4szbw"
Apr 16 18:17:46.436945 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.436821 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/23253796-d542-44e0-93b4-6b1d65c09948-stats-auth\") pod \"router-default-596bbb7b7d-nk24r\" (UID: \"23253796-d542-44e0-93b4-6b1d65c09948\") " pod="openshift-ingress/router-default-596bbb7b7d-nk24r"
Apr 16 18:17:46.436945 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.436886 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23253796-d542-44e0-93b4-6b1d65c09948-metrics-certs\") pod \"router-default-596bbb7b7d-nk24r\" (UID: \"23253796-d542-44e0-93b4-6b1d65c09948\") " pod="openshift-ingress/router-default-596bbb7b7d-nk24r"
Apr 16 18:17:46.436945 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.436926 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/23253796-d542-44e0-93b4-6b1d65c09948-default-certificate\") pod \"router-default-596bbb7b7d-nk24r\" (UID: \"23253796-d542-44e0-93b4-6b1d65c09948\") " pod="openshift-ingress/router-default-596bbb7b7d-nk24r"
Apr 16 18:17:46.437254 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.436954 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75nbr\" (UniqueName: \"kubernetes.io/projected/3901ff32-acaf-4296-9b6e-811ec88ce688-kube-api-access-75nbr\") pod \"insights-operator-5785d4fcdd-kjrt4\" (UID: \"3901ff32-acaf-4296-9b6e-811ec88ce688\") " pod="openshift-insights/insights-operator-5785d4fcdd-kjrt4"
Apr 16 18:17:46.437254 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:46.436966 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 18:17:46.437254 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.436976 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3901ff32-acaf-4296-9b6e-811ec88ce688-serving-cert\") pod \"insights-operator-5785d4fcdd-kjrt4\" (UID: \"3901ff32-acaf-4296-9b6e-811ec88ce688\") " pod="openshift-insights/insights-operator-5785d4fcdd-kjrt4"
Apr 16 18:17:46.437254 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.436999 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/3901ff32-acaf-4296-9b6e-811ec88ce688-snapshots\") pod \"insights-operator-5785d4fcdd-kjrt4\" (UID: \"3901ff32-acaf-4296-9b6e-811ec88ce688\") " pod="openshift-insights/insights-operator-5785d4fcdd-kjrt4"
Apr 16 18:17:46.437254 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:46.437026 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23253796-d542-44e0-93b4-6b1d65c09948-metrics-certs podName:23253796-d542-44e0-93b4-6b1d65c09948 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:46.937007148 +0000 UTC m=+138.293149183 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/23253796-d542-44e0-93b4-6b1d65c09948-metrics-certs") pod "router-default-596bbb7b7d-nk24r" (UID: "23253796-d542-44e0-93b4-6b1d65c09948") : secret "router-metrics-certs-default" not found
Apr 16 18:17:46.437254 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.437065 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3901ff32-acaf-4296-9b6e-811ec88ce688-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-kjrt4\" (UID: \"3901ff32-acaf-4296-9b6e-811ec88ce688\") " pod="openshift-insights/insights-operator-5785d4fcdd-kjrt4"
Apr 16 18:17:46.437254 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.437102 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3901ff32-acaf-4296-9b6e-811ec88ce688-tmp\") pod \"insights-operator-5785d4fcdd-kjrt4\" (UID: \"3901ff32-acaf-4296-9b6e-811ec88ce688\") " pod="openshift-insights/insights-operator-5785d4fcdd-kjrt4"
Apr 16 18:17:46.437254 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.437143 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3901ff32-acaf-4296-9b6e-811ec88ce688-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-kjrt4\" (UID: \"3901ff32-acaf-4296-9b6e-811ec88ce688\") " pod="openshift-insights/insights-operator-5785d4fcdd-kjrt4"
Apr 16 18:17:46.437254 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.437172 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23253796-d542-44e0-93b4-6b1d65c09948-service-ca-bundle\") pod \"router-default-596bbb7b7d-nk24r\" (UID: \"23253796-d542-44e0-93b4-6b1d65c09948\") " pod="openshift-ingress/router-default-596bbb7b7d-nk24r"
Apr 16 18:17:46.437254 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.437215 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sn2fb\" (UniqueName: \"kubernetes.io/projected/23253796-d542-44e0-93b4-6b1d65c09948-kube-api-access-sn2fb\") pod \"router-default-596bbb7b7d-nk24r\" (UID: \"23253796-d542-44e0-93b4-6b1d65c09948\") " pod="openshift-ingress/router-default-596bbb7b7d-nk24r"
Apr 16 18:17:46.437759 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:46.437592 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/23253796-d542-44e0-93b4-6b1d65c09948-service-ca-bundle podName:23253796-d542-44e0-93b4-6b1d65c09948 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:46.937575593 +0000 UTC m=+138.293717633 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/23253796-d542-44e0-93b4-6b1d65c09948-service-ca-bundle") pod "router-default-596bbb7b7d-nk24r" (UID: "23253796-d542-44e0-93b4-6b1d65c09948") : configmap references non-existent config key: service-ca.crt
Apr 16 18:17:46.437759 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.437702 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3901ff32-acaf-4296-9b6e-811ec88ce688-tmp\") pod \"insights-operator-5785d4fcdd-kjrt4\" (UID: \"3901ff32-acaf-4296-9b6e-811ec88ce688\") " pod="openshift-insights/insights-operator-5785d4fcdd-kjrt4"
Apr 16 18:17:46.437940 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.437917 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3901ff32-acaf-4296-9b6e-811ec88ce688-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-kjrt4\" (UID: \"3901ff32-acaf-4296-9b6e-811ec88ce688\") " pod="openshift-insights/insights-operator-5785d4fcdd-kjrt4"
Apr 16 18:17:46.438122 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.438098 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/3901ff32-acaf-4296-9b6e-811ec88ce688-snapshots\") pod \"insights-operator-5785d4fcdd-kjrt4\" (UID: \"3901ff32-acaf-4296-9b6e-811ec88ce688\") " pod="openshift-insights/insights-operator-5785d4fcdd-kjrt4"
Apr 16 18:17:46.438250 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.438234 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3901ff32-acaf-4296-9b6e-811ec88ce688-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-kjrt4\" (UID: \"3901ff32-acaf-4296-9b6e-811ec88ce688\") " pod="openshift-insights/insights-operator-5785d4fcdd-kjrt4"
Apr 16 18:17:46.439473 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.439454 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3901ff32-acaf-4296-9b6e-811ec88ce688-serving-cert\") pod \"insights-operator-5785d4fcdd-kjrt4\" (UID: \"3901ff32-acaf-4296-9b6e-811ec88ce688\") " pod="openshift-insights/insights-operator-5785d4fcdd-kjrt4"
Apr 16 18:17:46.439895 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.439879 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/23253796-d542-44e0-93b4-6b1d65c09948-default-certificate\") pod \"router-default-596bbb7b7d-nk24r\" (UID: \"23253796-d542-44e0-93b4-6b1d65c09948\") " pod="openshift-ingress/router-default-596bbb7b7d-nk24r"
Apr 16 18:17:46.439963 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.439885 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/23253796-d542-44e0-93b4-6b1d65c09948-stats-auth\") pod \"router-default-596bbb7b7d-nk24r\" (UID: \"23253796-d542-44e0-93b4-6b1d65c09948\") " pod="openshift-ingress/router-default-596bbb7b7d-nk24r"
Apr 16 18:17:46.447309 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.447288 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn2fb\" (UniqueName: \"kubernetes.io/projected/23253796-d542-44e0-93b4-6b1d65c09948-kube-api-access-sn2fb\") pod \"router-default-596bbb7b7d-nk24r\" (UID: \"23253796-d542-44e0-93b4-6b1d65c09948\") " pod="openshift-ingress/router-default-596bbb7b7d-nk24r"
Apr 16 18:17:46.447829 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.447807 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75nbr\" (UniqueName: \"kubernetes.io/projected/3901ff32-acaf-4296-9b6e-811ec88ce688-kube-api-access-75nbr\") pod \"insights-operator-5785d4fcdd-kjrt4\" (UID: \"3901ff32-acaf-4296-9b6e-811ec88ce688\") " pod="openshift-insights/insights-operator-5785d4fcdd-kjrt4"
Apr 16 18:17:46.573184 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.573153 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-kjrt4"
Apr 16 18:17:46.690027 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.689954 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-kjrt4"]
Apr 16 18:17:46.692883 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:17:46.692853 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3901ff32_acaf_4296_9b6e_811ec88ce688.slice/crio-9bf0725cf7df4811fa1cb36a370317bcf2e44ca85aef03ea933f68c62fd17f9c WatchSource:0}: Error finding container 9bf0725cf7df4811fa1cb36a370317bcf2e44ca85aef03ea933f68c62fd17f9c: Status 404 returned error can't find the container with id 9bf0725cf7df4811fa1cb36a370317bcf2e44ca85aef03ea933f68c62fd17f9c
Apr 16 18:17:46.841477 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.841433 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/eff34aa9-7480-480d-b76a-e58afdd3fc46-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4szbw\" (UID: \"eff34aa9-7480-480d-b76a-e58afdd3fc46\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4szbw"
Apr 16 18:17:46.841689 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:46.841603 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:17:46.841741 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:46.841689 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eff34aa9-7480-480d-b76a-e58afdd3fc46-cluster-monitoring-operator-tls podName:eff34aa9-7480-480d-b76a-e58afdd3fc46 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:47.841672792 +0000 UTC m=+139.197814827 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/eff34aa9-7480-480d-b76a-e58afdd3fc46-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-4szbw" (UID: "eff34aa9-7480-480d-b76a-e58afdd3fc46") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:17:46.942955 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.942866 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23253796-d542-44e0-93b4-6b1d65c09948-service-ca-bundle\") pod \"router-default-596bbb7b7d-nk24r\" (UID: \"23253796-d542-44e0-93b4-6b1d65c09948\") " pod="openshift-ingress/router-default-596bbb7b7d-nk24r"
Apr 16 18:17:46.942955 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:46.942953 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23253796-d542-44e0-93b4-6b1d65c09948-metrics-certs\") pod \"router-default-596bbb7b7d-nk24r\" (UID: \"23253796-d542-44e0-93b4-6b1d65c09948\") " pod="openshift-ingress/router-default-596bbb7b7d-nk24r"
Apr 16 18:17:46.943144 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:46.943035 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 18:17:46.943144 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:46.943031 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/23253796-d542-44e0-93b4-6b1d65c09948-service-ca-bundle podName:23253796-d542-44e0-93b4-6b1d65c09948 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:47.943013901 +0000 UTC m=+139.299155937 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/23253796-d542-44e0-93b4-6b1d65c09948-service-ca-bundle") pod "router-default-596bbb7b7d-nk24r" (UID: "23253796-d542-44e0-93b4-6b1d65c09948") : configmap references non-existent config key: service-ca.crt
Apr 16 18:17:46.943144 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:46.943073 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23253796-d542-44e0-93b4-6b1d65c09948-metrics-certs podName:23253796-d542-44e0-93b4-6b1d65c09948 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:47.943062936 +0000 UTC m=+139.299204971 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/23253796-d542-44e0-93b4-6b1d65c09948-metrics-certs") pod "router-default-596bbb7b7d-nk24r" (UID: "23253796-d542-44e0-93b4-6b1d65c09948") : secret "router-metrics-certs-default" not found
Apr 16 18:17:47.573602 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:47.573566 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-kjrt4" event={"ID":"3901ff32-acaf-4296-9b6e-811ec88ce688","Type":"ContainerStarted","Data":"9bf0725cf7df4811fa1cb36a370317bcf2e44ca85aef03ea933f68c62fd17f9c"}
Apr 16 18:17:47.850913 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:47.850814 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/eff34aa9-7480-480d-b76a-e58afdd3fc46-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4szbw\" (UID: \"eff34aa9-7480-480d-b76a-e58afdd3fc46\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4szbw"
Apr 16 18:17:47.851085 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:47.850974 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:17:47.851085 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:47.851058 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eff34aa9-7480-480d-b76a-e58afdd3fc46-cluster-monitoring-operator-tls podName:eff34aa9-7480-480d-b76a-e58afdd3fc46 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:49.851037424 +0000 UTC m=+141.207179462 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/eff34aa9-7480-480d-b76a-e58afdd3fc46-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-4szbw" (UID: "eff34aa9-7480-480d-b76a-e58afdd3fc46") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:17:47.952234 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:47.952189 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23253796-d542-44e0-93b4-6b1d65c09948-metrics-certs\") pod \"router-default-596bbb7b7d-nk24r\" (UID: \"23253796-d542-44e0-93b4-6b1d65c09948\") " pod="openshift-ingress/router-default-596bbb7b7d-nk24r"
Apr 16 18:17:47.952411 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:47.952297 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23253796-d542-44e0-93b4-6b1d65c09948-service-ca-bundle\") pod \"router-default-596bbb7b7d-nk24r\" (UID: \"23253796-d542-44e0-93b4-6b1d65c09948\") " pod="openshift-ingress/router-default-596bbb7b7d-nk24r"
Apr 16 18:17:47.952411 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:47.952361 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 18:17:47.952537 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:47.952445 2573 nestedpendingoperations.go:348] Operation for
"{volumeName:kubernetes.io/secret/23253796-d542-44e0-93b4-6b1d65c09948-metrics-certs podName:23253796-d542-44e0-93b4-6b1d65c09948 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:49.952421982 +0000 UTC m=+141.308564017 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/23253796-d542-44e0-93b4-6b1d65c09948-metrics-certs") pod "router-default-596bbb7b7d-nk24r" (UID: "23253796-d542-44e0-93b4-6b1d65c09948") : secret "router-metrics-certs-default" not found Apr 16 18:17:47.952537 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:47.952469 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/23253796-d542-44e0-93b4-6b1d65c09948-service-ca-bundle podName:23253796-d542-44e0-93b4-6b1d65c09948 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:49.952453371 +0000 UTC m=+141.308595428 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/23253796-d542-44e0-93b4-6b1d65c09948-service-ca-bundle") pod "router-default-596bbb7b7d-nk24r" (UID: "23253796-d542-44e0-93b4-6b1d65c09948") : configmap references non-existent config key: service-ca.crt Apr 16 18:17:48.576535 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:48.576432 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-kjrt4" event={"ID":"3901ff32-acaf-4296-9b6e-811ec88ce688","Type":"ContainerStarted","Data":"8ab45cc2d4a989b9a6e28d22ee4bcdf4ea216207a10a688bba587348121cf7a3"} Apr 16 18:17:49.166260 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:49.166205 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-kjrt4" podStartSLOduration=1.602830874 podStartE2EDuration="3.166183866s" podCreationTimestamp="2026-04-16 18:17:46 +0000 UTC" firstStartedPulling="2026-04-16 18:17:46.694585485 +0000 UTC 
m=+138.050727520" lastFinishedPulling="2026-04-16 18:17:48.257938477 +0000 UTC m=+139.614080512" observedRunningTime="2026-04-16 18:17:48.593633629 +0000 UTC m=+139.949775697" watchObservedRunningTime="2026-04-16 18:17:49.166183866 +0000 UTC m=+140.522325923" Apr 16 18:17:49.167161 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:49.167134 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-ht7hf"] Apr 16 18:17:49.170077 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:49.170057 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-ht7hf" Apr 16 18:17:49.172255 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:49.172235 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:17:49.172342 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:49.172268 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-gmmrs\"" Apr 16 18:17:49.172342 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:49.172236 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 18:17:49.182863 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:49.180735 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-ht7hf"] Apr 16 18:17:49.263872 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:49.263838 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4dps\" (UniqueName: \"kubernetes.io/projected/194cfdef-87c9-4fe6-b52f-1a569ef6e306-kube-api-access-z4dps\") pod 
\"volume-data-source-validator-7d955d5dd4-ht7hf\" (UID: \"194cfdef-87c9-4fe6-b52f-1a569ef6e306\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-ht7hf" Apr 16 18:17:49.364900 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:49.364866 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4dps\" (UniqueName: \"kubernetes.io/projected/194cfdef-87c9-4fe6-b52f-1a569ef6e306-kube-api-access-z4dps\") pod \"volume-data-source-validator-7d955d5dd4-ht7hf\" (UID: \"194cfdef-87c9-4fe6-b52f-1a569ef6e306\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-ht7hf" Apr 16 18:17:49.374509 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:49.374481 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4dps\" (UniqueName: \"kubernetes.io/projected/194cfdef-87c9-4fe6-b52f-1a569ef6e306-kube-api-access-z4dps\") pod \"volume-data-source-validator-7d955d5dd4-ht7hf\" (UID: \"194cfdef-87c9-4fe6-b52f-1a569ef6e306\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-ht7hf" Apr 16 18:17:49.480266 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:49.480178 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-ht7hf" Apr 16 18:17:49.603452 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:49.603422 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-ht7hf"] Apr 16 18:17:49.607205 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:17:49.607175 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod194cfdef_87c9_4fe6_b52f_1a569ef6e306.slice/crio-4ddcdef965ede0e947c882a4c50998b9eb81424ccc8c3c805e11a528595c93d1 WatchSource:0}: Error finding container 4ddcdef965ede0e947c882a4c50998b9eb81424ccc8c3c805e11a528595c93d1: Status 404 returned error can't find the container with id 4ddcdef965ede0e947c882a4c50998b9eb81424ccc8c3c805e11a528595c93d1 Apr 16 18:17:49.869014 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:49.868925 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/eff34aa9-7480-480d-b76a-e58afdd3fc46-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4szbw\" (UID: \"eff34aa9-7480-480d-b76a-e58afdd3fc46\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4szbw" Apr 16 18:17:49.869157 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:49.869090 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:17:49.869204 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:49.869169 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eff34aa9-7480-480d-b76a-e58afdd3fc46-cluster-monitoring-operator-tls podName:eff34aa9-7480-480d-b76a-e58afdd3fc46 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:17:53.869147995 +0000 UTC m=+145.225290042 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/eff34aa9-7480-480d-b76a-e58afdd3fc46-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-4szbw" (UID: "eff34aa9-7480-480d-b76a-e58afdd3fc46") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:17:49.970193 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:49.970162 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23253796-d542-44e0-93b4-6b1d65c09948-service-ca-bundle\") pod \"router-default-596bbb7b7d-nk24r\" (UID: \"23253796-d542-44e0-93b4-6b1d65c09948\") " pod="openshift-ingress/router-default-596bbb7b7d-nk24r" Apr 16 18:17:49.970342 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:49.970282 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23253796-d542-44e0-93b4-6b1d65c09948-metrics-certs\") pod \"router-default-596bbb7b7d-nk24r\" (UID: \"23253796-d542-44e0-93b4-6b1d65c09948\") " pod="openshift-ingress/router-default-596bbb7b7d-nk24r" Apr 16 18:17:49.970342 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:49.970326 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/23253796-d542-44e0-93b4-6b1d65c09948-service-ca-bundle podName:23253796-d542-44e0-93b4-6b1d65c09948 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:53.970305213 +0000 UTC m=+145.326447305 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/23253796-d542-44e0-93b4-6b1d65c09948-service-ca-bundle") pod "router-default-596bbb7b7d-nk24r" (UID: "23253796-d542-44e0-93b4-6b1d65c09948") : configmap references non-existent config key: service-ca.crt Apr 16 18:17:49.970429 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:49.970383 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:17:49.970429 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:49.970422 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23253796-d542-44e0-93b4-6b1d65c09948-metrics-certs podName:23253796-d542-44e0-93b4-6b1d65c09948 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:53.970407198 +0000 UTC m=+145.326549233 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/23253796-d542-44e0-93b4-6b1d65c09948-metrics-certs") pod "router-default-596bbb7b7d-nk24r" (UID: "23253796-d542-44e0-93b4-6b1d65c09948") : secret "router-metrics-certs-default" not found Apr 16 18:17:50.581915 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:50.581882 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-ht7hf" event={"ID":"194cfdef-87c9-4fe6-b52f-1a569ef6e306","Type":"ContainerStarted","Data":"4ddcdef965ede0e947c882a4c50998b9eb81424ccc8c3c805e11a528595c93d1"} Apr 16 18:17:51.585333 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:51.585247 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-ht7hf" event={"ID":"194cfdef-87c9-4fe6-b52f-1a569ef6e306","Type":"ContainerStarted","Data":"d2505027ea3b6d761c296da2a08bd41f48cacfd3efd9c33bf11c80b595eb0316"} Apr 16 18:17:51.601587 ip-10-0-130-205 
kubenswrapper[2573]: I0416 18:17:51.601504 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-ht7hf" podStartSLOduration=0.94645996 podStartE2EDuration="2.601487211s" podCreationTimestamp="2026-04-16 18:17:49 +0000 UTC" firstStartedPulling="2026-04-16 18:17:49.609013331 +0000 UTC m=+140.965155366" lastFinishedPulling="2026-04-16 18:17:51.264040581 +0000 UTC m=+142.620182617" observedRunningTime="2026-04-16 18:17:51.600441379 +0000 UTC m=+142.956583435" watchObservedRunningTime="2026-04-16 18:17:51.601487211 +0000 UTC m=+142.957629271" Apr 16 18:17:52.176928 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:52.176899 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jl2lc_3d37a2e5-3988-4400-95f4-1baaf11b42a8/dns-node-resolver/0.log" Apr 16 18:17:52.976786 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:52.976762 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kd96x_e4ae6d82-301f-44be-85ce-8d3b88e0d6e1/node-ca/0.log" Apr 16 18:17:53.902882 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:53.902827 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/eff34aa9-7480-480d-b76a-e58afdd3fc46-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4szbw\" (UID: \"eff34aa9-7480-480d-b76a-e58afdd3fc46\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4szbw" Apr 16 18:17:53.903069 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:53.902981 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:17:53.903069 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:53.903046 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/eff34aa9-7480-480d-b76a-e58afdd3fc46-cluster-monitoring-operator-tls podName:eff34aa9-7480-480d-b76a-e58afdd3fc46 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:01.903028896 +0000 UTC m=+153.259170932 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/eff34aa9-7480-480d-b76a-e58afdd3fc46-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-4szbw" (UID: "eff34aa9-7480-480d-b76a-e58afdd3fc46") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:17:54.004161 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:54.004119 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23253796-d542-44e0-93b4-6b1d65c09948-service-ca-bundle\") pod \"router-default-596bbb7b7d-nk24r\" (UID: \"23253796-d542-44e0-93b4-6b1d65c09948\") " pod="openshift-ingress/router-default-596bbb7b7d-nk24r" Apr 16 18:17:54.004501 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:17:54.004220 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23253796-d542-44e0-93b4-6b1d65c09948-metrics-certs\") pod \"router-default-596bbb7b7d-nk24r\" (UID: \"23253796-d542-44e0-93b4-6b1d65c09948\") " pod="openshift-ingress/router-default-596bbb7b7d-nk24r" Apr 16 18:17:54.004501 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:54.004284 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/23253796-d542-44e0-93b4-6b1d65c09948-service-ca-bundle podName:23253796-d542-44e0-93b4-6b1d65c09948 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:02.004267073 +0000 UTC m=+153.360409107 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/23253796-d542-44e0-93b4-6b1d65c09948-service-ca-bundle") pod "router-default-596bbb7b7d-nk24r" (UID: "23253796-d542-44e0-93b4-6b1d65c09948") : configmap references non-existent config key: service-ca.crt Apr 16 18:17:54.004501 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:54.004315 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:17:54.004501 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:17:54.004354 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23253796-d542-44e0-93b4-6b1d65c09948-metrics-certs podName:23253796-d542-44e0-93b4-6b1d65c09948 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:02.004343259 +0000 UTC m=+153.360485295 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/23253796-d542-44e0-93b4-6b1d65c09948-metrics-certs") pod "router-default-596bbb7b7d-nk24r" (UID: "23253796-d542-44e0-93b4-6b1d65c09948") : secret "router-metrics-certs-default" not found Apr 16 18:18:01.971359 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:01.971307 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/eff34aa9-7480-480d-b76a-e58afdd3fc46-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4szbw\" (UID: \"eff34aa9-7480-480d-b76a-e58afdd3fc46\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4szbw" Apr 16 18:18:01.971773 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:18:01.971479 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:18:01.971773 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:18:01.971595 2573 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eff34aa9-7480-480d-b76a-e58afdd3fc46-cluster-monitoring-operator-tls podName:eff34aa9-7480-480d-b76a-e58afdd3fc46 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:17.971570178 +0000 UTC m=+169.327712234 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/eff34aa9-7480-480d-b76a-e58afdd3fc46-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-4szbw" (UID: "eff34aa9-7480-480d-b76a-e58afdd3fc46") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:18:02.072336 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:02.072296 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23253796-d542-44e0-93b4-6b1d65c09948-metrics-certs\") pod \"router-default-596bbb7b7d-nk24r\" (UID: \"23253796-d542-44e0-93b4-6b1d65c09948\") " pod="openshift-ingress/router-default-596bbb7b7d-nk24r" Apr 16 18:18:02.072510 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:02.072364 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23253796-d542-44e0-93b4-6b1d65c09948-service-ca-bundle\") pod \"router-default-596bbb7b7d-nk24r\" (UID: \"23253796-d542-44e0-93b4-6b1d65c09948\") " pod="openshift-ingress/router-default-596bbb7b7d-nk24r" Apr 16 18:18:02.072510 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:18:02.072479 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:18:02.072611 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:18:02.072560 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/23253796-d542-44e0-93b4-6b1d65c09948-service-ca-bundle podName:23253796-d542-44e0-93b4-6b1d65c09948 
nodeName:}" failed. No retries permitted until 2026-04-16 18:18:18.072541893 +0000 UTC m=+169.428683929 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/23253796-d542-44e0-93b4-6b1d65c09948-service-ca-bundle") pod "router-default-596bbb7b7d-nk24r" (UID: "23253796-d542-44e0-93b4-6b1d65c09948") : configmap references non-existent config key: service-ca.crt Apr 16 18:18:02.072611 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:18:02.072582 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23253796-d542-44e0-93b4-6b1d65c09948-metrics-certs podName:23253796-d542-44e0-93b4-6b1d65c09948 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:18.072575136 +0000 UTC m=+169.428717171 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/23253796-d542-44e0-93b4-6b1d65c09948-metrics-certs") pod "router-default-596bbb7b7d-nk24r" (UID: "23253796-d542-44e0-93b4-6b1d65c09948") : secret "router-metrics-certs-default" not found Apr 16 18:18:02.217375 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:02.217344 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-7j5tj"] Apr 16 18:18:02.222095 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:02.222056 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-7j5tj" Apr 16 18:18:02.224818 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:02.224790 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 18:18:02.224937 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:02.224871 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 18:18:02.225216 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:02.225198 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 18:18:02.225340 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:02.225322 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-lzthw\"" Apr 16 18:18:02.225414 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:02.225357 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 18:18:02.230505 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:02.230476 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-7j5tj"] Apr 16 18:18:02.273418 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:02.273380 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f2e73ca9-877e-4658-9ca0-029eaf8cbf6d-signing-key\") pod \"service-ca-bfc587fb7-7j5tj\" (UID: \"f2e73ca9-877e-4658-9ca0-029eaf8cbf6d\") " pod="openshift-service-ca/service-ca-bfc587fb7-7j5tj" Apr 16 18:18:02.273591 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:02.273462 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/f2e73ca9-877e-4658-9ca0-029eaf8cbf6d-signing-cabundle\") pod \"service-ca-bfc587fb7-7j5tj\" (UID: \"f2e73ca9-877e-4658-9ca0-029eaf8cbf6d\") " pod="openshift-service-ca/service-ca-bfc587fb7-7j5tj" Apr 16 18:18:02.273591 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:02.273496 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrr67\" (UniqueName: \"kubernetes.io/projected/f2e73ca9-877e-4658-9ca0-029eaf8cbf6d-kube-api-access-jrr67\") pod \"service-ca-bfc587fb7-7j5tj\" (UID: \"f2e73ca9-877e-4658-9ca0-029eaf8cbf6d\") " pod="openshift-service-ca/service-ca-bfc587fb7-7j5tj" Apr 16 18:18:02.374556 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:02.374499 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f2e73ca9-877e-4658-9ca0-029eaf8cbf6d-signing-key\") pod \"service-ca-bfc587fb7-7j5tj\" (UID: \"f2e73ca9-877e-4658-9ca0-029eaf8cbf6d\") " pod="openshift-service-ca/service-ca-bfc587fb7-7j5tj" Apr 16 18:18:02.374741 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:02.374631 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f2e73ca9-877e-4658-9ca0-029eaf8cbf6d-signing-cabundle\") pod \"service-ca-bfc587fb7-7j5tj\" (UID: \"f2e73ca9-877e-4658-9ca0-029eaf8cbf6d\") " pod="openshift-service-ca/service-ca-bfc587fb7-7j5tj" Apr 16 18:18:02.374741 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:02.374666 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrr67\" (UniqueName: \"kubernetes.io/projected/f2e73ca9-877e-4658-9ca0-029eaf8cbf6d-kube-api-access-jrr67\") pod \"service-ca-bfc587fb7-7j5tj\" (UID: \"f2e73ca9-877e-4658-9ca0-029eaf8cbf6d\") " pod="openshift-service-ca/service-ca-bfc587fb7-7j5tj" Apr 16 18:18:02.375242 ip-10-0-130-205 kubenswrapper[2573]: I0416 
18:18:02.375220 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f2e73ca9-877e-4658-9ca0-029eaf8cbf6d-signing-cabundle\") pod \"service-ca-bfc587fb7-7j5tj\" (UID: \"f2e73ca9-877e-4658-9ca0-029eaf8cbf6d\") " pod="openshift-service-ca/service-ca-bfc587fb7-7j5tj" Apr 16 18:18:02.377045 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:02.377027 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f2e73ca9-877e-4658-9ca0-029eaf8cbf6d-signing-key\") pod \"service-ca-bfc587fb7-7j5tj\" (UID: \"f2e73ca9-877e-4658-9ca0-029eaf8cbf6d\") " pod="openshift-service-ca/service-ca-bfc587fb7-7j5tj" Apr 16 18:18:02.387645 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:02.387622 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrr67\" (UniqueName: \"kubernetes.io/projected/f2e73ca9-877e-4658-9ca0-029eaf8cbf6d-kube-api-access-jrr67\") pod \"service-ca-bfc587fb7-7j5tj\" (UID: \"f2e73ca9-877e-4658-9ca0-029eaf8cbf6d\") " pod="openshift-service-ca/service-ca-bfc587fb7-7j5tj" Apr 16 18:18:02.529955 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:02.529875 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-7j5tj" Apr 16 18:18:02.665290 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:02.665263 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-7j5tj"] Apr 16 18:18:02.670869 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:18:02.670838 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2e73ca9_877e_4658_9ca0_029eaf8cbf6d.slice/crio-913ef7aaaa6a8abb84a96644e00574ab32efe347f749db573c8f1c86ecb56e3d WatchSource:0}: Error finding container 913ef7aaaa6a8abb84a96644e00574ab32efe347f749db573c8f1c86ecb56e3d: Status 404 returned error can't find the container with id 913ef7aaaa6a8abb84a96644e00574ab32efe347f749db573c8f1c86ecb56e3d Apr 16 18:18:03.614582 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:03.614548 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-7j5tj" event={"ID":"f2e73ca9-877e-4658-9ca0-029eaf8cbf6d","Type":"ContainerStarted","Data":"913ef7aaaa6a8abb84a96644e00574ab32efe347f749db573c8f1c86ecb56e3d"} Apr 16 18:18:04.617455 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:04.617410 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-7j5tj" event={"ID":"f2e73ca9-877e-4658-9ca0-029eaf8cbf6d","Type":"ContainerStarted","Data":"4993b8c3788bfe1019a3e55914d567e012501261c9f425a9389b843690744ad7"} Apr 16 18:18:04.634921 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:04.634752 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-7j5tj" podStartSLOduration=0.843299973 podStartE2EDuration="2.634733833s" podCreationTimestamp="2026-04-16 18:18:02 +0000 UTC" firstStartedPulling="2026-04-16 18:18:02.673057502 +0000 UTC m=+154.029199541" lastFinishedPulling="2026-04-16 18:18:04.464491355 +0000 UTC 
m=+155.820633401" observedRunningTime="2026-04-16 18:18:04.63411638 +0000 UTC m=+155.990258439" watchObservedRunningTime="2026-04-16 18:18:04.634733833 +0000 UTC m=+155.990875892" Apr 16 18:18:05.500040 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:18:05.499982 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-h9xxg" podUID="71df2187-c914-4b58-8d61-6fcaacaefd11" Apr 16 18:18:05.509228 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:18:05.509198 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" podUID="efc53255-caaf-4b68-9cd3-c6118907f500" Apr 16 18:18:05.533217 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:18:05.533183 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-bndvz" podUID="0913fe98-7bbc-41d3-9144-086892d07104" Apr 16 18:18:05.549399 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:18:05.549368 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-7vbzm" podUID="8d155535-fa59-4777-80cf-fdba34134958" Apr 16 18:18:05.619231 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:05.619204 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-h9xxg" Apr 16 18:18:05.619231 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:05.619208 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:18:05.619730 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:05.619205 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bndvz" Apr 16 18:18:07.237770 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:18:07.237726 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-j9gkk" podUID="3fd8f3ce-1a67-4a38-99ec-e368aea03088" Apr 16 18:18:10.454757 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:10.454694 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0913fe98-7bbc-41d3-9144-086892d07104-metrics-tls\") pod \"dns-default-bndvz\" (UID: \"0913fe98-7bbc-41d3-9144-086892d07104\") " pod="openshift-dns/dns-default-bndvz" Apr 16 18:18:10.455262 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:10.454779 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-h9xxg\" (UID: \"71df2187-c914-4b58-8d61-6fcaacaefd11\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-h9xxg" Apr 16 18:18:10.455262 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:10.454823 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:18:10.455262 ip-10-0-130-205 kubenswrapper[2573]: I0416 
18:18:10.454871 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert\") pod \"ingress-canary-7vbzm\" (UID: \"8d155535-fa59-4777-80cf-fdba34134958\") " pod="openshift-ingress-canary/ingress-canary-7vbzm" Apr 16 18:18:10.455262 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:18:10.454921 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:18:10.455262 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:18:10.454942 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:18:10.455262 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:18:10.454991 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert podName:8d155535-fa59-4777-80cf-fdba34134958 nodeName:}" failed. No retries permitted until 2026-04-16 18:20:12.454978092 +0000 UTC m=+283.811120127 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert") pod "ingress-canary-7vbzm" (UID: "8d155535-fa59-4777-80cf-fdba34134958") : secret "canary-serving-cert" not found Apr 16 18:18:10.455262 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:18:10.455003 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert podName:71df2187-c914-4b58-8d61-6fcaacaefd11 nodeName:}" failed. No retries permitted until 2026-04-16 18:20:12.45499753 +0000 UTC m=+283.811139564 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-h9xxg" (UID: "71df2187-c914-4b58-8d61-6fcaacaefd11") : secret "networking-console-plugin-cert" not found Apr 16 18:18:10.457233 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:10.457210 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0913fe98-7bbc-41d3-9144-086892d07104-metrics-tls\") pod \"dns-default-bndvz\" (UID: \"0913fe98-7bbc-41d3-9144-086892d07104\") " pod="openshift-dns/dns-default-bndvz" Apr 16 18:18:10.457384 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:10.457364 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls\") pod \"image-registry-7fff8cc5c4-dk27h\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") " pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:18:10.723266 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:10.723182 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-846cr\"" Apr 16 18:18:10.723266 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:10.723181 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fxg25\"" Apr 16 18:18:10.730909 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:10.730874 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:18:10.731070 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:10.730933 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-bndvz" Apr 16 18:18:10.868057 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:10.868026 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bndvz"] Apr 16 18:18:10.872541 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:18:10.872492 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0913fe98_7bbc_41d3_9144_086892d07104.slice/crio-22fd578e05f44d17b548f49ecdc93e13e8496fdeb59510dee5789be13c6db0ee WatchSource:0}: Error finding container 22fd578e05f44d17b548f49ecdc93e13e8496fdeb59510dee5789be13c6db0ee: Status 404 returned error can't find the container with id 22fd578e05f44d17b548f49ecdc93e13e8496fdeb59510dee5789be13c6db0ee Apr 16 18:18:10.888462 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:10.888436 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7fff8cc5c4-dk27h"] Apr 16 18:18:10.891887 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:18:10.891862 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefc53255_caaf_4b68_9cd3_c6118907f500.slice/crio-a42a9f74d9d540446140c08cab9d810b6092e61fbb1dcac5a5d0b7b4404c8d6a WatchSource:0}: Error finding container a42a9f74d9d540446140c08cab9d810b6092e61fbb1dcac5a5d0b7b4404c8d6a: Status 404 returned error can't find the container with id a42a9f74d9d540446140c08cab9d810b6092e61fbb1dcac5a5d0b7b4404c8d6a Apr 16 18:18:11.634370 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:11.634280 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bndvz" event={"ID":"0913fe98-7bbc-41d3-9144-086892d07104","Type":"ContainerStarted","Data":"22fd578e05f44d17b548f49ecdc93e13e8496fdeb59510dee5789be13c6db0ee"} Apr 16 18:18:11.635920 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:11.635888 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" event={"ID":"efc53255-caaf-4b68-9cd3-c6118907f500","Type":"ContainerStarted","Data":"ff50af3f65f53cdd00d79f1a40c3f4c6f97d699a102f9c6b7bd3017a6701fc43"} Apr 16 18:18:11.636063 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:11.635921 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" event={"ID":"efc53255-caaf-4b68-9cd3-c6118907f500","Type":"ContainerStarted","Data":"a42a9f74d9d540446140c08cab9d810b6092e61fbb1dcac5a5d0b7b4404c8d6a"} Apr 16 18:18:11.636063 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:11.636046 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:18:11.658069 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:11.658016 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" podStartSLOduration=162.657998788 podStartE2EDuration="2m42.657998788s" podCreationTimestamp="2026-04-16 18:15:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:18:11.65767616 +0000 UTC m=+163.013818219" watchObservedRunningTime="2026-04-16 18:18:11.657998788 +0000 UTC m=+163.014140845" Apr 16 18:18:12.640666 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:12.640621 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bndvz" event={"ID":"0913fe98-7bbc-41d3-9144-086892d07104","Type":"ContainerStarted","Data":"2a40a7fb07cfea91d1e9d1789233b5cd6feaaf6fd5ad28d5731df4663a7dbb1b"} Apr 16 18:18:12.640666 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:12.640671 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bndvz" 
event={"ID":"0913fe98-7bbc-41d3-9144-086892d07104","Type":"ContainerStarted","Data":"e1354364a80d8a7ef2e3ec927962fc6de0125a49fb80b3be92caab07b45b0c34"} Apr 16 18:18:12.658060 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:12.657994 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-bndvz" podStartSLOduration=129.303068698 podStartE2EDuration="2m10.657979384s" podCreationTimestamp="2026-04-16 18:16:02 +0000 UTC" firstStartedPulling="2026-04-16 18:18:10.874867434 +0000 UTC m=+162.231009469" lastFinishedPulling="2026-04-16 18:18:12.229778119 +0000 UTC m=+163.585920155" observedRunningTime="2026-04-16 18:18:12.657180984 +0000 UTC m=+164.013323046" watchObservedRunningTime="2026-04-16 18:18:12.657979384 +0000 UTC m=+164.014121440" Apr 16 18:18:13.643918 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:13.643888 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-bndvz" Apr 16 18:18:16.217599 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:16.217556 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7vbzm" Apr 16 18:18:18.024331 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:18.024243 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/eff34aa9-7480-480d-b76a-e58afdd3fc46-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4szbw\" (UID: \"eff34aa9-7480-480d-b76a-e58afdd3fc46\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4szbw" Apr 16 18:18:18.027245 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:18.027224 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/eff34aa9-7480-480d-b76a-e58afdd3fc46-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4szbw\" (UID: \"eff34aa9-7480-480d-b76a-e58afdd3fc46\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4szbw" Apr 16 18:18:18.125688 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:18.125654 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23253796-d542-44e0-93b4-6b1d65c09948-metrics-certs\") pod \"router-default-596bbb7b7d-nk24r\" (UID: \"23253796-d542-44e0-93b4-6b1d65c09948\") " pod="openshift-ingress/router-default-596bbb7b7d-nk24r" Apr 16 18:18:18.125858 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:18.125726 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23253796-d542-44e0-93b4-6b1d65c09948-service-ca-bundle\") pod \"router-default-596bbb7b7d-nk24r\" (UID: \"23253796-d542-44e0-93b4-6b1d65c09948\") " pod="openshift-ingress/router-default-596bbb7b7d-nk24r" Apr 16 18:18:18.126383 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:18.126319 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23253796-d542-44e0-93b4-6b1d65c09948-service-ca-bundle\") pod \"router-default-596bbb7b7d-nk24r\" (UID: \"23253796-d542-44e0-93b4-6b1d65c09948\") " pod="openshift-ingress/router-default-596bbb7b7d-nk24r" Apr 16 18:18:18.128074 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:18.128052 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23253796-d542-44e0-93b4-6b1d65c09948-metrics-certs\") pod \"router-default-596bbb7b7d-nk24r\" (UID: \"23253796-d542-44e0-93b4-6b1d65c09948\") " pod="openshift-ingress/router-default-596bbb7b7d-nk24r" Apr 16 18:18:18.269588 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:18.269499 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4szbw" Apr 16 18:18:18.379081 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:18.379049 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-596bbb7b7d-nk24r" Apr 16 18:18:18.397496 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:18.397421 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-4szbw"] Apr 16 18:18:18.402819 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:18:18.402782 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeff34aa9_7480_480d_b76a_e58afdd3fc46.slice/crio-50a569f058fa354fe8db4053e69d6ad388ba67ea0dbf54ae19274125c2f96bc2 WatchSource:0}: Error finding container 50a569f058fa354fe8db4053e69d6ad388ba67ea0dbf54ae19274125c2f96bc2: Status 404 returned error can't find the container with id 50a569f058fa354fe8db4053e69d6ad388ba67ea0dbf54ae19274125c2f96bc2 Apr 16 18:18:18.500240 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:18.500205 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-596bbb7b7d-nk24r"] Apr 16 18:18:18.503453 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:18:18.503422 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23253796_d542_44e0_93b4_6b1d65c09948.slice/crio-ee72116188a35d1a6da702b2882bb2804b39aa2d20a11cf8b1781f96bfd8e5ee WatchSource:0}: Error finding container ee72116188a35d1a6da702b2882bb2804b39aa2d20a11cf8b1781f96bfd8e5ee: Status 404 returned error can't find the container with id ee72116188a35d1a6da702b2882bb2804b39aa2d20a11cf8b1781f96bfd8e5ee Apr 16 18:18:18.657476 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:18.657437 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-596bbb7b7d-nk24r" event={"ID":"23253796-d542-44e0-93b4-6b1d65c09948","Type":"ContainerStarted","Data":"cf162b5678e7a4afb7695049c3b318c72d044c57d8d6684d1aa0cd0fca5284a0"} Apr 16 18:18:18.657476 ip-10-0-130-205 
kubenswrapper[2573]: I0416 18:18:18.657480 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-596bbb7b7d-nk24r" event={"ID":"23253796-d542-44e0-93b4-6b1d65c09948","Type":"ContainerStarted","Data":"ee72116188a35d1a6da702b2882bb2804b39aa2d20a11cf8b1781f96bfd8e5ee"} Apr 16 18:18:18.658440 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:18.658418 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4szbw" event={"ID":"eff34aa9-7480-480d-b76a-e58afdd3fc46","Type":"ContainerStarted","Data":"50a569f058fa354fe8db4053e69d6ad388ba67ea0dbf54ae19274125c2f96bc2"} Apr 16 18:18:18.680675 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:18.680468 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-596bbb7b7d-nk24r" podStartSLOduration=32.680452028 podStartE2EDuration="32.680452028s" podCreationTimestamp="2026-04-16 18:17:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:18:18.67970651 +0000 UTC m=+170.035848566" watchObservedRunningTime="2026-04-16 18:18:18.680452028 +0000 UTC m=+170.036594086" Apr 16 18:18:19.219289 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:19.219261 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9gkk" Apr 16 18:18:19.380247 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:19.380209 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-596bbb7b7d-nk24r" Apr 16 18:18:19.383296 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:19.383265 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-596bbb7b7d-nk24r" Apr 16 18:18:19.662598 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:19.662562 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-596bbb7b7d-nk24r" Apr 16 18:18:19.663979 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:19.663953 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-596bbb7b7d-nk24r" Apr 16 18:18:20.666913 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:20.666876 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4szbw" event={"ID":"eff34aa9-7480-480d-b76a-e58afdd3fc46","Type":"ContainerStarted","Data":"9f0b295472db2d33b17cdd9ac1c2a371ba2dfdca8c6376f53007a5bb5046c4ec"} Apr 16 18:18:20.683458 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:20.683402 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4szbw" podStartSLOduration=32.941549028 podStartE2EDuration="34.683386282s" podCreationTimestamp="2026-04-16 18:17:46 +0000 UTC" firstStartedPulling="2026-04-16 18:18:18.404698871 +0000 UTC m=+169.760840907" lastFinishedPulling="2026-04-16 18:18:20.146536121 +0000 UTC m=+171.502678161" observedRunningTime="2026-04-16 18:18:20.682203591 +0000 UTC m=+172.038345645" watchObservedRunningTime="2026-04-16 18:18:20.683386282 +0000 UTC m=+172.039528337" Apr 16 18:18:21.535478 
ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:21.535445 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-v76fn"] Apr 16 18:18:21.537579 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:21.537562 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-v76fn" Apr 16 18:18:21.540220 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:21.540193 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 18:18:21.540324 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:21.540229 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 18:18:21.540398 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:21.540195 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-czcrj\"" Apr 16 18:18:21.551921 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:21.551894 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/af10a82c-41b9-4f94-83b7-46e390179f35-crio-socket\") pod \"insights-runtime-extractor-v76fn\" (UID: \"af10a82c-41b9-4f94-83b7-46e390179f35\") " pod="openshift-insights/insights-runtime-extractor-v76fn" Apr 16 18:18:21.552042 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:21.551947 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/af10a82c-41b9-4f94-83b7-46e390179f35-data-volume\") pod \"insights-runtime-extractor-v76fn\" (UID: \"af10a82c-41b9-4f94-83b7-46e390179f35\") " pod="openshift-insights/insights-runtime-extractor-v76fn" Apr 16 18:18:21.552109 ip-10-0-130-205 kubenswrapper[2573]: I0416 
18:18:21.552042 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/af10a82c-41b9-4f94-83b7-46e390179f35-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v76fn\" (UID: \"af10a82c-41b9-4f94-83b7-46e390179f35\") " pod="openshift-insights/insights-runtime-extractor-v76fn" Apr 16 18:18:21.552109 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:21.552087 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wx6h\" (UniqueName: \"kubernetes.io/projected/af10a82c-41b9-4f94-83b7-46e390179f35-kube-api-access-2wx6h\") pod \"insights-runtime-extractor-v76fn\" (UID: \"af10a82c-41b9-4f94-83b7-46e390179f35\") " pod="openshift-insights/insights-runtime-extractor-v76fn" Apr 16 18:18:21.552208 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:21.552143 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/af10a82c-41b9-4f94-83b7-46e390179f35-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v76fn\" (UID: \"af10a82c-41b9-4f94-83b7-46e390179f35\") " pod="openshift-insights/insights-runtime-extractor-v76fn" Apr 16 18:18:21.553286 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:21.553264 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-v76fn"] Apr 16 18:18:21.653348 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:21.653317 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/af10a82c-41b9-4f94-83b7-46e390179f35-crio-socket\") pod \"insights-runtime-extractor-v76fn\" (UID: \"af10a82c-41b9-4f94-83b7-46e390179f35\") " pod="openshift-insights/insights-runtime-extractor-v76fn" Apr 16 18:18:21.653348 ip-10-0-130-205 kubenswrapper[2573]: I0416 
18:18:21.653353 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/af10a82c-41b9-4f94-83b7-46e390179f35-data-volume\") pod \"insights-runtime-extractor-v76fn\" (UID: \"af10a82c-41b9-4f94-83b7-46e390179f35\") " pod="openshift-insights/insights-runtime-extractor-v76fn" Apr 16 18:18:21.653600 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:21.653381 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/af10a82c-41b9-4f94-83b7-46e390179f35-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v76fn\" (UID: \"af10a82c-41b9-4f94-83b7-46e390179f35\") " pod="openshift-insights/insights-runtime-extractor-v76fn" Apr 16 18:18:21.653600 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:21.653402 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2wx6h\" (UniqueName: \"kubernetes.io/projected/af10a82c-41b9-4f94-83b7-46e390179f35-kube-api-access-2wx6h\") pod \"insights-runtime-extractor-v76fn\" (UID: \"af10a82c-41b9-4f94-83b7-46e390179f35\") " pod="openshift-insights/insights-runtime-extractor-v76fn" Apr 16 18:18:21.653600 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:21.653434 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/af10a82c-41b9-4f94-83b7-46e390179f35-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v76fn\" (UID: \"af10a82c-41b9-4f94-83b7-46e390179f35\") " pod="openshift-insights/insights-runtime-extractor-v76fn" Apr 16 18:18:21.653600 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:21.653439 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/af10a82c-41b9-4f94-83b7-46e390179f35-crio-socket\") pod \"insights-runtime-extractor-v76fn\" (UID: 
\"af10a82c-41b9-4f94-83b7-46e390179f35\") " pod="openshift-insights/insights-runtime-extractor-v76fn" Apr 16 18:18:21.653805 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:21.653784 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/af10a82c-41b9-4f94-83b7-46e390179f35-data-volume\") pod \"insights-runtime-extractor-v76fn\" (UID: \"af10a82c-41b9-4f94-83b7-46e390179f35\") " pod="openshift-insights/insights-runtime-extractor-v76fn" Apr 16 18:18:21.653958 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:21.653935 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/af10a82c-41b9-4f94-83b7-46e390179f35-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v76fn\" (UID: \"af10a82c-41b9-4f94-83b7-46e390179f35\") " pod="openshift-insights/insights-runtime-extractor-v76fn" Apr 16 18:18:21.655914 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:21.655890 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/af10a82c-41b9-4f94-83b7-46e390179f35-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v76fn\" (UID: \"af10a82c-41b9-4f94-83b7-46e390179f35\") " pod="openshift-insights/insights-runtime-extractor-v76fn" Apr 16 18:18:21.666998 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:21.666975 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wx6h\" (UniqueName: \"kubernetes.io/projected/af10a82c-41b9-4f94-83b7-46e390179f35-kube-api-access-2wx6h\") pod \"insights-runtime-extractor-v76fn\" (UID: \"af10a82c-41b9-4f94-83b7-46e390179f35\") " pod="openshift-insights/insights-runtime-extractor-v76fn" Apr 16 18:18:21.846672 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:21.846576 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-v76fn" Apr 16 18:18:21.984087 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:21.984052 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-v76fn"] Apr 16 18:18:21.986881 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:18:21.986847 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf10a82c_41b9_4f94_83b7_46e390179f35.slice/crio-d800570d499f684c72451e0225546169aadbc97ca709f2f766bd6d23c8515c29 WatchSource:0}: Error finding container d800570d499f684c72451e0225546169aadbc97ca709f2f766bd6d23c8515c29: Status 404 returned error can't find the container with id d800570d499f684c72451e0225546169aadbc97ca709f2f766bd6d23c8515c29 Apr 16 18:18:22.673044 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:22.673012 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v76fn" event={"ID":"af10a82c-41b9-4f94-83b7-46e390179f35","Type":"ContainerStarted","Data":"22f2bc965e22810b9e1f98a384ed5fe9512f5131966589c3fd2ff2acbea4fbba"} Apr 16 18:18:22.673044 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:22.673049 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v76fn" event={"ID":"af10a82c-41b9-4f94-83b7-46e390179f35","Type":"ContainerStarted","Data":"498cb870d8ef3d62f9b408d1ff54a001f3ab7f8a044310ab2de57e8677952674"} Apr 16 18:18:22.673427 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:22.673060 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v76fn" event={"ID":"af10a82c-41b9-4f94-83b7-46e390179f35","Type":"ContainerStarted","Data":"d800570d499f684c72451e0225546169aadbc97ca709f2f766bd6d23c8515c29"} Apr 16 18:18:23.649088 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:23.649057 2573 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-bndvz"
Apr 16 18:18:24.680782 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:24.680703 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v76fn" event={"ID":"af10a82c-41b9-4f94-83b7-46e390179f35","Type":"ContainerStarted","Data":"aa9f365ad4947bdab400d8d9dda25789307202764e61af502b3c7f111c4865e1"}
Apr 16 18:18:24.703609 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:24.703552 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-v76fn" podStartSLOduration=1.9717020920000001 podStartE2EDuration="3.703509717s" podCreationTimestamp="2026-04-16 18:18:21 +0000 UTC" firstStartedPulling="2026-04-16 18:18:22.035399 +0000 UTC m=+173.391541035" lastFinishedPulling="2026-04-16 18:18:23.767206626 +0000 UTC m=+175.123348660" observedRunningTime="2026-04-16 18:18:24.701904723 +0000 UTC m=+176.058046781" watchObservedRunningTime="2026-04-16 18:18:24.703509717 +0000 UTC m=+176.059651774"
Apr 16 18:18:24.715221 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:24.715195 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-45887"]
Apr 16 18:18:24.717347 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:24.717331 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-45887"
Apr 16 18:18:24.719466 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:24.719440 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 16 18:18:24.719863 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:24.719843 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 16 18:18:24.719985 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:24.719961 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 18:18:24.720135 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:24.720116 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-xxjln\""
Apr 16 18:18:24.728575 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:24.728553 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-45887"]
Apr 16 18:18:24.779743 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:24.779703 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb136e76-6bdc-4e39-b76b-d994ee40f867-metrics-client-ca\") pod \"prometheus-operator-78f957474d-45887\" (UID: \"cb136e76-6bdc-4e39-b76b-d994ee40f867\") " pod="openshift-monitoring/prometheus-operator-78f957474d-45887"
Apr 16 18:18:24.779743 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:24.779742 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tn8d\" (UniqueName: \"kubernetes.io/projected/cb136e76-6bdc-4e39-b76b-d994ee40f867-kube-api-access-9tn8d\") pod \"prometheus-operator-78f957474d-45887\" (UID: \"cb136e76-6bdc-4e39-b76b-d994ee40f867\") " pod="openshift-monitoring/prometheus-operator-78f957474d-45887"
Apr 16 18:18:24.780007 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:24.779846 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cb136e76-6bdc-4e39-b76b-d994ee40f867-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-45887\" (UID: \"cb136e76-6bdc-4e39-b76b-d994ee40f867\") " pod="openshift-monitoring/prometheus-operator-78f957474d-45887"
Apr 16 18:18:24.780007 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:24.779895 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/cb136e76-6bdc-4e39-b76b-d994ee40f867-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-45887\" (UID: \"cb136e76-6bdc-4e39-b76b-d994ee40f867\") " pod="openshift-monitoring/prometheus-operator-78f957474d-45887"
Apr 16 18:18:24.881156 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:24.881110 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb136e76-6bdc-4e39-b76b-d994ee40f867-metrics-client-ca\") pod \"prometheus-operator-78f957474d-45887\" (UID: \"cb136e76-6bdc-4e39-b76b-d994ee40f867\") " pod="openshift-monitoring/prometheus-operator-78f957474d-45887"
Apr 16 18:18:24.881156 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:24.881151 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9tn8d\" (UniqueName: \"kubernetes.io/projected/cb136e76-6bdc-4e39-b76b-d994ee40f867-kube-api-access-9tn8d\") pod \"prometheus-operator-78f957474d-45887\" (UID: \"cb136e76-6bdc-4e39-b76b-d994ee40f867\") " pod="openshift-monitoring/prometheus-operator-78f957474d-45887"
Apr 16 18:18:24.881334 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:24.881191 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cb136e76-6bdc-4e39-b76b-d994ee40f867-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-45887\" (UID: \"cb136e76-6bdc-4e39-b76b-d994ee40f867\") " pod="openshift-monitoring/prometheus-operator-78f957474d-45887"
Apr 16 18:18:24.881334 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:24.881223 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/cb136e76-6bdc-4e39-b76b-d994ee40f867-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-45887\" (UID: \"cb136e76-6bdc-4e39-b76b-d994ee40f867\") " pod="openshift-monitoring/prometheus-operator-78f957474d-45887"
Apr 16 18:18:24.881861 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:24.881838 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb136e76-6bdc-4e39-b76b-d994ee40f867-metrics-client-ca\") pod \"prometheus-operator-78f957474d-45887\" (UID: \"cb136e76-6bdc-4e39-b76b-d994ee40f867\") " pod="openshift-monitoring/prometheus-operator-78f957474d-45887"
Apr 16 18:18:24.883825 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:24.883801 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cb136e76-6bdc-4e39-b76b-d994ee40f867-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-45887\" (UID: \"cb136e76-6bdc-4e39-b76b-d994ee40f867\") " pod="openshift-monitoring/prometheus-operator-78f957474d-45887"
Apr 16 18:18:24.883922 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:24.883870 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/cb136e76-6bdc-4e39-b76b-d994ee40f867-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-45887\" (UID: \"cb136e76-6bdc-4e39-b76b-d994ee40f867\") " pod="openshift-monitoring/prometheus-operator-78f957474d-45887"
Apr 16 18:18:24.890855 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:24.890833 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tn8d\" (UniqueName: \"kubernetes.io/projected/cb136e76-6bdc-4e39-b76b-d994ee40f867-kube-api-access-9tn8d\") pod \"prometheus-operator-78f957474d-45887\" (UID: \"cb136e76-6bdc-4e39-b76b-d994ee40f867\") " pod="openshift-monitoring/prometheus-operator-78f957474d-45887"
Apr 16 18:18:25.026678 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:25.026581 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-45887"
Apr 16 18:18:25.146952 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:25.146918 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-45887"]
Apr 16 18:18:25.149818 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:18:25.149789 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb136e76_6bdc_4e39_b76b_d994ee40f867.slice/crio-a2297f87b053249c661d1671816091662a1fdfd09d78d689fdf8d734480d01a9 WatchSource:0}: Error finding container a2297f87b053249c661d1671816091662a1fdfd09d78d689fdf8d734480d01a9: Status 404 returned error can't find the container with id a2297f87b053249c661d1671816091662a1fdfd09d78d689fdf8d734480d01a9
Apr 16 18:18:25.684707 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:25.684667 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-45887" event={"ID":"cb136e76-6bdc-4e39-b76b-d994ee40f867","Type":"ContainerStarted","Data":"a2297f87b053249c661d1671816091662a1fdfd09d78d689fdf8d734480d01a9"}
Apr 16 18:18:26.690397 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:26.690355 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-45887" event={"ID":"cb136e76-6bdc-4e39-b76b-d994ee40f867","Type":"ContainerStarted","Data":"ab16d1d506870f347f8608cc678c321930d3bdd8b836dae8098b7a313823fb91"}
Apr 16 18:18:26.690397 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:26.690401 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-45887" event={"ID":"cb136e76-6bdc-4e39-b76b-d994ee40f867","Type":"ContainerStarted","Data":"307647eb7fbb40cc0546d38fede7f12bd89626feea01eb8326a90b86dd6c74c8"}
Apr 16 18:18:26.706984 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:26.706835 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-45887" podStartSLOduration=1.596663825 podStartE2EDuration="2.70681927s" podCreationTimestamp="2026-04-16 18:18:24 +0000 UTC" firstStartedPulling="2026-04-16 18:18:25.151659159 +0000 UTC m=+176.507801198" lastFinishedPulling="2026-04-16 18:18:26.261814608 +0000 UTC m=+177.617956643" observedRunningTime="2026-04-16 18:18:26.706412798 +0000 UTC m=+178.062554855" watchObservedRunningTime="2026-04-16 18:18:26.70681927 +0000 UTC m=+178.062961328"
Apr 16 18:18:29.114135 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.114091 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-89djw"]
Apr 16 18:18:29.117934 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.117908 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-89djw"
Apr 16 18:18:29.125958 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.125843 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 16 18:18:29.131332 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.126680 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-bcw7g\""
Apr 16 18:18:29.131332 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.126949 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 16 18:18:29.148977 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.148945 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-89djw"]
Apr 16 18:18:29.161534 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.161480 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-4lmm7"]
Apr 16 18:18:29.165100 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.165079 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-fmlhz"]
Apr 16 18:18:29.165272 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.165256 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4lmm7"
Apr 16 18:18:29.167413 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.167392 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 18:18:29.167884 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.167868 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 18:18:29.168293 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.168272 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-fmlhz"
Apr 16 18:18:29.168385 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.168314 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 18:18:29.168972 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.168945 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-g6plr\""
Apr 16 18:18:29.172317 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.172295 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 16 18:18:29.172461 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.172445 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 16 18:18:29.173019 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.173000 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 16 18:18:29.173314 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.173300 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-tzsvh\""
Apr 16 18:18:29.186113 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.186092 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-fmlhz"]
Apr 16 18:18:29.215191 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.215145 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ae6e84d4-e669-49d1-8b06-5404b460d95f-node-exporter-accelerators-collector-config\") pod \"node-exporter-4lmm7\" (UID: \"ae6e84d4-e669-49d1-8b06-5404b460d95f\") " pod="openshift-monitoring/node-exporter-4lmm7"
Apr 16 18:18:29.215373 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.215198 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ae6e84d4-e669-49d1-8b06-5404b460d95f-node-exporter-wtmp\") pod \"node-exporter-4lmm7\" (UID: \"ae6e84d4-e669-49d1-8b06-5404b460d95f\") " pod="openshift-monitoring/node-exporter-4lmm7"
Apr 16 18:18:29.215373 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.215268 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae6e84d4-e669-49d1-8b06-5404b460d95f-metrics-client-ca\") pod \"node-exporter-4lmm7\" (UID: \"ae6e84d4-e669-49d1-8b06-5404b460d95f\") " pod="openshift-monitoring/node-exporter-4lmm7"
Apr 16 18:18:29.215373 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.215316 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/44ea6a5b-fa62-492e-8885-5836fde6aae9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-fmlhz\" (UID: \"44ea6a5b-fa62-492e-8885-5836fde6aae9\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fmlhz"
Apr 16 18:18:29.215497 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.215374 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/44ea6a5b-fa62-492e-8885-5836fde6aae9-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-fmlhz\" (UID: \"44ea6a5b-fa62-492e-8885-5836fde6aae9\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fmlhz"
Apr 16 18:18:29.215497 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.215394 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ae6e84d4-e669-49d1-8b06-5404b460d95f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4lmm7\" (UID: \"ae6e84d4-e669-49d1-8b06-5404b460d95f\") " pod="openshift-monitoring/node-exporter-4lmm7"
Apr 16 18:18:29.215497 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.215410 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ae6e84d4-e669-49d1-8b06-5404b460d95f-root\") pod \"node-exporter-4lmm7\" (UID: \"ae6e84d4-e669-49d1-8b06-5404b460d95f\") " pod="openshift-monitoring/node-exporter-4lmm7"
Apr 16 18:18:29.215497 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.215447 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsvfd\" (UniqueName: \"kubernetes.io/projected/44ea6a5b-fa62-492e-8885-5836fde6aae9-kube-api-access-rsvfd\") pod \"kube-state-metrics-7479c89684-fmlhz\" (UID: \"44ea6a5b-fa62-492e-8885-5836fde6aae9\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fmlhz"
Apr 16 18:18:29.215497 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.215490 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/44ea6a5b-fa62-492e-8885-5836fde6aae9-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-fmlhz\" (UID: \"44ea6a5b-fa62-492e-8885-5836fde6aae9\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fmlhz"
Apr 16 18:18:29.215735 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.215545 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/44ea6a5b-fa62-492e-8885-5836fde6aae9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-fmlhz\" (UID: \"44ea6a5b-fa62-492e-8885-5836fde6aae9\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fmlhz"
Apr 16 18:18:29.215735 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.215574 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a10b83fc-c6ce-49cc-9a26-8aa7bc948c27-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-89djw\" (UID: \"a10b83fc-c6ce-49cc-9a26-8aa7bc948c27\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-89djw"
Apr 16 18:18:29.215735 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.215601 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ae6e84d4-e669-49d1-8b06-5404b460d95f-node-exporter-textfile\") pod \"node-exporter-4lmm7\" (UID: \"ae6e84d4-e669-49d1-8b06-5404b460d95f\") " pod="openshift-monitoring/node-exporter-4lmm7"
Apr 16 18:18:29.215735 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.215630 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a10b83fc-c6ce-49cc-9a26-8aa7bc948c27-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-89djw\" (UID: \"a10b83fc-c6ce-49cc-9a26-8aa7bc948c27\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-89djw"
Apr 16 18:18:29.215735 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.215654 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ae6e84d4-e669-49d1-8b06-5404b460d95f-sys\") pod \"node-exporter-4lmm7\" (UID: \"ae6e84d4-e669-49d1-8b06-5404b460d95f\") " pod="openshift-monitoring/node-exporter-4lmm7"
Apr 16 18:18:29.215735 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.215682 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a10b83fc-c6ce-49cc-9a26-8aa7bc948c27-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-89djw\" (UID: \"a10b83fc-c6ce-49cc-9a26-8aa7bc948c27\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-89djw"
Apr 16 18:18:29.215735 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.215708 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ae6e84d4-e669-49d1-8b06-5404b460d95f-node-exporter-tls\") pod \"node-exporter-4lmm7\" (UID: \"ae6e84d4-e669-49d1-8b06-5404b460d95f\") " pod="openshift-monitoring/node-exporter-4lmm7"
Apr 16 18:18:29.215992 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.215737 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln2gk\" (UniqueName: \"kubernetes.io/projected/a10b83fc-c6ce-49cc-9a26-8aa7bc948c27-kube-api-access-ln2gk\") pod \"openshift-state-metrics-5669946b84-89djw\" (UID: \"a10b83fc-c6ce-49cc-9a26-8aa7bc948c27\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-89djw"
Apr 16 18:18:29.215992 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.215762 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/44ea6a5b-fa62-492e-8885-5836fde6aae9-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-fmlhz\" (UID: \"44ea6a5b-fa62-492e-8885-5836fde6aae9\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fmlhz"
Apr 16 18:18:29.215992 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.215786 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmm2b\" (UniqueName: \"kubernetes.io/projected/ae6e84d4-e669-49d1-8b06-5404b460d95f-kube-api-access-lmm2b\") pod \"node-exporter-4lmm7\" (UID: \"ae6e84d4-e669-49d1-8b06-5404b460d95f\") " pod="openshift-monitoring/node-exporter-4lmm7"
Apr 16 18:18:29.316857 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.316805 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ae6e84d4-e669-49d1-8b06-5404b460d95f-sys\") pod \"node-exporter-4lmm7\" (UID: \"ae6e84d4-e669-49d1-8b06-5404b460d95f\") " pod="openshift-monitoring/node-exporter-4lmm7"
Apr 16 18:18:29.316857 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.316856 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a10b83fc-c6ce-49cc-9a26-8aa7bc948c27-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-89djw\" (UID: \"a10b83fc-c6ce-49cc-9a26-8aa7bc948c27\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-89djw"
Apr 16 18:18:29.317172 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.316876 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ae6e84d4-e669-49d1-8b06-5404b460d95f-node-exporter-tls\") pod \"node-exporter-4lmm7\" (UID: \"ae6e84d4-e669-49d1-8b06-5404b460d95f\") " pod="openshift-monitoring/node-exporter-4lmm7"
Apr 16 18:18:29.317172 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.316901 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ln2gk\" (UniqueName: \"kubernetes.io/projected/a10b83fc-c6ce-49cc-9a26-8aa7bc948c27-kube-api-access-ln2gk\") pod \"openshift-state-metrics-5669946b84-89djw\" (UID: \"a10b83fc-c6ce-49cc-9a26-8aa7bc948c27\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-89djw"
Apr 16 18:18:29.317172 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.316910 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ae6e84d4-e669-49d1-8b06-5404b460d95f-sys\") pod \"node-exporter-4lmm7\" (UID: \"ae6e84d4-e669-49d1-8b06-5404b460d95f\") " pod="openshift-monitoring/node-exporter-4lmm7"
Apr 16 18:18:29.317172 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.316923 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/44ea6a5b-fa62-492e-8885-5836fde6aae9-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-fmlhz\" (UID: \"44ea6a5b-fa62-492e-8885-5836fde6aae9\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fmlhz"
Apr 16 18:18:29.317172 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.316942 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lmm2b\" (UniqueName: \"kubernetes.io/projected/ae6e84d4-e669-49d1-8b06-5404b460d95f-kube-api-access-lmm2b\") pod \"node-exporter-4lmm7\" (UID: \"ae6e84d4-e669-49d1-8b06-5404b460d95f\") " pod="openshift-monitoring/node-exporter-4lmm7"
Apr 16 18:18:29.317172 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.317015 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ae6e84d4-e669-49d1-8b06-5404b460d95f-node-exporter-accelerators-collector-config\") pod \"node-exporter-4lmm7\" (UID: \"ae6e84d4-e669-49d1-8b06-5404b460d95f\") " pod="openshift-monitoring/node-exporter-4lmm7"
Apr 16 18:18:29.317172 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.317049 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ae6e84d4-e669-49d1-8b06-5404b460d95f-node-exporter-wtmp\") pod \"node-exporter-4lmm7\" (UID: \"ae6e84d4-e669-49d1-8b06-5404b460d95f\") " pod="openshift-monitoring/node-exporter-4lmm7"
Apr 16 18:18:29.317172 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.317083 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae6e84d4-e669-49d1-8b06-5404b460d95f-metrics-client-ca\") pod \"node-exporter-4lmm7\" (UID: \"ae6e84d4-e669-49d1-8b06-5404b460d95f\") " pod="openshift-monitoring/node-exporter-4lmm7"
Apr 16 18:18:29.317172 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.317131 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/44ea6a5b-fa62-492e-8885-5836fde6aae9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-fmlhz\" (UID: \"44ea6a5b-fa62-492e-8885-5836fde6aae9\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fmlhz"
Apr 16 18:18:29.317630 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.317179 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/44ea6a5b-fa62-492e-8885-5836fde6aae9-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-fmlhz\" (UID: \"44ea6a5b-fa62-492e-8885-5836fde6aae9\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fmlhz"
Apr 16 18:18:29.317630 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.317205 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ae6e84d4-e669-49d1-8b06-5404b460d95f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4lmm7\" (UID: \"ae6e84d4-e669-49d1-8b06-5404b460d95f\") " pod="openshift-monitoring/node-exporter-4lmm7"
Apr 16 18:18:29.317630 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.317231 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ae6e84d4-e669-49d1-8b06-5404b460d95f-root\") pod \"node-exporter-4lmm7\" (UID: \"ae6e84d4-e669-49d1-8b06-5404b460d95f\") " pod="openshift-monitoring/node-exporter-4lmm7"
Apr 16 18:18:29.317630 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.317243 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ae6e84d4-e669-49d1-8b06-5404b460d95f-node-exporter-wtmp\") pod \"node-exporter-4lmm7\" (UID: \"ae6e84d4-e669-49d1-8b06-5404b460d95f\") " pod="openshift-monitoring/node-exporter-4lmm7"
Apr 16 18:18:29.317630 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.317288 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rsvfd\" (UniqueName: \"kubernetes.io/projected/44ea6a5b-fa62-492e-8885-5836fde6aae9-kube-api-access-rsvfd\") pod \"kube-state-metrics-7479c89684-fmlhz\" (UID: \"44ea6a5b-fa62-492e-8885-5836fde6aae9\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fmlhz"
Apr 16 18:18:29.317630 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.317297 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ae6e84d4-e669-49d1-8b06-5404b460d95f-root\") pod \"node-exporter-4lmm7\" (UID: \"ae6e84d4-e669-49d1-8b06-5404b460d95f\") " pod="openshift-monitoring/node-exporter-4lmm7"
Apr 16 18:18:29.317630 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.317325 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/44ea6a5b-fa62-492e-8885-5836fde6aae9-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-fmlhz\" (UID: \"44ea6a5b-fa62-492e-8885-5836fde6aae9\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fmlhz"
Apr 16 18:18:29.317630 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.317353 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/44ea6a5b-fa62-492e-8885-5836fde6aae9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-fmlhz\" (UID: \"44ea6a5b-fa62-492e-8885-5836fde6aae9\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fmlhz"
Apr 16 18:18:29.317630 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.317382 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a10b83fc-c6ce-49cc-9a26-8aa7bc948c27-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-89djw\" (UID: \"a10b83fc-c6ce-49cc-9a26-8aa7bc948c27\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-89djw"
Apr 16 18:18:29.317630 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.317414 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ae6e84d4-e669-49d1-8b06-5404b460d95f-node-exporter-textfile\") pod \"node-exporter-4lmm7\" (UID: \"ae6e84d4-e669-49d1-8b06-5404b460d95f\") " pod="openshift-monitoring/node-exporter-4lmm7"
Apr 16 18:18:29.317630 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.317442 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a10b83fc-c6ce-49cc-9a26-8aa7bc948c27-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-89djw\" (UID: \"a10b83fc-c6ce-49cc-9a26-8aa7bc948c27\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-89djw"
Apr 16 18:18:29.317630 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.317551 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/44ea6a5b-fa62-492e-8885-5836fde6aae9-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-fmlhz\" (UID: \"44ea6a5b-fa62-492e-8885-5836fde6aae9\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fmlhz"
Apr 16 18:18:29.318177 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.317952 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ae6e84d4-e669-49d1-8b06-5404b460d95f-node-exporter-accelerators-collector-config\") pod \"node-exporter-4lmm7\" (UID: \"ae6e84d4-e669-49d1-8b06-5404b460d95f\") " pod="openshift-monitoring/node-exporter-4lmm7"
Apr 16 18:18:29.318229 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.318190 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/44ea6a5b-fa62-492e-8885-5836fde6aae9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-fmlhz\" (UID: \"44ea6a5b-fa62-492e-8885-5836fde6aae9\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fmlhz"
Apr 16 18:18:29.318655 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.318634 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ae6e84d4-e669-49d1-8b06-5404b460d95f-node-exporter-textfile\") pod \"node-exporter-4lmm7\" (UID: \"ae6e84d4-e669-49d1-8b06-5404b460d95f\") " pod="openshift-monitoring/node-exporter-4lmm7"
Apr 16 18:18:29.318788 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.318647 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a10b83fc-c6ce-49cc-9a26-8aa7bc948c27-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-89djw\" (UID: \"a10b83fc-c6ce-49cc-9a26-8aa7bc948c27\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-89djw"
Apr 16 18:18:29.318865 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.318686 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/44ea6a5b-fa62-492e-8885-5836fde6aae9-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-fmlhz\" (UID: \"44ea6a5b-fa62-492e-8885-5836fde6aae9\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fmlhz"
Apr 16 18:18:29.318957 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.318907 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae6e84d4-e669-49d1-8b06-5404b460d95f-metrics-client-ca\") pod \"node-exporter-4lmm7\" (UID: \"ae6e84d4-e669-49d1-8b06-5404b460d95f\") " pod="openshift-monitoring/node-exporter-4lmm7"
Apr 16 18:18:29.320749 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.320726 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a10b83fc-c6ce-49cc-9a26-8aa7bc948c27-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-89djw\" (UID: \"a10b83fc-c6ce-49cc-9a26-8aa7bc948c27\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-89djw"
Apr 16 18:18:29.321104 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.321079 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ae6e84d4-e669-49d1-8b06-5404b460d95f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4lmm7\" (UID: \"ae6e84d4-e669-49d1-8b06-5404b460d95f\") " pod="openshift-monitoring/node-exporter-4lmm7"
Apr 16 18:18:29.321225 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.321201 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/44ea6a5b-fa62-492e-8885-5836fde6aae9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-fmlhz\" (UID: \"44ea6a5b-fa62-492e-8885-5836fde6aae9\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fmlhz"
Apr 16 18:18:29.321285 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.321252 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a10b83fc-c6ce-49cc-9a26-8aa7bc948c27-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-89djw\" (UID: \"a10b83fc-c6ce-49cc-9a26-8aa7bc948c27\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-89djw"
Apr 16 18:18:29.321285 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.321260 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/44ea6a5b-fa62-492e-8885-5836fde6aae9-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-fmlhz\" (UID: \"44ea6a5b-fa62-492e-8885-5836fde6aae9\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fmlhz"
Apr 16 18:18:29.321386 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.321366 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ae6e84d4-e669-49d1-8b06-5404b460d95f-node-exporter-tls\") pod \"node-exporter-4lmm7\" (UID: \"ae6e84d4-e669-49d1-8b06-5404b460d95f\") " pod="openshift-monitoring/node-exporter-4lmm7"
Apr 16 18:18:29.324746 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.324726 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln2gk\" (UniqueName: \"kubernetes.io/projected/a10b83fc-c6ce-49cc-9a26-8aa7bc948c27-kube-api-access-ln2gk\") pod \"openshift-state-metrics-5669946b84-89djw\" (UID: \"a10b83fc-c6ce-49cc-9a26-8aa7bc948c27\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-89djw"
Apr 16 18:18:29.325816 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.325798 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsvfd\" (UniqueName: \"kubernetes.io/projected/44ea6a5b-fa62-492e-8885-5836fde6aae9-kube-api-access-rsvfd\") pod \"kube-state-metrics-7479c89684-fmlhz\" (UID: \"44ea6a5b-fa62-492e-8885-5836fde6aae9\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-fmlhz"
Apr 16 18:18:29.326142 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.326127 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmm2b\" (UniqueName: \"kubernetes.io/projected/ae6e84d4-e669-49d1-8b06-5404b460d95f-kube-api-access-lmm2b\") pod \"node-exporter-4lmm7\" (UID: \"ae6e84d4-e669-49d1-8b06-5404b460d95f\") " pod="openshift-monitoring/node-exporter-4lmm7"
Apr 16 18:18:29.435567 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.435512 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-89djw" Apr 16 18:18:29.477944 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.477913 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4lmm7" Apr 16 18:18:29.484086 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.483714 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-fmlhz" Apr 16 18:18:29.492791 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:18:29.492751 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae6e84d4_e669_49d1_8b06_5404b460d95f.slice/crio-bfde0350802366ec6896724523c542fa2ee04871e1d6b88553aa22ea41446716 WatchSource:0}: Error finding container bfde0350802366ec6896724523c542fa2ee04871e1d6b88553aa22ea41446716: Status 404 returned error can't find the container with id bfde0350802366ec6896724523c542fa2ee04871e1d6b88553aa22ea41446716 Apr 16 18:18:29.575502 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.575454 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-89djw"] Apr 16 18:18:29.580795 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:18:29.580765 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda10b83fc_c6ce_49cc_9a26_8aa7bc948c27.slice/crio-22d500529e0780213680133f996fae7fd54c51cc822ead17684d9273f987c506 WatchSource:0}: Error finding container 22d500529e0780213680133f996fae7fd54c51cc822ead17684d9273f987c506: Status 404 returned error can't find the container with id 22d500529e0780213680133f996fae7fd54c51cc822ead17684d9273f987c506 Apr 16 18:18:29.630977 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.630945 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/kube-state-metrics-7479c89684-fmlhz"] Apr 16 18:18:29.633786 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:18:29.633756 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44ea6a5b_fa62_492e_8885_5836fde6aae9.slice/crio-e7b599ade6d9bc0a168b789dd8bfb8f06bdb9e2700810c3aca6ef215f052e8eb WatchSource:0}: Error finding container e7b599ade6d9bc0a168b789dd8bfb8f06bdb9e2700810c3aca6ef215f052e8eb: Status 404 returned error can't find the container with id e7b599ade6d9bc0a168b789dd8bfb8f06bdb9e2700810c3aca6ef215f052e8eb Apr 16 18:18:29.699597 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.699553 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4lmm7" event={"ID":"ae6e84d4-e669-49d1-8b06-5404b460d95f","Type":"ContainerStarted","Data":"bfde0350802366ec6896724523c542fa2ee04871e1d6b88553aa22ea41446716"} Apr 16 18:18:29.700702 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.700676 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-fmlhz" event={"ID":"44ea6a5b-fa62-492e-8885-5836fde6aae9","Type":"ContainerStarted","Data":"e7b599ade6d9bc0a168b789dd8bfb8f06bdb9e2700810c3aca6ef215f052e8eb"} Apr 16 18:18:29.702652 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.702624 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-89djw" event={"ID":"a10b83fc-c6ce-49cc-9a26-8aa7bc948c27","Type":"ContainerStarted","Data":"d720cdb1ebbe0dfee6b481fb8fc737a9e06210116a90d946f0f6e7fde44fb177"} Apr 16 18:18:29.702891 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:29.702659 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-89djw" 
event={"ID":"a10b83fc-c6ce-49cc-9a26-8aa7bc948c27","Type":"ContainerStarted","Data":"22d500529e0780213680133f996fae7fd54c51cc822ead17684d9273f987c506"} Apr 16 18:18:30.707615 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:30.707491 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4lmm7" event={"ID":"ae6e84d4-e669-49d1-8b06-5404b460d95f","Type":"ContainerStarted","Data":"7eea3953fb6f35c498019622d7f6fb9606c1b7238e381cfc66ebc660ece4c697"} Apr 16 18:18:30.709654 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:30.709623 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-89djw" event={"ID":"a10b83fc-c6ce-49cc-9a26-8aa7bc948c27","Type":"ContainerStarted","Data":"b33f06963c562ff1ed85c03911dbcafdb20bd231bf674b7ee349cd0de42c1568"} Apr 16 18:18:30.736835 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:30.736728 2573 patch_prober.go:28] interesting pod/image-registry-7fff8cc5c4-dk27h container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 18:18:30.736835 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:30.736807 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" podUID="efc53255-caaf-4b68-9cd3-c6118907f500" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:18:31.134069 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.134039 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6d9f96dc59-snn89"] Apr 16 18:18:31.139580 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.139560 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" Apr 16 18:18:31.141997 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.141975 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 18:18:31.141997 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.141991 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 18:18:31.142166 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.142022 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-dhim4h1fmkps4\"" Apr 16 18:18:31.142338 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.142323 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 18:18:31.142390 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.142339 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 18:18:31.142612 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.142594 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 18:18:31.142612 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.142610 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-7l9vh\"" Apr 16 18:18:31.151968 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.151943 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6d9f96dc59-snn89"] Apr 16 18:18:31.236608 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.236578 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8hqk\" (UniqueName: \"kubernetes.io/projected/fd636b06-9e7a-4f60-ad21-00cd7e60a593-kube-api-access-m8hqk\") pod \"thanos-querier-6d9f96dc59-snn89\" (UID: \"fd636b06-9e7a-4f60-ad21-00cd7e60a593\") " pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" Apr 16 18:18:31.236727 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.236636 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fd636b06-9e7a-4f60-ad21-00cd7e60a593-metrics-client-ca\") pod \"thanos-querier-6d9f96dc59-snn89\" (UID: \"fd636b06-9e7a-4f60-ad21-00cd7e60a593\") " pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" Apr 16 18:18:31.236727 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.236699 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fd636b06-9e7a-4f60-ad21-00cd7e60a593-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6d9f96dc59-snn89\" (UID: \"fd636b06-9e7a-4f60-ad21-00cd7e60a593\") " pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" Apr 16 18:18:31.236834 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.236735 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fd636b06-9e7a-4f60-ad21-00cd7e60a593-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6d9f96dc59-snn89\" (UID: \"fd636b06-9e7a-4f60-ad21-00cd7e60a593\") " pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" Apr 16 18:18:31.236834 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.236813 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/fd636b06-9e7a-4f60-ad21-00cd7e60a593-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6d9f96dc59-snn89\" (UID: \"fd636b06-9e7a-4f60-ad21-00cd7e60a593\") " pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" Apr 16 18:18:31.236934 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.236853 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/fd636b06-9e7a-4f60-ad21-00cd7e60a593-secret-thanos-querier-tls\") pod \"thanos-querier-6d9f96dc59-snn89\" (UID: \"fd636b06-9e7a-4f60-ad21-00cd7e60a593\") " pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" Apr 16 18:18:31.236934 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.236879 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/fd636b06-9e7a-4f60-ad21-00cd7e60a593-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6d9f96dc59-snn89\" (UID: \"fd636b06-9e7a-4f60-ad21-00cd7e60a593\") " pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" Apr 16 18:18:31.236934 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.236911 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fd636b06-9e7a-4f60-ad21-00cd7e60a593-secret-grpc-tls\") pod \"thanos-querier-6d9f96dc59-snn89\" (UID: \"fd636b06-9e7a-4f60-ad21-00cd7e60a593\") " pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" Apr 16 18:18:31.337646 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.337555 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/fd636b06-9e7a-4f60-ad21-00cd7e60a593-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6d9f96dc59-snn89\" (UID: \"fd636b06-9e7a-4f60-ad21-00cd7e60a593\") " pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" Apr 16 18:18:31.337646 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.337629 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/fd636b06-9e7a-4f60-ad21-00cd7e60a593-secret-thanos-querier-tls\") pod \"thanos-querier-6d9f96dc59-snn89\" (UID: \"fd636b06-9e7a-4f60-ad21-00cd7e60a593\") " pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" Apr 16 18:18:31.337954 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.337661 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/fd636b06-9e7a-4f60-ad21-00cd7e60a593-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6d9f96dc59-snn89\" (UID: \"fd636b06-9e7a-4f60-ad21-00cd7e60a593\") " pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" Apr 16 18:18:31.337954 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.337696 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fd636b06-9e7a-4f60-ad21-00cd7e60a593-secret-grpc-tls\") pod \"thanos-querier-6d9f96dc59-snn89\" (UID: \"fd636b06-9e7a-4f60-ad21-00cd7e60a593\") " pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" Apr 16 18:18:31.337954 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.337719 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8hqk\" (UniqueName: \"kubernetes.io/projected/fd636b06-9e7a-4f60-ad21-00cd7e60a593-kube-api-access-m8hqk\") pod \"thanos-querier-6d9f96dc59-snn89\" (UID: \"fd636b06-9e7a-4f60-ad21-00cd7e60a593\") " 
pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" Apr 16 18:18:31.337954 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.337755 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fd636b06-9e7a-4f60-ad21-00cd7e60a593-metrics-client-ca\") pod \"thanos-querier-6d9f96dc59-snn89\" (UID: \"fd636b06-9e7a-4f60-ad21-00cd7e60a593\") " pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" Apr 16 18:18:31.337954 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.337812 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fd636b06-9e7a-4f60-ad21-00cd7e60a593-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6d9f96dc59-snn89\" (UID: \"fd636b06-9e7a-4f60-ad21-00cd7e60a593\") " pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" Apr 16 18:18:31.337954 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.337839 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fd636b06-9e7a-4f60-ad21-00cd7e60a593-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6d9f96dc59-snn89\" (UID: \"fd636b06-9e7a-4f60-ad21-00cd7e60a593\") " pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" Apr 16 18:18:31.339216 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.338895 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fd636b06-9e7a-4f60-ad21-00cd7e60a593-metrics-client-ca\") pod \"thanos-querier-6d9f96dc59-snn89\" (UID: \"fd636b06-9e7a-4f60-ad21-00cd7e60a593\") " pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" Apr 16 18:18:31.343085 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.343030 2573 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fd636b06-9e7a-4f60-ad21-00cd7e60a593-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6d9f96dc59-snn89\" (UID: \"fd636b06-9e7a-4f60-ad21-00cd7e60a593\") " pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" Apr 16 18:18:31.343228 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.343185 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/fd636b06-9e7a-4f60-ad21-00cd7e60a593-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6d9f96dc59-snn89\" (UID: \"fd636b06-9e7a-4f60-ad21-00cd7e60a593\") " pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" Apr 16 18:18:31.344579 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.344554 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/fd636b06-9e7a-4f60-ad21-00cd7e60a593-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6d9f96dc59-snn89\" (UID: \"fd636b06-9e7a-4f60-ad21-00cd7e60a593\") " pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" Apr 16 18:18:31.345886 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.344347 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fd636b06-9e7a-4f60-ad21-00cd7e60a593-secret-grpc-tls\") pod \"thanos-querier-6d9f96dc59-snn89\" (UID: \"fd636b06-9e7a-4f60-ad21-00cd7e60a593\") " pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" Apr 16 18:18:31.346051 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.345912 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/fd636b06-9e7a-4f60-ad21-00cd7e60a593-secret-thanos-querier-tls\") pod 
\"thanos-querier-6d9f96dc59-snn89\" (UID: \"fd636b06-9e7a-4f60-ad21-00cd7e60a593\") " pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" Apr 16 18:18:31.346824 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.346788 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fd636b06-9e7a-4f60-ad21-00cd7e60a593-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6d9f96dc59-snn89\" (UID: \"fd636b06-9e7a-4f60-ad21-00cd7e60a593\") " pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" Apr 16 18:18:31.357492 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.357461 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8hqk\" (UniqueName: \"kubernetes.io/projected/fd636b06-9e7a-4f60-ad21-00cd7e60a593-kube-api-access-m8hqk\") pod \"thanos-querier-6d9f96dc59-snn89\" (UID: \"fd636b06-9e7a-4f60-ad21-00cd7e60a593\") " pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" Apr 16 18:18:31.513461 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.513420 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" Apr 16 18:18:31.645190 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.645117 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6d9f96dc59-snn89"] Apr 16 18:18:31.648267 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:18:31.648240 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd636b06_9e7a_4f60_ad21_00cd7e60a593.slice/crio-28947e89c534686cc6f885c6acedfb6a49ada663b4fb64721a6283447024de0d WatchSource:0}: Error finding container 28947e89c534686cc6f885c6acedfb6a49ada663b4fb64721a6283447024de0d: Status 404 returned error can't find the container with id 28947e89c534686cc6f885c6acedfb6a49ada663b4fb64721a6283447024de0d Apr 16 18:18:31.713832 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.713797 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" event={"ID":"fd636b06-9e7a-4f60-ad21-00cd7e60a593","Type":"ContainerStarted","Data":"28947e89c534686cc6f885c6acedfb6a49ada663b4fb64721a6283447024de0d"} Apr 16 18:18:31.715048 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.715025 2573 generic.go:358] "Generic (PLEG): container finished" podID="ae6e84d4-e669-49d1-8b06-5404b460d95f" containerID="7eea3953fb6f35c498019622d7f6fb9606c1b7238e381cfc66ebc660ece4c697" exitCode=0 Apr 16 18:18:31.715173 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.715099 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4lmm7" event={"ID":"ae6e84d4-e669-49d1-8b06-5404b460d95f","Type":"ContainerDied","Data":"7eea3953fb6f35c498019622d7f6fb9606c1b7238e381cfc66ebc660ece4c697"} Apr 16 18:18:31.717052 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.716995 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-fmlhz" 
event={"ID":"44ea6a5b-fa62-492e-8885-5836fde6aae9","Type":"ContainerStarted","Data":"a2504e3746b941993e4c4890cb1e74f0d68a40e5bb836e89b85dbd4b17ede498"} Apr 16 18:18:31.717052 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.717051 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-fmlhz" event={"ID":"44ea6a5b-fa62-492e-8885-5836fde6aae9","Type":"ContainerStarted","Data":"2b32149a4d16edf1cee19232543f23db0d5c851bf4a27d5af768d4875fcc188d"} Apr 16 18:18:31.717186 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.717064 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-fmlhz" event={"ID":"44ea6a5b-fa62-492e-8885-5836fde6aae9","Type":"ContainerStarted","Data":"d5996e1a1bba0b96a604ad7401ef8465ebefae1eaea4663d864bb0b0b3ad06dd"} Apr 16 18:18:31.718930 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.718905 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-89djw" event={"ID":"a10b83fc-c6ce-49cc-9a26-8aa7bc948c27","Type":"ContainerStarted","Data":"f164df0ad2598a561b853210d02975ec4c74e2c48799702424414f0f50d7395b"} Apr 16 18:18:31.749632 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.749588 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-89djw" podStartSLOduration=1.34558273 podStartE2EDuration="2.749572802s" podCreationTimestamp="2026-04-16 18:18:29 +0000 UTC" firstStartedPulling="2026-04-16 18:18:29.706636448 +0000 UTC m=+181.062778483" lastFinishedPulling="2026-04-16 18:18:31.11062652 +0000 UTC m=+182.466768555" observedRunningTime="2026-04-16 18:18:31.748348399 +0000 UTC m=+183.104490456" watchObservedRunningTime="2026-04-16 18:18:31.749572802 +0000 UTC m=+183.105714859" Apr 16 18:18:31.773379 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:31.773329 2573 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-fmlhz" podStartSLOduration=1.296293791 podStartE2EDuration="2.773310584s" podCreationTimestamp="2026-04-16 18:18:29 +0000 UTC" firstStartedPulling="2026-04-16 18:18:29.635707339 +0000 UTC m=+180.991849374" lastFinishedPulling="2026-04-16 18:18:31.112724133 +0000 UTC m=+182.468866167" observedRunningTime="2026-04-16 18:18:31.771994223 +0000 UTC m=+183.128136291" watchObservedRunningTime="2026-04-16 18:18:31.773310584 +0000 UTC m=+183.129452643" Apr 16 18:18:32.646816 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:32.646781 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" Apr 16 18:18:32.725442 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:32.725407 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4lmm7" event={"ID":"ae6e84d4-e669-49d1-8b06-5404b460d95f","Type":"ContainerStarted","Data":"bf6e69292174c308982e23e42a570d24671204085998e56632c54465870cd04a"} Apr 16 18:18:32.725917 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:32.725457 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4lmm7" event={"ID":"ae6e84d4-e669-49d1-8b06-5404b460d95f","Type":"ContainerStarted","Data":"823ea80fdddbf330adf7bb9772c896a1acec3c9c257eb7fa38b8896c9ce4ca6f"} Apr 16 18:18:32.749199 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:32.749135 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-4lmm7" podStartSLOduration=2.828283962 podStartE2EDuration="3.74911627s" podCreationTimestamp="2026-04-16 18:18:29 +0000 UTC" firstStartedPulling="2026-04-16 18:18:29.494971285 +0000 UTC m=+180.851113333" lastFinishedPulling="2026-04-16 18:18:30.415803603 +0000 UTC m=+181.771945641" observedRunningTime="2026-04-16 18:18:32.747961605 +0000 UTC m=+184.104103673" 
watchObservedRunningTime="2026-04-16 18:18:32.74911627 +0000 UTC m=+184.105258326"
Apr 16 18:18:33.729654 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:33.729618 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" event={"ID":"fd636b06-9e7a-4f60-ad21-00cd7e60a593","Type":"ContainerStarted","Data":"34a042000d305371424c3f509fee28941054d76896ab2d8f7e2425a39091b0dd"}
Apr 16 18:18:33.729654 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:33.729656 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" event={"ID":"fd636b06-9e7a-4f60-ad21-00cd7e60a593","Type":"ContainerStarted","Data":"d5cbb8e8f7713e762f74149c20c59225d18f586ea8f1ccf3ee6282f6e188ae01"}
Apr 16 18:18:33.730061 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:33.729666 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" event={"ID":"fd636b06-9e7a-4f60-ad21-00cd7e60a593","Type":"ContainerStarted","Data":"eb85f92013706748f8037ab0fabb4eb823c5b4a6e429b3a21eb2f1a79a1e1d36"}
Apr 16 18:18:34.736496 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:34.736464 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" event={"ID":"fd636b06-9e7a-4f60-ad21-00cd7e60a593","Type":"ContainerStarted","Data":"28cd59b7749203d301b0ae55792173d52eec7c7ffae65c2a0b7e0042266c53d5"}
Apr 16 18:18:34.736496 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:34.736503 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" event={"ID":"fd636b06-9e7a-4f60-ad21-00cd7e60a593","Type":"ContainerStarted","Data":"25030d74a1172ab164eff2be2036a50499f2b18f313e3f1caff59ba8020a9ae4"}
Apr 16 18:18:35.741708 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:35.741665 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" event={"ID":"fd636b06-9e7a-4f60-ad21-00cd7e60a593","Type":"ContainerStarted","Data":"fa3826a122115eb3a36e642810a3c6e53ff87bbc94abe296f08e2f9b20a959ba"}
Apr 16 18:18:35.742076 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:35.741831 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89"
Apr 16 18:18:35.766831 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:35.766761 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89" podStartSLOduration=1.8350586340000001 podStartE2EDuration="4.766741382s" podCreationTimestamp="2026-04-16 18:18:31 +0000 UTC" firstStartedPulling="2026-04-16 18:18:31.650501737 +0000 UTC m=+183.006643772" lastFinishedPulling="2026-04-16 18:18:34.582184481 +0000 UTC m=+185.938326520" observedRunningTime="2026-04-16 18:18:35.764400409 +0000 UTC m=+187.120542466" watchObservedRunningTime="2026-04-16 18:18:35.766741382 +0000 UTC m=+187.122883440"
Apr 16 18:18:41.750448 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:41.750416 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-6d9f96dc59-snn89"
Apr 16 18:18:44.103099 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:44.103063 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7fff8cc5c4-dk27h"]
Apr 16 18:18:53.037150 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.037111 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7946f4fd7c-7222d"]
Apr 16 18:18:53.043085 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.043053 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7946f4fd7c-7222d"
Apr 16 18:18:53.045459 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.045427 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 18:18:53.045614 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.045482 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 18:18:53.045614 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.045490 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 18:18:53.046407 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.046387 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 18:18:53.046511 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.046413 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 18:18:53.046511 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.046455 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 18:18:53.046705 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.046572 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 18:18:53.046766 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.046705 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-bkqlw\""
Apr 16 18:18:53.054975 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.054945 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7946f4fd7c-7222d"]
Apr 16 18:18:53.057025 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.057000 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 18:18:53.143197 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.143155 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/531e8f1d-a948-4acd-877e-2e55ec956c36-console-config\") pod \"console-7946f4fd7c-7222d\" (UID: \"531e8f1d-a948-4acd-877e-2e55ec956c36\") " pod="openshift-console/console-7946f4fd7c-7222d"
Apr 16 18:18:53.143197 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.143197 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q5m9\" (UniqueName: \"kubernetes.io/projected/531e8f1d-a948-4acd-877e-2e55ec956c36-kube-api-access-6q5m9\") pod \"console-7946f4fd7c-7222d\" (UID: \"531e8f1d-a948-4acd-877e-2e55ec956c36\") " pod="openshift-console/console-7946f4fd7c-7222d"
Apr 16 18:18:53.143395 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.143282 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/531e8f1d-a948-4acd-877e-2e55ec956c36-console-oauth-config\") pod \"console-7946f4fd7c-7222d\" (UID: \"531e8f1d-a948-4acd-877e-2e55ec956c36\") " pod="openshift-console/console-7946f4fd7c-7222d"
Apr 16 18:18:53.143395 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.143312 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/531e8f1d-a948-4acd-877e-2e55ec956c36-service-ca\") pod \"console-7946f4fd7c-7222d\" (UID: \"531e8f1d-a948-4acd-877e-2e55ec956c36\") " pod="openshift-console/console-7946f4fd7c-7222d"
Apr 16 18:18:53.143462 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.143403 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/531e8f1d-a948-4acd-877e-2e55ec956c36-console-serving-cert\") pod \"console-7946f4fd7c-7222d\" (UID: \"531e8f1d-a948-4acd-877e-2e55ec956c36\") " pod="openshift-console/console-7946f4fd7c-7222d"
Apr 16 18:18:53.143462 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.143443 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/531e8f1d-a948-4acd-877e-2e55ec956c36-oauth-serving-cert\") pod \"console-7946f4fd7c-7222d\" (UID: \"531e8f1d-a948-4acd-877e-2e55ec956c36\") " pod="openshift-console/console-7946f4fd7c-7222d"
Apr 16 18:18:53.143541 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.143463 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/531e8f1d-a948-4acd-877e-2e55ec956c36-trusted-ca-bundle\") pod \"console-7946f4fd7c-7222d\" (UID: \"531e8f1d-a948-4acd-877e-2e55ec956c36\") " pod="openshift-console/console-7946f4fd7c-7222d"
Apr 16 18:18:53.244489 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.244448 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/531e8f1d-a948-4acd-877e-2e55ec956c36-console-oauth-config\") pod \"console-7946f4fd7c-7222d\" (UID: \"531e8f1d-a948-4acd-877e-2e55ec956c36\") " pod="openshift-console/console-7946f4fd7c-7222d"
Apr 16 18:18:53.244489 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.244498 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/531e8f1d-a948-4acd-877e-2e55ec956c36-service-ca\") pod \"console-7946f4fd7c-7222d\" (UID: \"531e8f1d-a948-4acd-877e-2e55ec956c36\") " pod="openshift-console/console-7946f4fd7c-7222d"
Apr 16 18:18:53.244799 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.244595 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/531e8f1d-a948-4acd-877e-2e55ec956c36-console-serving-cert\") pod \"console-7946f4fd7c-7222d\" (UID: \"531e8f1d-a948-4acd-877e-2e55ec956c36\") " pod="openshift-console/console-7946f4fd7c-7222d"
Apr 16 18:18:53.244799 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.244631 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/531e8f1d-a948-4acd-877e-2e55ec956c36-oauth-serving-cert\") pod \"console-7946f4fd7c-7222d\" (UID: \"531e8f1d-a948-4acd-877e-2e55ec956c36\") " pod="openshift-console/console-7946f4fd7c-7222d"
Apr 16 18:18:53.244799 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.244658 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/531e8f1d-a948-4acd-877e-2e55ec956c36-trusted-ca-bundle\") pod \"console-7946f4fd7c-7222d\" (UID: \"531e8f1d-a948-4acd-877e-2e55ec956c36\") " pod="openshift-console/console-7946f4fd7c-7222d"
Apr 16 18:18:53.244799 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.244706 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/531e8f1d-a948-4acd-877e-2e55ec956c36-console-config\") pod \"console-7946f4fd7c-7222d\" (UID: \"531e8f1d-a948-4acd-877e-2e55ec956c36\") " pod="openshift-console/console-7946f4fd7c-7222d"
Apr 16 18:18:53.244799 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.244732 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6q5m9\" (UniqueName: \"kubernetes.io/projected/531e8f1d-a948-4acd-877e-2e55ec956c36-kube-api-access-6q5m9\") pod \"console-7946f4fd7c-7222d\" (UID: \"531e8f1d-a948-4acd-877e-2e55ec956c36\") " pod="openshift-console/console-7946f4fd7c-7222d"
Apr 16 18:18:53.245374 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.245342 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/531e8f1d-a948-4acd-877e-2e55ec956c36-service-ca\") pod \"console-7946f4fd7c-7222d\" (UID: \"531e8f1d-a948-4acd-877e-2e55ec956c36\") " pod="openshift-console/console-7946f4fd7c-7222d"
Apr 16 18:18:53.245479 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.245449 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/531e8f1d-a948-4acd-877e-2e55ec956c36-oauth-serving-cert\") pod \"console-7946f4fd7c-7222d\" (UID: \"531e8f1d-a948-4acd-877e-2e55ec956c36\") " pod="openshift-console/console-7946f4fd7c-7222d"
Apr 16 18:18:53.245567 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.245548 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/531e8f1d-a948-4acd-877e-2e55ec956c36-console-config\") pod \"console-7946f4fd7c-7222d\" (UID: \"531e8f1d-a948-4acd-877e-2e55ec956c36\") " pod="openshift-console/console-7946f4fd7c-7222d"
Apr 16 18:18:53.245663 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.245642 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/531e8f1d-a948-4acd-877e-2e55ec956c36-trusted-ca-bundle\") pod \"console-7946f4fd7c-7222d\" (UID: \"531e8f1d-a948-4acd-877e-2e55ec956c36\") " pod="openshift-console/console-7946f4fd7c-7222d"
Apr 16 18:18:53.247680 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.247658 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/531e8f1d-a948-4acd-877e-2e55ec956c36-console-serving-cert\") pod \"console-7946f4fd7c-7222d\" (UID: \"531e8f1d-a948-4acd-877e-2e55ec956c36\") " pod="openshift-console/console-7946f4fd7c-7222d"
Apr 16 18:18:53.247785 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.247729 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/531e8f1d-a948-4acd-877e-2e55ec956c36-console-oauth-config\") pod \"console-7946f4fd7c-7222d\" (UID: \"531e8f1d-a948-4acd-877e-2e55ec956c36\") " pod="openshift-console/console-7946f4fd7c-7222d"
Apr 16 18:18:53.252586 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.252565 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q5m9\" (UniqueName: \"kubernetes.io/projected/531e8f1d-a948-4acd-877e-2e55ec956c36-kube-api-access-6q5m9\") pod \"console-7946f4fd7c-7222d\" (UID: \"531e8f1d-a948-4acd-877e-2e55ec956c36\") " pod="openshift-console/console-7946f4fd7c-7222d"
Apr 16 18:18:53.355758 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.355668 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7946f4fd7c-7222d"
Apr 16 18:18:53.478741 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.478704 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7946f4fd7c-7222d"]
Apr 16 18:18:53.483259 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:18:53.483229 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod531e8f1d_a948_4acd_877e_2e55ec956c36.slice/crio-5bd7fbadf74398c8c6d8c82a69bc454c5f57abb6d346638fc603d5a8d12b071d WatchSource:0}: Error finding container 5bd7fbadf74398c8c6d8c82a69bc454c5f57abb6d346638fc603d5a8d12b071d: Status 404 returned error can't find the container with id 5bd7fbadf74398c8c6d8c82a69bc454c5f57abb6d346638fc603d5a8d12b071d
Apr 16 18:18:53.792293 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:53.792255 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7946f4fd7c-7222d" event={"ID":"531e8f1d-a948-4acd-877e-2e55ec956c36","Type":"ContainerStarted","Data":"5bd7fbadf74398c8c6d8c82a69bc454c5f57abb6d346638fc603d5a8d12b071d"}
Apr 16 18:18:56.801498 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:56.801462 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7946f4fd7c-7222d" event={"ID":"531e8f1d-a948-4acd-877e-2e55ec956c36","Type":"ContainerStarted","Data":"5b2f8523d6f23cc49157f949bada3db2545d3b6abbdd3481f44c9628bdfc14b1"}
Apr 16 18:18:56.818119 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:56.818071 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7946f4fd7c-7222d" podStartSLOduration=1.294778497 podStartE2EDuration="3.818053727s" podCreationTimestamp="2026-04-16 18:18:53 +0000 UTC" firstStartedPulling="2026-04-16 18:18:53.485390678 +0000 UTC m=+204.841532712" lastFinishedPulling="2026-04-16 18:18:56.008665904 +0000 UTC m=+207.364807942" observedRunningTime="2026-04-16 18:18:56.817681019 +0000 UTC m=+208.173823082" watchObservedRunningTime="2026-04-16 18:18:56.818053727 +0000 UTC m=+208.174195787"
Apr 16 18:18:59.811372 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:59.811337 2573 generic.go:358] "Generic (PLEG): container finished" podID="3901ff32-acaf-4296-9b6e-811ec88ce688" containerID="8ab45cc2d4a989b9a6e28d22ee4bcdf4ea216207a10a688bba587348121cf7a3" exitCode=0
Apr 16 18:18:59.811756 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:59.811373 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-kjrt4" event={"ID":"3901ff32-acaf-4296-9b6e-811ec88ce688","Type":"ContainerDied","Data":"8ab45cc2d4a989b9a6e28d22ee4bcdf4ea216207a10a688bba587348121cf7a3"}
Apr 16 18:18:59.811756 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:18:59.811733 2573 scope.go:117] "RemoveContainer" containerID="8ab45cc2d4a989b9a6e28d22ee4bcdf4ea216207a10a688bba587348121cf7a3"
Apr 16 18:19:00.756818 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:00.756789 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-596bbb7b7d-nk24r_23253796-d542-44e0-93b4-6b1d65c09948/router/0.log"
Apr 16 18:19:00.821680 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:00.821649 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-kjrt4" event={"ID":"3901ff32-acaf-4296-9b6e-811ec88ce688","Type":"ContainerStarted","Data":"aa0142e65e9a5a5d872d45f27b8940ef690e6e4fd964d958a018804263ef5d6f"}
Apr 16 18:19:03.356701 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:03.356665 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7946f4fd7c-7222d"
Apr 16 18:19:03.356701 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:03.356707 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7946f4fd7c-7222d"
Apr 16 18:19:03.361344 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:03.361319 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7946f4fd7c-7222d"
Apr 16 18:19:03.834915 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:03.834886 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7946f4fd7c-7222d"
Apr 16 18:19:09.122699 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.122652 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" podUID="efc53255-caaf-4b68-9cd3-c6118907f500" containerName="registry" containerID="cri-o://ff50af3f65f53cdd00d79f1a40c3f4c6f97d699a102f9c6b7bd3017a6701fc43" gracePeriod=30
Apr 16 18:19:09.357949 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.357925 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h"
Apr 16 18:19:09.384613 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.384580 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/efc53255-caaf-4b68-9cd3-c6118907f500-registry-certificates\") pod \"efc53255-caaf-4b68-9cd3-c6118907f500\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") "
Apr 16 18:19:09.384825 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.384629 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh7xt\" (UniqueName: \"kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-kube-api-access-qh7xt\") pod \"efc53255-caaf-4b68-9cd3-c6118907f500\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") "
Apr 16 18:19:09.384825 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.384654 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls\") pod \"efc53255-caaf-4b68-9cd3-c6118907f500\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") "
Apr 16 18:19:09.384825 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.384690 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/efc53255-caaf-4b68-9cd3-c6118907f500-installation-pull-secrets\") pod \"efc53255-caaf-4b68-9cd3-c6118907f500\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") "
Apr 16 18:19:09.384825 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.384717 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/efc53255-caaf-4b68-9cd3-c6118907f500-image-registry-private-configuration\") pod \"efc53255-caaf-4b68-9cd3-c6118907f500\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") "
Apr 16 18:19:09.384825 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.384743 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/efc53255-caaf-4b68-9cd3-c6118907f500-ca-trust-extracted\") pod \"efc53255-caaf-4b68-9cd3-c6118907f500\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") "
Apr 16 18:19:09.385124 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.384870 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-bound-sa-token\") pod \"efc53255-caaf-4b68-9cd3-c6118907f500\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") "
Apr 16 18:19:09.385124 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.384918 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/efc53255-caaf-4b68-9cd3-c6118907f500-trusted-ca\") pod \"efc53255-caaf-4b68-9cd3-c6118907f500\" (UID: \"efc53255-caaf-4b68-9cd3-c6118907f500\") "
Apr 16 18:19:09.385582 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.385554 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efc53255-caaf-4b68-9cd3-c6118907f500-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "efc53255-caaf-4b68-9cd3-c6118907f500" (UID: "efc53255-caaf-4b68-9cd3-c6118907f500"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:19:09.386221 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.386183 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efc53255-caaf-4b68-9cd3-c6118907f500-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "efc53255-caaf-4b68-9cd3-c6118907f500" (UID: "efc53255-caaf-4b68-9cd3-c6118907f500"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:19:09.388731 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.388683 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "efc53255-caaf-4b68-9cd3-c6118907f500" (UID: "efc53255-caaf-4b68-9cd3-c6118907f500"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:19:09.388909 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.388872 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "efc53255-caaf-4b68-9cd3-c6118907f500" (UID: "efc53255-caaf-4b68-9cd3-c6118907f500"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:19:09.388909 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.388881 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-kube-api-access-qh7xt" (OuterVolumeSpecName: "kube-api-access-qh7xt") pod "efc53255-caaf-4b68-9cd3-c6118907f500" (UID: "efc53255-caaf-4b68-9cd3-c6118907f500"). InnerVolumeSpecName "kube-api-access-qh7xt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:19:09.389323 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.389295 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efc53255-caaf-4b68-9cd3-c6118907f500-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "efc53255-caaf-4b68-9cd3-c6118907f500" (UID: "efc53255-caaf-4b68-9cd3-c6118907f500"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:19:09.389419 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.389375 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efc53255-caaf-4b68-9cd3-c6118907f500-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "efc53255-caaf-4b68-9cd3-c6118907f500" (UID: "efc53255-caaf-4b68-9cd3-c6118907f500"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:19:09.395223 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.395196 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efc53255-caaf-4b68-9cd3-c6118907f500-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "efc53255-caaf-4b68-9cd3-c6118907f500" (UID: "efc53255-caaf-4b68-9cd3-c6118907f500"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:19:09.485835 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.485797 2573 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/efc53255-caaf-4b68-9cd3-c6118907f500-registry-certificates\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:19:09.485835 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.485829 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qh7xt\" (UniqueName: \"kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-kube-api-access-qh7xt\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:19:09.485835 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.485840 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-registry-tls\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:19:09.486059 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.485849 2573 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/efc53255-caaf-4b68-9cd3-c6118907f500-installation-pull-secrets\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:19:09.486059 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.485860 2573 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/efc53255-caaf-4b68-9cd3-c6118907f500-image-registry-private-configuration\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:19:09.486059 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.485869 2573 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/efc53255-caaf-4b68-9cd3-c6118907f500-ca-trust-extracted\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:19:09.486059 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.485878 2573 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/efc53255-caaf-4b68-9cd3-c6118907f500-bound-sa-token\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:19:09.486059 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.485886 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/efc53255-caaf-4b68-9cd3-c6118907f500-trusted-ca\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:19:09.848112 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.848020 2573 generic.go:358] "Generic (PLEG): container finished" podID="efc53255-caaf-4b68-9cd3-c6118907f500" containerID="ff50af3f65f53cdd00d79f1a40c3f4c6f97d699a102f9c6b7bd3017a6701fc43" exitCode=0
Apr 16 18:19:09.848112 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.848104 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h"
Apr 16 18:19:09.848310 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.848104 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" event={"ID":"efc53255-caaf-4b68-9cd3-c6118907f500","Type":"ContainerDied","Data":"ff50af3f65f53cdd00d79f1a40c3f4c6f97d699a102f9c6b7bd3017a6701fc43"}
Apr 16 18:19:09.848310 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.848142 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7fff8cc5c4-dk27h" event={"ID":"efc53255-caaf-4b68-9cd3-c6118907f500","Type":"ContainerDied","Data":"a42a9f74d9d540446140c08cab9d810b6092e61fbb1dcac5a5d0b7b4404c8d6a"}
Apr 16 18:19:09.848310 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.848158 2573 scope.go:117] "RemoveContainer" containerID="ff50af3f65f53cdd00d79f1a40c3f4c6f97d699a102f9c6b7bd3017a6701fc43"
Apr 16 18:19:09.856820 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.856805 2573 scope.go:117] "RemoveContainer" containerID="ff50af3f65f53cdd00d79f1a40c3f4c6f97d699a102f9c6b7bd3017a6701fc43"
Apr 16 18:19:09.857088 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:19:09.857067 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff50af3f65f53cdd00d79f1a40c3f4c6f97d699a102f9c6b7bd3017a6701fc43\": container with ID starting with ff50af3f65f53cdd00d79f1a40c3f4c6f97d699a102f9c6b7bd3017a6701fc43 not found: ID does not exist" containerID="ff50af3f65f53cdd00d79f1a40c3f4c6f97d699a102f9c6b7bd3017a6701fc43"
Apr 16 18:19:09.857131 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.857097 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff50af3f65f53cdd00d79f1a40c3f4c6f97d699a102f9c6b7bd3017a6701fc43"} err="failed to get container status \"ff50af3f65f53cdd00d79f1a40c3f4c6f97d699a102f9c6b7bd3017a6701fc43\": rpc error: code = NotFound desc = could not find container \"ff50af3f65f53cdd00d79f1a40c3f4c6f97d699a102f9c6b7bd3017a6701fc43\": container with ID starting with ff50af3f65f53cdd00d79f1a40c3f4c6f97d699a102f9c6b7bd3017a6701fc43 not found: ID does not exist"
Apr 16 18:19:09.866904 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.866875 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7fff8cc5c4-dk27h"]
Apr 16 18:19:09.870858 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:09.870838 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7fff8cc5c4-dk27h"]
Apr 16 18:19:11.221615 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:11.221576 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efc53255-caaf-4b68-9cd3-c6118907f500" path="/var/lib/kubelet/pods/efc53255-caaf-4b68-9cd3-c6118907f500/volumes"
Apr 16 18:19:41.155664 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:41.155622 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs\") pod \"network-metrics-daemon-j9gkk\" (UID: \"3fd8f3ce-1a67-4a38-99ec-e368aea03088\") " pod="openshift-multus/network-metrics-daemon-j9gkk"
Apr 16 18:19:41.158107 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:41.158087 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fd8f3ce-1a67-4a38-99ec-e368aea03088-metrics-certs\") pod \"network-metrics-daemon-j9gkk\" (UID: \"3fd8f3ce-1a67-4a38-99ec-e368aea03088\") " pod="openshift-multus/network-metrics-daemon-j9gkk"
Apr 16 18:19:41.422919 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:41.422840 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-4plx6\""
Apr 16 18:19:41.431379 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:41.431358 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9gkk"
Apr 16 18:19:41.562289 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:41.562255 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-j9gkk"]
Apr 16 18:19:41.565273 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:19:41.565241 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fd8f3ce_1a67_4a38_99ec_e368aea03088.slice/crio-a1d2a46898b4360942f622d102fd528eb93d85eff9845e84662047ae07ce32e8 WatchSource:0}: Error finding container a1d2a46898b4360942f622d102fd528eb93d85eff9845e84662047ae07ce32e8: Status 404 returned error can't find the container with id a1d2a46898b4360942f622d102fd528eb93d85eff9845e84662047ae07ce32e8
Apr 16 18:19:41.938750 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:41.938709 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j9gkk" event={"ID":"3fd8f3ce-1a67-4a38-99ec-e368aea03088","Type":"ContainerStarted","Data":"a1d2a46898b4360942f622d102fd528eb93d85eff9845e84662047ae07ce32e8"}
Apr 16 18:19:42.943426 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:42.943391 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j9gkk" event={"ID":"3fd8f3ce-1a67-4a38-99ec-e368aea03088","Type":"ContainerStarted","Data":"1e7beb7390cd95713e8b82a01907174b181523d604c2507f1d6ab53320beb7ab"}
Apr 16 18:19:42.943426 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:42.943428 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j9gkk" event={"ID":"3fd8f3ce-1a67-4a38-99ec-e368aea03088","Type":"ContainerStarted","Data":"89019315dc80a6d84a943df5f9d86934fab9acd9564acd01ce77c3146d74775f"}
Apr 16 18:19:42.962673 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:42.962597 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-j9gkk" podStartSLOduration=253.029065805 podStartE2EDuration="4m13.962575061s" podCreationTimestamp="2026-04-16 18:15:29 +0000 UTC" firstStartedPulling="2026-04-16 18:19:41.567208239 +0000 UTC m=+252.923350274" lastFinishedPulling="2026-04-16 18:19:42.500717495 +0000 UTC m=+253.856859530" observedRunningTime="2026-04-16 18:19:42.960454086 +0000 UTC m=+254.316596144" watchObservedRunningTime="2026-04-16 18:19:42.962575061 +0000 UTC m=+254.318717119"
Apr 16 18:19:53.587504 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:53.587415 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-f75b9848f-g9vnt"]
Apr 16 18:19:53.588056 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:53.587924 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="efc53255-caaf-4b68-9cd3-c6118907f500" containerName="registry"
Apr 16 18:19:53.588056 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:53.587943 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="efc53255-caaf-4b68-9cd3-c6118907f500" containerName="registry"
Apr 16 18:19:53.588056 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:53.588037 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="efc53255-caaf-4b68-9cd3-c6118907f500" containerName="registry"
Apr 16 18:19:53.590937 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:53.590915 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f75b9848f-g9vnt"
Apr 16 18:19:53.602748 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:53.602715 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f75b9848f-g9vnt"]
Apr 16 18:19:53.663975 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:53.663939 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-trusted-ca-bundle\") pod \"console-f75b9848f-g9vnt\" (UID: \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\") " pod="openshift-console/console-f75b9848f-g9vnt"
Apr 16 18:19:53.663975 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:53.663977 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-console-serving-cert\") pod \"console-f75b9848f-g9vnt\" (UID: \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\") " pod="openshift-console/console-f75b9848f-g9vnt"
Apr 16 18:19:53.664193 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:53.664007 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn6kj\" (UniqueName: \"kubernetes.io/projected/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-kube-api-access-jn6kj\") pod \"console-f75b9848f-g9vnt\" (UID: \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\") " pod="openshift-console/console-f75b9848f-g9vnt"
Apr 16 18:19:53.664193 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:53.664057 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-console-oauth-config\") pod \"console-f75b9848f-g9vnt\" (UID: \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\") " pod="openshift-console/console-f75b9848f-g9vnt"
Apr 16 18:19:53.664193 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:53.664141 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-console-config\") pod \"console-f75b9848f-g9vnt\" (UID: \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\") " pod="openshift-console/console-f75b9848f-g9vnt"
Apr 16 18:19:53.664193 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:53.664158 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-service-ca\") pod \"console-f75b9848f-g9vnt\" (UID: \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\") " pod="openshift-console/console-f75b9848f-g9vnt"
Apr 16 18:19:53.664193 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:53.664179 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-oauth-serving-cert\") pod \"console-f75b9848f-g9vnt\" (UID: \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\") " pod="openshift-console/console-f75b9848f-g9vnt"
Apr 16 18:19:53.764973 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:53.764933 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jn6kj\" (UniqueName: \"kubernetes.io/projected/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-kube-api-access-jn6kj\") pod \"console-f75b9848f-g9vnt\" (UID: \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\") " pod="openshift-console/console-f75b9848f-g9vnt"
Apr 16 18:19:53.765173 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:53.764984 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-console-oauth-config\") pod
\"console-f75b9848f-g9vnt\" (UID: \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\") " pod="openshift-console/console-f75b9848f-g9vnt" Apr 16 18:19:53.765173 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:53.765067 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-console-config\") pod \"console-f75b9848f-g9vnt\" (UID: \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\") " pod="openshift-console/console-f75b9848f-g9vnt" Apr 16 18:19:53.765173 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:53.765084 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-service-ca\") pod \"console-f75b9848f-g9vnt\" (UID: \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\") " pod="openshift-console/console-f75b9848f-g9vnt" Apr 16 18:19:53.765173 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:53.765110 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-oauth-serving-cert\") pod \"console-f75b9848f-g9vnt\" (UID: \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\") " pod="openshift-console/console-f75b9848f-g9vnt" Apr 16 18:19:53.765173 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:53.765142 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-trusted-ca-bundle\") pod \"console-f75b9848f-g9vnt\" (UID: \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\") " pod="openshift-console/console-f75b9848f-g9vnt" Apr 16 18:19:53.765415 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:53.765176 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-console-serving-cert\") pod \"console-f75b9848f-g9vnt\" (UID: \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\") " pod="openshift-console/console-f75b9848f-g9vnt" Apr 16 18:19:53.766077 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:53.766046 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-console-config\") pod \"console-f75b9848f-g9vnt\" (UID: \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\") " pod="openshift-console/console-f75b9848f-g9vnt" Apr 16 18:19:53.766216 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:53.766067 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-service-ca\") pod \"console-f75b9848f-g9vnt\" (UID: \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\") " pod="openshift-console/console-f75b9848f-g9vnt" Apr 16 18:19:53.766216 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:53.766128 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-trusted-ca-bundle\") pod \"console-f75b9848f-g9vnt\" (UID: \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\") " pod="openshift-console/console-f75b9848f-g9vnt" Apr 16 18:19:53.766303 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:53.766254 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-oauth-serving-cert\") pod \"console-f75b9848f-g9vnt\" (UID: \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\") " pod="openshift-console/console-f75b9848f-g9vnt" Apr 16 18:19:53.767885 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:53.767856 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-console-oauth-config\") pod \"console-f75b9848f-g9vnt\" (UID: \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\") " pod="openshift-console/console-f75b9848f-g9vnt" Apr 16 18:19:53.767980 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:53.767918 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-console-serving-cert\") pod \"console-f75b9848f-g9vnt\" (UID: \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\") " pod="openshift-console/console-f75b9848f-g9vnt" Apr 16 18:19:53.772409 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:53.772385 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn6kj\" (UniqueName: \"kubernetes.io/projected/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-kube-api-access-jn6kj\") pod \"console-f75b9848f-g9vnt\" (UID: \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\") " pod="openshift-console/console-f75b9848f-g9vnt" Apr 16 18:19:53.900767 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:53.900725 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f75b9848f-g9vnt" Apr 16 18:19:54.044028 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:54.043990 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f75b9848f-g9vnt"] Apr 16 18:19:54.047880 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:19:54.047831 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84e18b78_2c41_49e4_8c8a_f6bb9b134d7a.slice/crio-c0ce560a996676839c29825b1b5097a6140a24cad8a1327d7ad3f995b892d9d2 WatchSource:0}: Error finding container c0ce560a996676839c29825b1b5097a6140a24cad8a1327d7ad3f995b892d9d2: Status 404 returned error can't find the container with id c0ce560a996676839c29825b1b5097a6140a24cad8a1327d7ad3f995b892d9d2 Apr 16 18:19:54.982476 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:54.982440 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f75b9848f-g9vnt" event={"ID":"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a","Type":"ContainerStarted","Data":"6be520dfb054b6775e8e826777acac91d63b0961ee3d3be7f271edfff0f88769"} Apr 16 18:19:54.982476 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:54.982474 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f75b9848f-g9vnt" event={"ID":"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a","Type":"ContainerStarted","Data":"c0ce560a996676839c29825b1b5097a6140a24cad8a1327d7ad3f995b892d9d2"} Apr 16 18:19:55.001383 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:19:55.001313 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f75b9848f-g9vnt" podStartSLOduration=2.00129591 podStartE2EDuration="2.00129591s" podCreationTimestamp="2026-04-16 18:19:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:19:55.000664605 +0000 UTC m=+266.356806670" 
watchObservedRunningTime="2026-04-16 18:19:55.00129591 +0000 UTC m=+266.357437967" Apr 16 18:20:03.901731 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:03.901679 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f75b9848f-g9vnt" Apr 16 18:20:03.901731 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:03.901741 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-f75b9848f-g9vnt" Apr 16 18:20:03.906476 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:03.906452 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f75b9848f-g9vnt" Apr 16 18:20:04.012631 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:04.012595 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f75b9848f-g9vnt" Apr 16 18:20:04.070538 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:04.070491 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7946f4fd7c-7222d"] Apr 16 18:20:08.620169 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:20:08.620120 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-h9xxg" podUID="71df2187-c914-4b58-8d61-6fcaacaefd11" Apr 16 18:20:09.022011 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:09.021984 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-h9xxg" Apr 16 18:20:12.531498 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:12.531453 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert\") pod \"ingress-canary-7vbzm\" (UID: \"8d155535-fa59-4777-80cf-fdba34134958\") " pod="openshift-ingress-canary/ingress-canary-7vbzm" Apr 16 18:20:12.531928 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:12.531564 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-h9xxg\" (UID: \"71df2187-c914-4b58-8d61-6fcaacaefd11\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-h9xxg" Apr 16 18:20:12.534262 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:12.534223 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d155535-fa59-4777-80cf-fdba34134958-cert\") pod \"ingress-canary-7vbzm\" (UID: \"8d155535-fa59-4777-80cf-fdba34134958\") " pod="openshift-ingress-canary/ingress-canary-7vbzm" Apr 16 18:20:12.534377 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:12.534223 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/71df2187-c914-4b58-8d61-6fcaacaefd11-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-h9xxg\" (UID: \"71df2187-c914-4b58-8d61-6fcaacaefd11\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-h9xxg" Apr 16 18:20:12.621176 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:12.621147 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4scgb\"" Apr 16 18:20:12.624112 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:12.624094 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-97dhh\"" Apr 16 18:20:12.628349 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:12.628328 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7vbzm" Apr 16 18:20:12.633120 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:12.633101 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-h9xxg" Apr 16 18:20:12.761039 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:12.760892 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7vbzm"] Apr 16 18:20:12.763814 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:20:12.763787 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d155535_fa59_4777_80cf_fdba34134958.slice/crio-f305f9f74060918bb096fe830af681387d051faff2c47e7560533e2a5f14b8c7 WatchSource:0}: Error finding container f305f9f74060918bb096fe830af681387d051faff2c47e7560533e2a5f14b8c7: Status 404 returned error can't find the container with id f305f9f74060918bb096fe830af681387d051faff2c47e7560533e2a5f14b8c7 Apr 16 18:20:12.781669 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:12.781610 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-h9xxg"] Apr 16 18:20:12.784510 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:20:12.784487 2573 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71df2187_c914_4b58_8d61_6fcaacaefd11.slice/crio-50b3eed871083a6d20a4fd0cb762430591bf013707fa6bde4dc06003648a437a WatchSource:0}: Error finding container 50b3eed871083a6d20a4fd0cb762430591bf013707fa6bde4dc06003648a437a: Status 404 returned error can't find the container with id 50b3eed871083a6d20a4fd0cb762430591bf013707fa6bde4dc06003648a437a Apr 16 18:20:13.034896 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:13.034810 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-h9xxg" event={"ID":"71df2187-c914-4b58-8d61-6fcaacaefd11","Type":"ContainerStarted","Data":"50b3eed871083a6d20a4fd0cb762430591bf013707fa6bde4dc06003648a437a"} Apr 16 18:20:13.035737 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:13.035716 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7vbzm" event={"ID":"8d155535-fa59-4777-80cf-fdba34134958","Type":"ContainerStarted","Data":"f305f9f74060918bb096fe830af681387d051faff2c47e7560533e2a5f14b8c7"} Apr 16 18:20:15.043600 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:15.043558 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-h9xxg" event={"ID":"71df2187-c914-4b58-8d61-6fcaacaefd11","Type":"ContainerStarted","Data":"95c194ebe1a12b884d83fba208206554c30857e66d481cbc6b86df45f326f2f3"} Apr 16 18:20:15.044917 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:15.044894 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7vbzm" event={"ID":"8d155535-fa59-4777-80cf-fdba34134958","Type":"ContainerStarted","Data":"cce5a98ca959c0ad491381d28cc10a5afe1ac0a4af6cb5f387048554152963d5"} Apr 16 18:20:15.065405 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:15.065348 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-h9xxg" podStartSLOduration=273.345333701 podStartE2EDuration="4m35.065332324s" podCreationTimestamp="2026-04-16 18:15:40 +0000 UTC" firstStartedPulling="2026-04-16 18:20:12.786399676 +0000 UTC m=+284.142541711" lastFinishedPulling="2026-04-16 18:20:14.506398299 +0000 UTC m=+285.862540334" observedRunningTime="2026-04-16 18:20:15.064400366 +0000 UTC m=+286.420542422" watchObservedRunningTime="2026-04-16 18:20:15.065332324 +0000 UTC m=+286.421474383" Apr 16 18:20:15.084192 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:15.084134 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7vbzm" podStartSLOduration=251.339215653 podStartE2EDuration="4m13.084112261s" podCreationTimestamp="2026-04-16 18:16:02 +0000 UTC" firstStartedPulling="2026-04-16 18:20:12.765742696 +0000 UTC m=+284.121884731" lastFinishedPulling="2026-04-16 18:20:14.510639302 +0000 UTC m=+285.866781339" observedRunningTime="2026-04-16 18:20:15.083641835 +0000 UTC m=+286.439783893" watchObservedRunningTime="2026-04-16 18:20:15.084112261 +0000 UTC m=+286.440254324" Apr 16 18:20:29.091372 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:29.091315 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7946f4fd7c-7222d" podUID="531e8f1d-a948-4acd-877e-2e55ec956c36" containerName="console" containerID="cri-o://5b2f8523d6f23cc49157f949bada3db2545d3b6abbdd3481f44c9628bdfc14b1" gracePeriod=15 Apr 16 18:20:29.122331 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:29.122306 2573 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 18:20:29.338024 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:29.338004 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7946f4fd7c-7222d_531e8f1d-a948-4acd-877e-2e55ec956c36/console/0.log" Apr 16 18:20:29.338167 ip-10-0-130-205 kubenswrapper[2573]: I0416 
18:20:29.338063 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7946f4fd7c-7222d" Apr 16 18:20:29.475435 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:29.475400 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/531e8f1d-a948-4acd-877e-2e55ec956c36-oauth-serving-cert\") pod \"531e8f1d-a948-4acd-877e-2e55ec956c36\" (UID: \"531e8f1d-a948-4acd-877e-2e55ec956c36\") " Apr 16 18:20:29.475637 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:29.475454 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/531e8f1d-a948-4acd-877e-2e55ec956c36-console-serving-cert\") pod \"531e8f1d-a948-4acd-877e-2e55ec956c36\" (UID: \"531e8f1d-a948-4acd-877e-2e55ec956c36\") " Apr 16 18:20:29.475637 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:29.475488 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/531e8f1d-a948-4acd-877e-2e55ec956c36-console-oauth-config\") pod \"531e8f1d-a948-4acd-877e-2e55ec956c36\" (UID: \"531e8f1d-a948-4acd-877e-2e55ec956c36\") " Apr 16 18:20:29.475637 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:29.475577 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/531e8f1d-a948-4acd-877e-2e55ec956c36-console-config\") pod \"531e8f1d-a948-4acd-877e-2e55ec956c36\" (UID: \"531e8f1d-a948-4acd-877e-2e55ec956c36\") " Apr 16 18:20:29.475637 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:29.475629 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/531e8f1d-a948-4acd-877e-2e55ec956c36-trusted-ca-bundle\") pod \"531e8f1d-a948-4acd-877e-2e55ec956c36\" (UID: 
\"531e8f1d-a948-4acd-877e-2e55ec956c36\") " Apr 16 18:20:29.475837 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:29.475664 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6q5m9\" (UniqueName: \"kubernetes.io/projected/531e8f1d-a948-4acd-877e-2e55ec956c36-kube-api-access-6q5m9\") pod \"531e8f1d-a948-4acd-877e-2e55ec956c36\" (UID: \"531e8f1d-a948-4acd-877e-2e55ec956c36\") " Apr 16 18:20:29.475837 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:29.475718 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/531e8f1d-a948-4acd-877e-2e55ec956c36-service-ca\") pod \"531e8f1d-a948-4acd-877e-2e55ec956c36\" (UID: \"531e8f1d-a948-4acd-877e-2e55ec956c36\") " Apr 16 18:20:29.475942 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:29.475812 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/531e8f1d-a948-4acd-877e-2e55ec956c36-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "531e8f1d-a948-4acd-877e-2e55ec956c36" (UID: "531e8f1d-a948-4acd-877e-2e55ec956c36"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:20:29.476033 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:29.476004 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/531e8f1d-a948-4acd-877e-2e55ec956c36-console-config" (OuterVolumeSpecName: "console-config") pod "531e8f1d-a948-4acd-877e-2e55ec956c36" (UID: "531e8f1d-a948-4acd-877e-2e55ec956c36"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:20:29.476033 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:29.476018 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/531e8f1d-a948-4acd-877e-2e55ec956c36-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "531e8f1d-a948-4acd-877e-2e55ec956c36" (UID: "531e8f1d-a948-4acd-877e-2e55ec956c36"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:20:29.476234 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:29.476204 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/531e8f1d-a948-4acd-877e-2e55ec956c36-service-ca" (OuterVolumeSpecName: "service-ca") pod "531e8f1d-a948-4acd-877e-2e55ec956c36" (UID: "531e8f1d-a948-4acd-877e-2e55ec956c36"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:20:29.478226 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:29.478194 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/531e8f1d-a948-4acd-877e-2e55ec956c36-kube-api-access-6q5m9" (OuterVolumeSpecName: "kube-api-access-6q5m9") pod "531e8f1d-a948-4acd-877e-2e55ec956c36" (UID: "531e8f1d-a948-4acd-877e-2e55ec956c36"). InnerVolumeSpecName "kube-api-access-6q5m9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:20:29.478329 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:29.478307 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/531e8f1d-a948-4acd-877e-2e55ec956c36-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "531e8f1d-a948-4acd-877e-2e55ec956c36" (UID: "531e8f1d-a948-4acd-877e-2e55ec956c36"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:20:29.478379 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:29.478332 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/531e8f1d-a948-4acd-877e-2e55ec956c36-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "531e8f1d-a948-4acd-877e-2e55ec956c36" (UID: "531e8f1d-a948-4acd-877e-2e55ec956c36"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:20:29.576849 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:29.576791 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/531e8f1d-a948-4acd-877e-2e55ec956c36-console-oauth-config\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\"" Apr 16 18:20:29.576849 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:29.576846 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/531e8f1d-a948-4acd-877e-2e55ec956c36-console-config\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\"" Apr 16 18:20:29.576849 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:29.576857 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/531e8f1d-a948-4acd-877e-2e55ec956c36-trusted-ca-bundle\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\"" Apr 16 18:20:29.576849 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:29.576867 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6q5m9\" (UniqueName: \"kubernetes.io/projected/531e8f1d-a948-4acd-877e-2e55ec956c36-kube-api-access-6q5m9\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\"" Apr 16 18:20:29.577110 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:29.576876 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/531e8f1d-a948-4acd-877e-2e55ec956c36-service-ca\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\"" Apr 16 18:20:29.577110 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:29.576885 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/531e8f1d-a948-4acd-877e-2e55ec956c36-oauth-serving-cert\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\"" Apr 16 18:20:29.577110 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:29.576893 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/531e8f1d-a948-4acd-877e-2e55ec956c36-console-serving-cert\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\"" Apr 16 18:20:30.088458 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:30.088432 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7946f4fd7c-7222d_531e8f1d-a948-4acd-877e-2e55ec956c36/console/0.log" Apr 16 18:20:30.088618 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:30.088471 2573 generic.go:358] "Generic (PLEG): container finished" podID="531e8f1d-a948-4acd-877e-2e55ec956c36" containerID="5b2f8523d6f23cc49157f949bada3db2545d3b6abbdd3481f44c9628bdfc14b1" exitCode=2 Apr 16 18:20:30.088618 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:30.088506 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7946f4fd7c-7222d" event={"ID":"531e8f1d-a948-4acd-877e-2e55ec956c36","Type":"ContainerDied","Data":"5b2f8523d6f23cc49157f949bada3db2545d3b6abbdd3481f44c9628bdfc14b1"} Apr 16 18:20:30.088618 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:30.088546 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7946f4fd7c-7222d" event={"ID":"531e8f1d-a948-4acd-877e-2e55ec956c36","Type":"ContainerDied","Data":"5bd7fbadf74398c8c6d8c82a69bc454c5f57abb6d346638fc603d5a8d12b071d"} Apr 16 18:20:30.088618 
ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:30.088565 2573 scope.go:117] "RemoveContainer" containerID="5b2f8523d6f23cc49157f949bada3db2545d3b6abbdd3481f44c9628bdfc14b1" Apr 16 18:20:30.088797 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:30.088565 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7946f4fd7c-7222d" Apr 16 18:20:30.101270 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:30.101088 2573 scope.go:117] "RemoveContainer" containerID="5b2f8523d6f23cc49157f949bada3db2545d3b6abbdd3481f44c9628bdfc14b1" Apr 16 18:20:30.101495 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:20:30.101354 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b2f8523d6f23cc49157f949bada3db2545d3b6abbdd3481f44c9628bdfc14b1\": container with ID starting with 5b2f8523d6f23cc49157f949bada3db2545d3b6abbdd3481f44c9628bdfc14b1 not found: ID does not exist" containerID="5b2f8523d6f23cc49157f949bada3db2545d3b6abbdd3481f44c9628bdfc14b1" Apr 16 18:20:30.101495 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:30.101379 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b2f8523d6f23cc49157f949bada3db2545d3b6abbdd3481f44c9628bdfc14b1"} err="failed to get container status \"5b2f8523d6f23cc49157f949bada3db2545d3b6abbdd3481f44c9628bdfc14b1\": rpc error: code = NotFound desc = could not find container \"5b2f8523d6f23cc49157f949bada3db2545d3b6abbdd3481f44c9628bdfc14b1\": container with ID starting with 5b2f8523d6f23cc49157f949bada3db2545d3b6abbdd3481f44c9628bdfc14b1 not found: ID does not exist" Apr 16 18:20:30.112397 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:30.112374 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7946f4fd7c-7222d"] Apr 16 18:20:30.119172 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:30.119154 2573 kubelet.go:2547] "SyncLoop REMOVE" 
source="api" pods=["openshift-console/console-7946f4fd7c-7222d"] Apr 16 18:20:31.221337 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:31.221301 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="531e8f1d-a948-4acd-877e-2e55ec956c36" path="/var/lib/kubelet/pods/531e8f1d-a948-4acd-877e-2e55ec956c36/volumes" Apr 16 18:20:45.826445 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:45.826405 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk"] Apr 16 18:20:45.826984 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:45.826864 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="531e8f1d-a948-4acd-877e-2e55ec956c36" containerName="console" Apr 16 18:20:45.826984 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:45.826882 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="531e8f1d-a948-4acd-877e-2e55ec956c36" containerName="console" Apr 16 18:20:45.826984 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:45.826964 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="531e8f1d-a948-4acd-877e-2e55ec956c36" containerName="console" Apr 16 18:20:45.831624 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:45.831601 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk" Apr 16 18:20:45.834304 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:45.834274 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-sfxb4\"" Apr 16 18:20:45.834633 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:45.834616 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 18:20:45.834846 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:45.834830 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 18:20:45.842347 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:45.842321 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk"] Apr 16 18:20:45.903340 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:45.903305 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/331be663-8b13-4181-8f82-135bd34b4590-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk\" (UID: \"331be663-8b13-4181-8f82-135bd34b4590\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk" Apr 16 18:20:45.903340 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:45.903347 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/331be663-8b13-4181-8f82-135bd34b4590-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk\" (UID: \"331be663-8b13-4181-8f82-135bd34b4590\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk" Apr 16 18:20:45.903577 
ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:45.903431 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cljgn\" (UniqueName: \"kubernetes.io/projected/331be663-8b13-4181-8f82-135bd34b4590-kube-api-access-cljgn\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk\" (UID: \"331be663-8b13-4181-8f82-135bd34b4590\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk" Apr 16 18:20:46.003805 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:46.003770 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cljgn\" (UniqueName: \"kubernetes.io/projected/331be663-8b13-4181-8f82-135bd34b4590-kube-api-access-cljgn\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk\" (UID: \"331be663-8b13-4181-8f82-135bd34b4590\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk" Apr 16 18:20:46.003919 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:46.003818 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/331be663-8b13-4181-8f82-135bd34b4590-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk\" (UID: \"331be663-8b13-4181-8f82-135bd34b4590\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk" Apr 16 18:20:46.003919 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:46.003856 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/331be663-8b13-4181-8f82-135bd34b4590-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk\" (UID: \"331be663-8b13-4181-8f82-135bd34b4590\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk" Apr 16 18:20:46.004265 ip-10-0-130-205 
kubenswrapper[2573]: I0416 18:20:46.004247 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/331be663-8b13-4181-8f82-135bd34b4590-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk\" (UID: \"331be663-8b13-4181-8f82-135bd34b4590\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk" Apr 16 18:20:46.004302 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:46.004283 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/331be663-8b13-4181-8f82-135bd34b4590-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk\" (UID: \"331be663-8b13-4181-8f82-135bd34b4590\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk" Apr 16 18:20:46.012749 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:46.012724 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cljgn\" (UniqueName: \"kubernetes.io/projected/331be663-8b13-4181-8f82-135bd34b4590-kube-api-access-cljgn\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk\" (UID: \"331be663-8b13-4181-8f82-135bd34b4590\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk" Apr 16 18:20:46.140869 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:46.140844 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk" Apr 16 18:20:46.267259 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:46.267222 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk"] Apr 16 18:20:46.270457 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:20:46.270431 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod331be663_8b13_4181_8f82_135bd34b4590.slice/crio-65b9ea76a61f55501f20d64e293786376df24938211c271f3c33b4dcf942c0d5 WatchSource:0}: Error finding container 65b9ea76a61f55501f20d64e293786376df24938211c271f3c33b4dcf942c0d5: Status 404 returned error can't find the container with id 65b9ea76a61f55501f20d64e293786376df24938211c271f3c33b4dcf942c0d5 Apr 16 18:20:46.272830 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:46.272804 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:20:47.140692 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:47.140652 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk" event={"ID":"331be663-8b13-4181-8f82-135bd34b4590","Type":"ContainerStarted","Data":"65b9ea76a61f55501f20d64e293786376df24938211c271f3c33b4dcf942c0d5"} Apr 16 18:20:51.154803 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:51.154763 2573 generic.go:358] "Generic (PLEG): container finished" podID="331be663-8b13-4181-8f82-135bd34b4590" containerID="0d11c0a714e77c7dde2c49a9219d604d673a40864653c49e254131a9a31cdb05" exitCode=0 Apr 16 18:20:51.155178 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:51.154858 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk" 
event={"ID":"331be663-8b13-4181-8f82-135bd34b4590","Type":"ContainerDied","Data":"0d11c0a714e77c7dde2c49a9219d604d673a40864653c49e254131a9a31cdb05"} Apr 16 18:20:53.162580 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:53.162544 2573 generic.go:358] "Generic (PLEG): container finished" podID="331be663-8b13-4181-8f82-135bd34b4590" containerID="df05b4d3210da059db067039d0c4ce69e3ee71d7f5683acdbd8671b7d0f0d423" exitCode=0 Apr 16 18:20:53.163033 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:53.162607 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk" event={"ID":"331be663-8b13-4181-8f82-135bd34b4590","Type":"ContainerDied","Data":"df05b4d3210da059db067039d0c4ce69e3ee71d7f5683acdbd8671b7d0f0d423"} Apr 16 18:20:59.184315 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:59.184280 2573 generic.go:358] "Generic (PLEG): container finished" podID="331be663-8b13-4181-8f82-135bd34b4590" containerID="0deed12dcc0480ae644f94e0c71b2df17ee16066507820d312b9c06fcbad68ba" exitCode=0 Apr 16 18:20:59.184690 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:20:59.184339 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk" event={"ID":"331be663-8b13-4181-8f82-135bd34b4590","Type":"ContainerDied","Data":"0deed12dcc0480ae644f94e0c71b2df17ee16066507820d312b9c06fcbad68ba"} Apr 16 18:21:00.307875 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:00.307850 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk" Apr 16 18:21:00.435014 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:00.434976 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/331be663-8b13-4181-8f82-135bd34b4590-bundle\") pod \"331be663-8b13-4181-8f82-135bd34b4590\" (UID: \"331be663-8b13-4181-8f82-135bd34b4590\") " Apr 16 18:21:00.435166 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:00.435075 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/331be663-8b13-4181-8f82-135bd34b4590-util\") pod \"331be663-8b13-4181-8f82-135bd34b4590\" (UID: \"331be663-8b13-4181-8f82-135bd34b4590\") " Apr 16 18:21:00.435166 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:00.435118 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cljgn\" (UniqueName: \"kubernetes.io/projected/331be663-8b13-4181-8f82-135bd34b4590-kube-api-access-cljgn\") pod \"331be663-8b13-4181-8f82-135bd34b4590\" (UID: \"331be663-8b13-4181-8f82-135bd34b4590\") " Apr 16 18:21:00.435625 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:00.435602 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/331be663-8b13-4181-8f82-135bd34b4590-bundle" (OuterVolumeSpecName: "bundle") pod "331be663-8b13-4181-8f82-135bd34b4590" (UID: "331be663-8b13-4181-8f82-135bd34b4590"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:21:00.437556 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:00.437481 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/331be663-8b13-4181-8f82-135bd34b4590-kube-api-access-cljgn" (OuterVolumeSpecName: "kube-api-access-cljgn") pod "331be663-8b13-4181-8f82-135bd34b4590" (UID: "331be663-8b13-4181-8f82-135bd34b4590"). InnerVolumeSpecName "kube-api-access-cljgn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:21:00.439860 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:00.439833 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/331be663-8b13-4181-8f82-135bd34b4590-util" (OuterVolumeSpecName: "util") pod "331be663-8b13-4181-8f82-135bd34b4590" (UID: "331be663-8b13-4181-8f82-135bd34b4590"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:21:00.536243 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:00.536185 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/331be663-8b13-4181-8f82-135bd34b4590-util\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\"" Apr 16 18:21:00.536243 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:00.536236 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cljgn\" (UniqueName: \"kubernetes.io/projected/331be663-8b13-4181-8f82-135bd34b4590-kube-api-access-cljgn\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\"" Apr 16 18:21:00.536243 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:00.536248 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/331be663-8b13-4181-8f82-135bd34b4590-bundle\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\"" Apr 16 18:21:01.191883 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:01.191842 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk" event={"ID":"331be663-8b13-4181-8f82-135bd34b4590","Type":"ContainerDied","Data":"65b9ea76a61f55501f20d64e293786376df24938211c271f3c33b4dcf942c0d5"} Apr 16 18:21:01.191883 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:01.191878 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmz2xk" Apr 16 18:21:01.192082 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:01.191876 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65b9ea76a61f55501f20d64e293786376df24938211c271f3c33b4dcf942c0d5" Apr 16 18:21:07.639035 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:07.638992 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-clvjf"] Apr 16 18:21:07.639408 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:07.639312 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="331be663-8b13-4181-8f82-135bd34b4590" containerName="util" Apr 16 18:21:07.639408 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:07.639325 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="331be663-8b13-4181-8f82-135bd34b4590" containerName="util" Apr 16 18:21:07.639408 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:07.639332 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="331be663-8b13-4181-8f82-135bd34b4590" containerName="extract" Apr 16 18:21:07.639408 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:07.639337 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="331be663-8b13-4181-8f82-135bd34b4590" containerName="extract" Apr 16 18:21:07.639408 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:07.639345 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="331be663-8b13-4181-8f82-135bd34b4590" containerName="pull" Apr 16 18:21:07.639408 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:07.639350 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="331be663-8b13-4181-8f82-135bd34b4590" containerName="pull" Apr 16 18:21:07.639408 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:07.639398 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="331be663-8b13-4181-8f82-135bd34b4590" containerName="extract" Apr 16 18:21:07.643045 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:07.643025 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-clvjf" Apr 16 18:21:07.645493 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:07.645466 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 18:21:07.645650 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:07.645614 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-qt64j\"" Apr 16 18:21:07.645894 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:07.645877 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 18:21:07.646002 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:07.645985 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 18:21:07.655490 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:07.655468 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-clvjf"] Apr 16 18:21:07.799575 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:07.799488 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5dj6\" (UniqueName: 
\"kubernetes.io/projected/66674989-a42b-4888-b104-140c9877719f-kube-api-access-q5dj6\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-clvjf\" (UID: \"66674989-a42b-4888-b104-140c9877719f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-clvjf" Apr 16 18:21:07.799575 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:07.799574 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/66674989-a42b-4888-b104-140c9877719f-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-clvjf\" (UID: \"66674989-a42b-4888-b104-140c9877719f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-clvjf" Apr 16 18:21:07.900578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:07.900465 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q5dj6\" (UniqueName: \"kubernetes.io/projected/66674989-a42b-4888-b104-140c9877719f-kube-api-access-q5dj6\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-clvjf\" (UID: \"66674989-a42b-4888-b104-140c9877719f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-clvjf" Apr 16 18:21:07.900578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:07.900547 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/66674989-a42b-4888-b104-140c9877719f-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-clvjf\" (UID: \"66674989-a42b-4888-b104-140c9877719f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-clvjf" Apr 16 18:21:07.903185 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:07.903160 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/66674989-a42b-4888-b104-140c9877719f-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-clvjf\" (UID: 
\"66674989-a42b-4888-b104-140c9877719f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-clvjf" Apr 16 18:21:07.921050 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:07.921003 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5dj6\" (UniqueName: \"kubernetes.io/projected/66674989-a42b-4888-b104-140c9877719f-kube-api-access-q5dj6\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-clvjf\" (UID: \"66674989-a42b-4888-b104-140c9877719f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-clvjf" Apr 16 18:21:07.953013 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:07.952979 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-clvjf" Apr 16 18:21:08.082287 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:08.082261 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-clvjf"] Apr 16 18:21:08.084925 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:21:08.084897 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66674989_a42b_4888_b104_140c9877719f.slice/crio-c8c9b622b0c87a45fd2f57ef4c9a1d7700440f9c1b43b99a2788ebf699411717 WatchSource:0}: Error finding container c8c9b622b0c87a45fd2f57ef4c9a1d7700440f9c1b43b99a2788ebf699411717: Status 404 returned error can't find the container with id c8c9b622b0c87a45fd2f57ef4c9a1d7700440f9c1b43b99a2788ebf699411717 Apr 16 18:21:08.215405 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:08.215318 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-clvjf" event={"ID":"66674989-a42b-4888-b104-140c9877719f","Type":"ContainerStarted","Data":"c8c9b622b0c87a45fd2f57ef4c9a1d7700440f9c1b43b99a2788ebf699411717"} Apr 16 18:21:11.227338 ip-10-0-130-205 kubenswrapper[2573]: 
I0416 18:21:11.227298 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-clvjf" event={"ID":"66674989-a42b-4888-b104-140c9877719f","Type":"ContainerStarted","Data":"3a8247008fc639d357775ae9f7d0a8d00518b659e7db5f3879bb4b612d26b6e5"} Apr 16 18:21:11.227739 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:11.227360 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-clvjf" Apr 16 18:21:11.246828 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:11.246689 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-clvjf" podStartSLOduration=1.204306689 podStartE2EDuration="4.246673792s" podCreationTimestamp="2026-04-16 18:21:07 +0000 UTC" firstStartedPulling="2026-04-16 18:21:08.086581138 +0000 UTC m=+339.442723173" lastFinishedPulling="2026-04-16 18:21:11.128948238 +0000 UTC m=+342.485090276" observedRunningTime="2026-04-16 18:21:11.244869565 +0000 UTC m=+342.601011625" watchObservedRunningTime="2026-04-16 18:21:11.246673792 +0000 UTC m=+342.602815851" Apr 16 18:21:11.655338 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:11.655304 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-rv7xh"] Apr 16 18:21:11.657647 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:11.657620 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-rv7xh" Apr 16 18:21:11.659594 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:11.659573 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 16 18:21:11.659707 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:11.659612 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 18:21:11.659878 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:11.659862 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-jqqkh\"" Apr 16 18:21:11.669389 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:11.669363 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-rv7xh"] Apr 16 18:21:11.736988 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:11.736951 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8f94957e-b897-4ba6-8ad5-e33daed0e799-certificates\") pod \"keda-operator-ffbb595cb-rv7xh\" (UID: \"8f94957e-b897-4ba6-8ad5-e33daed0e799\") " pod="openshift-keda/keda-operator-ffbb595cb-rv7xh" Apr 16 18:21:11.736988 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:11.736992 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpgw5\" (UniqueName: \"kubernetes.io/projected/8f94957e-b897-4ba6-8ad5-e33daed0e799-kube-api-access-bpgw5\") pod \"keda-operator-ffbb595cb-rv7xh\" (UID: \"8f94957e-b897-4ba6-8ad5-e33daed0e799\") " pod="openshift-keda/keda-operator-ffbb595cb-rv7xh" Apr 16 18:21:11.737209 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:11.737044 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: 
\"kubernetes.io/configmap/8f94957e-b897-4ba6-8ad5-e33daed0e799-cabundle0\") pod \"keda-operator-ffbb595cb-rv7xh\" (UID: \"8f94957e-b897-4ba6-8ad5-e33daed0e799\") " pod="openshift-keda/keda-operator-ffbb595cb-rv7xh" Apr 16 18:21:11.838276 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:11.838239 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/8f94957e-b897-4ba6-8ad5-e33daed0e799-cabundle0\") pod \"keda-operator-ffbb595cb-rv7xh\" (UID: \"8f94957e-b897-4ba6-8ad5-e33daed0e799\") " pod="openshift-keda/keda-operator-ffbb595cb-rv7xh" Apr 16 18:21:11.838468 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:11.838310 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8f94957e-b897-4ba6-8ad5-e33daed0e799-certificates\") pod \"keda-operator-ffbb595cb-rv7xh\" (UID: \"8f94957e-b897-4ba6-8ad5-e33daed0e799\") " pod="openshift-keda/keda-operator-ffbb595cb-rv7xh" Apr 16 18:21:11.838468 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:11.838328 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bpgw5\" (UniqueName: \"kubernetes.io/projected/8f94957e-b897-4ba6-8ad5-e33daed0e799-kube-api-access-bpgw5\") pod \"keda-operator-ffbb595cb-rv7xh\" (UID: \"8f94957e-b897-4ba6-8ad5-e33daed0e799\") " pod="openshift-keda/keda-operator-ffbb595cb-rv7xh" Apr 16 18:21:11.838468 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:21:11.838428 2573 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 16 18:21:11.838468 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:21:11.838453 2573 secret.go:281] references non-existent secret key: ca.crt Apr 16 18:21:11.838468 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:21:11.838462 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret 
key: ca.crt Apr 16 18:21:11.838701 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:21:11.838478 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-rv7xh: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 16 18:21:11.838701 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:21:11.838566 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8f94957e-b897-4ba6-8ad5-e33daed0e799-certificates podName:8f94957e-b897-4ba6-8ad5-e33daed0e799 nodeName:}" failed. No retries permitted until 2026-04-16 18:21:12.338545614 +0000 UTC m=+343.694687662 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/8f94957e-b897-4ba6-8ad5-e33daed0e799-certificates") pod "keda-operator-ffbb595cb-rv7xh" (UID: "8f94957e-b897-4ba6-8ad5-e33daed0e799") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 16 18:21:11.839003 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:11.838984 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/8f94957e-b897-4ba6-8ad5-e33daed0e799-cabundle0\") pod \"keda-operator-ffbb595cb-rv7xh\" (UID: \"8f94957e-b897-4ba6-8ad5-e33daed0e799\") " pod="openshift-keda/keda-operator-ffbb595cb-rv7xh" Apr 16 18:21:11.847881 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:11.847853 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpgw5\" (UniqueName: \"kubernetes.io/projected/8f94957e-b897-4ba6-8ad5-e33daed0e799-kube-api-access-bpgw5\") pod \"keda-operator-ffbb595cb-rv7xh\" (UID: \"8f94957e-b897-4ba6-8ad5-e33daed0e799\") " pod="openshift-keda/keda-operator-ffbb595cb-rv7xh" Apr 16 18:21:12.002423 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:12.002334 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-kc27b"] Apr 16 18:21:12.004689 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:12.004673 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kc27b" Apr 16 18:21:12.006702 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:12.006675 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 16 18:21:12.013538 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:12.013494 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-kc27b"] Apr 16 18:21:12.140547 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:12.140482 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1bfff43a-ffe7-4727-89cd-3ea3a3d11828-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kc27b\" (UID: \"1bfff43a-ffe7-4727-89cd-3ea3a3d11828\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kc27b" Apr 16 18:21:12.140714 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:12.140572 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km9rj\" (UniqueName: \"kubernetes.io/projected/1bfff43a-ffe7-4727-89cd-3ea3a3d11828-kube-api-access-km9rj\") pod \"keda-metrics-apiserver-7c9f485588-kc27b\" (UID: \"1bfff43a-ffe7-4727-89cd-3ea3a3d11828\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kc27b" Apr 16 18:21:12.140714 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:12.140664 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/1bfff43a-ffe7-4727-89cd-3ea3a3d11828-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-kc27b\" (UID: \"1bfff43a-ffe7-4727-89cd-3ea3a3d11828\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kc27b" Apr 16 18:21:12.241726 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:12.241687 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/1bfff43a-ffe7-4727-89cd-3ea3a3d11828-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-kc27b\" (UID: \"1bfff43a-ffe7-4727-89cd-3ea3a3d11828\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kc27b" Apr 16 18:21:12.242187 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:12.241775 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1bfff43a-ffe7-4727-89cd-3ea3a3d11828-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kc27b\" (UID: \"1bfff43a-ffe7-4727-89cd-3ea3a3d11828\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kc27b" Apr 16 18:21:12.242187 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:12.241823 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-km9rj\" (UniqueName: \"kubernetes.io/projected/1bfff43a-ffe7-4727-89cd-3ea3a3d11828-kube-api-access-km9rj\") pod \"keda-metrics-apiserver-7c9f485588-kc27b\" (UID: \"1bfff43a-ffe7-4727-89cd-3ea3a3d11828\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kc27b" Apr 16 18:21:12.242187 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:21:12.241914 2573 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:21:12.242187 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:21:12.241931 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:21:12.242187 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:21:12.241953 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-kc27b: references non-existent secret key: tls.crt Apr 16 
18:21:12.242187 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:21:12.242004 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1bfff43a-ffe7-4727-89cd-3ea3a3d11828-certificates podName:1bfff43a-ffe7-4727-89cd-3ea3a3d11828 nodeName:}" failed. No retries permitted until 2026-04-16 18:21:12.74198778 +0000 UTC m=+344.098129818 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/1bfff43a-ffe7-4727-89cd-3ea3a3d11828-certificates") pod "keda-metrics-apiserver-7c9f485588-kc27b" (UID: "1bfff43a-ffe7-4727-89cd-3ea3a3d11828") : references non-existent secret key: tls.crt Apr 16 18:21:12.242187 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:12.242132 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/1bfff43a-ffe7-4727-89cd-3ea3a3d11828-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-kc27b\" (UID: \"1bfff43a-ffe7-4727-89cd-3ea3a3d11828\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kc27b" Apr 16 18:21:12.254301 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:12.254224 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-pptbf"] Apr 16 18:21:12.256775 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:12.256758 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-pptbf" Apr 16 18:21:12.259444 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:12.259422 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 16 18:21:12.269535 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:12.269489 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-km9rj\" (UniqueName: \"kubernetes.io/projected/1bfff43a-ffe7-4727-89cd-3ea3a3d11828-kube-api-access-km9rj\") pod \"keda-metrics-apiserver-7c9f485588-kc27b\" (UID: \"1bfff43a-ffe7-4727-89cd-3ea3a3d11828\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kc27b" Apr 16 18:21:12.278715 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:12.278692 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-pptbf"] Apr 16 18:21:12.343234 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:12.343198 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8f94957e-b897-4ba6-8ad5-e33daed0e799-certificates\") pod \"keda-operator-ffbb595cb-rv7xh\" (UID: \"8f94957e-b897-4ba6-8ad5-e33daed0e799\") " pod="openshift-keda/keda-operator-ffbb595cb-rv7xh" Apr 16 18:21:12.343435 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:12.343253 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/521f0201-890e-4441-9e59-6a968b7643fb-certificates\") pod \"keda-admission-cf49989db-pptbf\" (UID: \"521f0201-890e-4441-9e59-6a968b7643fb\") " pod="openshift-keda/keda-admission-cf49989db-pptbf" Apr 16 18:21:12.343435 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:12.343291 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtzxn\" (UniqueName: 
\"kubernetes.io/projected/521f0201-890e-4441-9e59-6a968b7643fb-kube-api-access-qtzxn\") pod \"keda-admission-cf49989db-pptbf\" (UID: \"521f0201-890e-4441-9e59-6a968b7643fb\") " pod="openshift-keda/keda-admission-cf49989db-pptbf" Apr 16 18:21:12.343435 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:21:12.343348 2573 secret.go:281] references non-existent secret key: ca.crt Apr 16 18:21:12.343435 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:21:12.343371 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 18:21:12.343435 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:21:12.343384 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-rv7xh: references non-existent secret key: ca.crt Apr 16 18:21:12.343636 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:21:12.343442 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8f94957e-b897-4ba6-8ad5-e33daed0e799-certificates podName:8f94957e-b897-4ba6-8ad5-e33daed0e799 nodeName:}" failed. No retries permitted until 2026-04-16 18:21:13.343424596 +0000 UTC m=+344.699566635 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/8f94957e-b897-4ba6-8ad5-e33daed0e799-certificates") pod "keda-operator-ffbb595cb-rv7xh" (UID: "8f94957e-b897-4ba6-8ad5-e33daed0e799") : references non-existent secret key: ca.crt Apr 16 18:21:12.444485 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:12.444451 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/521f0201-890e-4441-9e59-6a968b7643fb-certificates\") pod \"keda-admission-cf49989db-pptbf\" (UID: \"521f0201-890e-4441-9e59-6a968b7643fb\") " pod="openshift-keda/keda-admission-cf49989db-pptbf" Apr 16 18:21:12.444485 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:12.444491 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtzxn\" (UniqueName: \"kubernetes.io/projected/521f0201-890e-4441-9e59-6a968b7643fb-kube-api-access-qtzxn\") pod \"keda-admission-cf49989db-pptbf\" (UID: \"521f0201-890e-4441-9e59-6a968b7643fb\") " pod="openshift-keda/keda-admission-cf49989db-pptbf" Apr 16 18:21:12.447206 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:12.447171 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/521f0201-890e-4441-9e59-6a968b7643fb-certificates\") pod \"keda-admission-cf49989db-pptbf\" (UID: \"521f0201-890e-4441-9e59-6a968b7643fb\") " pod="openshift-keda/keda-admission-cf49989db-pptbf" Apr 16 18:21:12.456011 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:12.455976 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtzxn\" (UniqueName: \"kubernetes.io/projected/521f0201-890e-4441-9e59-6a968b7643fb-kube-api-access-qtzxn\") pod \"keda-admission-cf49989db-pptbf\" (UID: \"521f0201-890e-4441-9e59-6a968b7643fb\") " pod="openshift-keda/keda-admission-cf49989db-pptbf" Apr 16 18:21:12.569760 ip-10-0-130-205 
kubenswrapper[2573]: I0416 18:21:12.569666 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-pptbf" Apr 16 18:21:12.739778 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:12.739714 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-pptbf"] Apr 16 18:21:12.742300 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:21:12.742270 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod521f0201_890e_4441_9e59_6a968b7643fb.slice/crio-de71a9e4912c5001f504a2f1080c791efaf6338197b3af2659b144a06c3b2fb2 WatchSource:0}: Error finding container de71a9e4912c5001f504a2f1080c791efaf6338197b3af2659b144a06c3b2fb2: Status 404 returned error can't find the container with id de71a9e4912c5001f504a2f1080c791efaf6338197b3af2659b144a06c3b2fb2 Apr 16 18:21:12.747840 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:12.747811 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1bfff43a-ffe7-4727-89cd-3ea3a3d11828-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kc27b\" (UID: \"1bfff43a-ffe7-4727-89cd-3ea3a3d11828\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kc27b" Apr 16 18:21:12.747968 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:21:12.747951 2573 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:21:12.748026 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:21:12.747971 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:21:12.748026 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:21:12.747990 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-kc27b: references non-existent secret key: tls.crt Apr 16 18:21:12.748099 
ip-10-0-130-205 kubenswrapper[2573]: E0416 18:21:12.748042 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1bfff43a-ffe7-4727-89cd-3ea3a3d11828-certificates podName:1bfff43a-ffe7-4727-89cd-3ea3a3d11828 nodeName:}" failed. No retries permitted until 2026-04-16 18:21:13.748025157 +0000 UTC m=+345.104167214 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/1bfff43a-ffe7-4727-89cd-3ea3a3d11828-certificates") pod "keda-metrics-apiserver-7c9f485588-kc27b" (UID: "1bfff43a-ffe7-4727-89cd-3ea3a3d11828") : references non-existent secret key: tls.crt Apr 16 18:21:13.233893 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:13.233858 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-pptbf" event={"ID":"521f0201-890e-4441-9e59-6a968b7643fb","Type":"ContainerStarted","Data":"de71a9e4912c5001f504a2f1080c791efaf6338197b3af2659b144a06c3b2fb2"} Apr 16 18:21:13.354395 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:13.354338 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8f94957e-b897-4ba6-8ad5-e33daed0e799-certificates\") pod \"keda-operator-ffbb595cb-rv7xh\" (UID: \"8f94957e-b897-4ba6-8ad5-e33daed0e799\") " pod="openshift-keda/keda-operator-ffbb595cb-rv7xh" Apr 16 18:21:13.354872 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:21:13.354465 2573 secret.go:281] references non-existent secret key: ca.crt Apr 16 18:21:13.354872 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:21:13.354487 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 18:21:13.354872 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:21:13.354499 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-rv7xh: references non-existent 
secret key: ca.crt Apr 16 18:21:13.354872 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:21:13.354578 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8f94957e-b897-4ba6-8ad5-e33daed0e799-certificates podName:8f94957e-b897-4ba6-8ad5-e33daed0e799 nodeName:}" failed. No retries permitted until 2026-04-16 18:21:15.354557701 +0000 UTC m=+346.710699741 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/8f94957e-b897-4ba6-8ad5-e33daed0e799-certificates") pod "keda-operator-ffbb595cb-rv7xh" (UID: "8f94957e-b897-4ba6-8ad5-e33daed0e799") : references non-existent secret key: ca.crt Apr 16 18:21:13.760263 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:13.759506 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1bfff43a-ffe7-4727-89cd-3ea3a3d11828-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kc27b\" (UID: \"1bfff43a-ffe7-4727-89cd-3ea3a3d11828\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kc27b" Apr 16 18:21:13.760263 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:21:13.759830 2573 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:21:13.760263 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:21:13.759866 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:21:13.760263 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:21:13.759889 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-kc27b: references non-existent secret key: tls.crt Apr 16 18:21:13.760263 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:21:13.759962 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1bfff43a-ffe7-4727-89cd-3ea3a3d11828-certificates 
podName:1bfff43a-ffe7-4727-89cd-3ea3a3d11828 nodeName:}" failed. No retries permitted until 2026-04-16 18:21:15.759944257 +0000 UTC m=+347.116086293 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/1bfff43a-ffe7-4727-89cd-3ea3a3d11828-certificates") pod "keda-metrics-apiserver-7c9f485588-kc27b" (UID: "1bfff43a-ffe7-4727-89cd-3ea3a3d11828") : references non-existent secret key: tls.crt Apr 16 18:21:14.238151 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:14.238118 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-pptbf" event={"ID":"521f0201-890e-4441-9e59-6a968b7643fb","Type":"ContainerStarted","Data":"f21e07cc909da2c68ec974c4f2bfaef19a9b79e551dd1220e5620a5968af84d6"} Apr 16 18:21:14.238304 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:14.238168 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-pptbf" Apr 16 18:21:14.256199 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:14.256145 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-pptbf" podStartSLOduration=0.926283784 podStartE2EDuration="2.256130205s" podCreationTimestamp="2026-04-16 18:21:12 +0000 UTC" firstStartedPulling="2026-04-16 18:21:12.743388831 +0000 UTC m=+344.099530866" lastFinishedPulling="2026-04-16 18:21:14.073235248 +0000 UTC m=+345.429377287" observedRunningTime="2026-04-16 18:21:14.254543387 +0000 UTC m=+345.610685444" watchObservedRunningTime="2026-04-16 18:21:14.256130205 +0000 UTC m=+345.612272260" Apr 16 18:21:15.374954 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:15.374915 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8f94957e-b897-4ba6-8ad5-e33daed0e799-certificates\") pod \"keda-operator-ffbb595cb-rv7xh\" (UID: 
\"8f94957e-b897-4ba6-8ad5-e33daed0e799\") " pod="openshift-keda/keda-operator-ffbb595cb-rv7xh" Apr 16 18:21:15.375391 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:21:15.375052 2573 secret.go:281] references non-existent secret key: ca.crt Apr 16 18:21:15.375391 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:21:15.375069 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 18:21:15.375391 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:21:15.375082 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-rv7xh: references non-existent secret key: ca.crt Apr 16 18:21:15.375391 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:21:15.375131 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8f94957e-b897-4ba6-8ad5-e33daed0e799-certificates podName:8f94957e-b897-4ba6-8ad5-e33daed0e799 nodeName:}" failed. No retries permitted until 2026-04-16 18:21:19.375117711 +0000 UTC m=+350.731259745 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/8f94957e-b897-4ba6-8ad5-e33daed0e799-certificates") pod "keda-operator-ffbb595cb-rv7xh" (UID: "8f94957e-b897-4ba6-8ad5-e33daed0e799") : references non-existent secret key: ca.crt Apr 16 18:21:15.778019 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:15.777908 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1bfff43a-ffe7-4727-89cd-3ea3a3d11828-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kc27b\" (UID: \"1bfff43a-ffe7-4727-89cd-3ea3a3d11828\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kc27b" Apr 16 18:21:15.780709 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:15.780680 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1bfff43a-ffe7-4727-89cd-3ea3a3d11828-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kc27b\" (UID: \"1bfff43a-ffe7-4727-89cd-3ea3a3d11828\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kc27b" Apr 16 18:21:15.917021 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:15.916967 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kc27b" Apr 16 18:21:16.044947 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:16.044836 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-kc27b"] Apr 16 18:21:16.049017 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:21:16.048989 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bfff43a_ffe7_4727_89cd_3ea3a3d11828.slice/crio-a39e68ae9d8439047075abfa8d08df74dadb6ef6c97a9427a808221788411daa WatchSource:0}: Error finding container a39e68ae9d8439047075abfa8d08df74dadb6ef6c97a9427a808221788411daa: Status 404 returned error can't find the container with id a39e68ae9d8439047075abfa8d08df74dadb6ef6c97a9427a808221788411daa Apr 16 18:21:16.245452 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:16.245412 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kc27b" event={"ID":"1bfff43a-ffe7-4727-89cd-3ea3a3d11828","Type":"ContainerStarted","Data":"a39e68ae9d8439047075abfa8d08df74dadb6ef6c97a9427a808221788411daa"} Apr 16 18:21:18.254745 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:18.254701 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kc27b" event={"ID":"1bfff43a-ffe7-4727-89cd-3ea3a3d11828","Type":"ContainerStarted","Data":"874784f894a5d9d96bd68599adea9ede31bcbf22b34da85cba223c23e7ad7c20"} Apr 16 18:21:18.255122 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:18.254836 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kc27b" Apr 16 18:21:18.274255 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:18.274200 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kc27b" 
podStartSLOduration=5.20342962 podStartE2EDuration="7.274185795s" podCreationTimestamp="2026-04-16 18:21:11 +0000 UTC" firstStartedPulling="2026-04-16 18:21:16.050350951 +0000 UTC m=+347.406492985" lastFinishedPulling="2026-04-16 18:21:18.121107123 +0000 UTC m=+349.477249160" observedRunningTime="2026-04-16 18:21:18.27256758 +0000 UTC m=+349.628709635" watchObservedRunningTime="2026-04-16 18:21:18.274185795 +0000 UTC m=+349.630327850" Apr 16 18:21:19.414117 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:19.414074 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8f94957e-b897-4ba6-8ad5-e33daed0e799-certificates\") pod \"keda-operator-ffbb595cb-rv7xh\" (UID: \"8f94957e-b897-4ba6-8ad5-e33daed0e799\") " pod="openshift-keda/keda-operator-ffbb595cb-rv7xh" Apr 16 18:21:19.416702 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:19.416672 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8f94957e-b897-4ba6-8ad5-e33daed0e799-certificates\") pod \"keda-operator-ffbb595cb-rv7xh\" (UID: \"8f94957e-b897-4ba6-8ad5-e33daed0e799\") " pod="openshift-keda/keda-operator-ffbb595cb-rv7xh" Apr 16 18:21:19.467640 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:19.467593 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-rv7xh" Apr 16 18:21:19.587448 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:19.587392 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-rv7xh"] Apr 16 18:21:19.590183 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:21:19.590154 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f94957e_b897_4ba6_8ad5_e33daed0e799.slice/crio-12c668c7ae63a2be561fb51be071648f0fd2bf7eabf6dd921ab1d7a65a624be4 WatchSource:0}: Error finding container 12c668c7ae63a2be561fb51be071648f0fd2bf7eabf6dd921ab1d7a65a624be4: Status 404 returned error can't find the container with id 12c668c7ae63a2be561fb51be071648f0fd2bf7eabf6dd921ab1d7a65a624be4 Apr 16 18:21:20.261736 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:20.261697 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-rv7xh" event={"ID":"8f94957e-b897-4ba6-8ad5-e33daed0e799","Type":"ContainerStarted","Data":"12c668c7ae63a2be561fb51be071648f0fd2bf7eabf6dd921ab1d7a65a624be4"} Apr 16 18:21:29.262272 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:29.262246 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kc27b" Apr 16 18:21:30.294252 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:30.294213 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-rv7xh" event={"ID":"8f94957e-b897-4ba6-8ad5-e33daed0e799","Type":"ContainerStarted","Data":"592314ddb5b2a856d74b8adab5b6ba0cc715c0cdc667b95bf9937b0394659aff"} Apr 16 18:21:30.294641 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:30.294271 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-rv7xh" Apr 16 18:21:30.310728 ip-10-0-130-205 kubenswrapper[2573]: I0416 
18:21:30.310685 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-rv7xh" podStartSLOduration=9.65779844 podStartE2EDuration="19.310669929s" podCreationTimestamp="2026-04-16 18:21:11 +0000 UTC" firstStartedPulling="2026-04-16 18:21:19.591567574 +0000 UTC m=+350.947709613" lastFinishedPulling="2026-04-16 18:21:29.244439067 +0000 UTC m=+360.600581102" observedRunningTime="2026-04-16 18:21:30.309892433 +0000 UTC m=+361.666034491" watchObservedRunningTime="2026-04-16 18:21:30.310669929 +0000 UTC m=+361.666811983" Apr 16 18:21:32.232048 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:32.232019 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-clvjf" Apr 16 18:21:35.243557 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:35.243501 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-pptbf" Apr 16 18:21:51.299669 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:21:51.299633 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-rv7xh" Apr 16 18:22:03.892463 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:03.892422 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv"] Apr 16 18:22:03.899074 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:03.899054 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv" Apr 16 18:22:03.901203 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:03.901164 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 18:22:03.901503 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:03.901479 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 18:22:03.901503 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:03.901492 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-sfxb4\"" Apr 16 18:22:03.903924 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:03.903903 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv"] Apr 16 18:22:03.982271 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:03.982235 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/416805d0-d41a-426b-b6d7-fd86dd0a73a4-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv\" (UID: \"416805d0-d41a-426b-b6d7-fd86dd0a73a4\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv" Apr 16 18:22:03.982271 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:03.982279 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bw2w\" (UniqueName: \"kubernetes.io/projected/416805d0-d41a-426b-b6d7-fd86dd0a73a4-kube-api-access-2bw2w\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv\" (UID: \"416805d0-d41a-426b-b6d7-fd86dd0a73a4\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv" Apr 16 18:22:03.982493 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:03.982304 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/416805d0-d41a-426b-b6d7-fd86dd0a73a4-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv\" (UID: \"416805d0-d41a-426b-b6d7-fd86dd0a73a4\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv" Apr 16 18:22:04.083084 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:04.083051 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/416805d0-d41a-426b-b6d7-fd86dd0a73a4-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv\" (UID: \"416805d0-d41a-426b-b6d7-fd86dd0a73a4\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv" Apr 16 18:22:04.083084 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:04.083091 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bw2w\" (UniqueName: \"kubernetes.io/projected/416805d0-d41a-426b-b6d7-fd86dd0a73a4-kube-api-access-2bw2w\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv\" (UID: \"416805d0-d41a-426b-b6d7-fd86dd0a73a4\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv" Apr 16 18:22:04.083293 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:04.083115 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/416805d0-d41a-426b-b6d7-fd86dd0a73a4-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv\" (UID: \"416805d0-d41a-426b-b6d7-fd86dd0a73a4\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv" Apr 16 18:22:04.083433 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:04.083415 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/416805d0-d41a-426b-b6d7-fd86dd0a73a4-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv\" (UID: \"416805d0-d41a-426b-b6d7-fd86dd0a73a4\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv" Apr 16 18:22:04.083475 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:04.083446 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/416805d0-d41a-426b-b6d7-fd86dd0a73a4-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv\" (UID: \"416805d0-d41a-426b-b6d7-fd86dd0a73a4\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv" Apr 16 18:22:04.091068 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:04.091047 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bw2w\" (UniqueName: \"kubernetes.io/projected/416805d0-d41a-426b-b6d7-fd86dd0a73a4-kube-api-access-2bw2w\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv\" (UID: \"416805d0-d41a-426b-b6d7-fd86dd0a73a4\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv" Apr 16 18:22:04.209418 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:04.209316 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv"
Apr 16 18:22:04.333120 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:04.333087 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv"]
Apr 16 18:22:04.336544 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:22:04.336499 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod416805d0_d41a_426b_b6d7_fd86dd0a73a4.slice/crio-405bb17bb86b5e3f7b0e72924859edfb408d224770d36e50bf164042bce0d3b0 WatchSource:0}: Error finding container 405bb17bb86b5e3f7b0e72924859edfb408d224770d36e50bf164042bce0d3b0: Status 404 returned error can't find the container with id 405bb17bb86b5e3f7b0e72924859edfb408d224770d36e50bf164042bce0d3b0
Apr 16 18:22:04.408452 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:04.408403 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv" event={"ID":"416805d0-d41a-426b-b6d7-fd86dd0a73a4","Type":"ContainerStarted","Data":"cb95d5c49eceb7457c059015eff3ae7c335ac2f8ec5be89189c8729984f357ce"}
Apr 16 18:22:04.408452 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:04.408453 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv" event={"ID":"416805d0-d41a-426b-b6d7-fd86dd0a73a4","Type":"ContainerStarted","Data":"405bb17bb86b5e3f7b0e72924859edfb408d224770d36e50bf164042bce0d3b0"}
Apr 16 18:22:05.418481 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:05.418444 2573 generic.go:358] "Generic (PLEG): container finished" podID="416805d0-d41a-426b-b6d7-fd86dd0a73a4" containerID="cb95d5c49eceb7457c059015eff3ae7c335ac2f8ec5be89189c8729984f357ce" exitCode=0
Apr 16 18:22:05.418936 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:05.418539 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv" event={"ID":"416805d0-d41a-426b-b6d7-fd86dd0a73a4","Type":"ContainerDied","Data":"cb95d5c49eceb7457c059015eff3ae7c335ac2f8ec5be89189c8729984f357ce"}
Apr 16 18:22:06.423260 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:06.423172 2573 generic.go:358] "Generic (PLEG): container finished" podID="416805d0-d41a-426b-b6d7-fd86dd0a73a4" containerID="eca69fcd5df13bbdb4a86f647a455c3994463db9d59e0783f38cd0edb94eb228" exitCode=0
Apr 16 18:22:06.423260 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:06.423231 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv" event={"ID":"416805d0-d41a-426b-b6d7-fd86dd0a73a4","Type":"ContainerDied","Data":"eca69fcd5df13bbdb4a86f647a455c3994463db9d59e0783f38cd0edb94eb228"}
Apr 16 18:22:07.429070 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:07.429034 2573 generic.go:358] "Generic (PLEG): container finished" podID="416805d0-d41a-426b-b6d7-fd86dd0a73a4" containerID="7f67fd7a522528e63409da872ee569f356d8d56026a2244d66cec2a38140a204" exitCode=0
Apr 16 18:22:07.429433 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:07.429094 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv" event={"ID":"416805d0-d41a-426b-b6d7-fd86dd0a73a4","Type":"ContainerDied","Data":"7f67fd7a522528e63409da872ee569f356d8d56026a2244d66cec2a38140a204"}
Apr 16 18:22:08.558739 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:08.558710 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv"
Apr 16 18:22:08.622584 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:08.622547 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/416805d0-d41a-426b-b6d7-fd86dd0a73a4-util\") pod \"416805d0-d41a-426b-b6d7-fd86dd0a73a4\" (UID: \"416805d0-d41a-426b-b6d7-fd86dd0a73a4\") "
Apr 16 18:22:08.622739 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:08.622606 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/416805d0-d41a-426b-b6d7-fd86dd0a73a4-bundle\") pod \"416805d0-d41a-426b-b6d7-fd86dd0a73a4\" (UID: \"416805d0-d41a-426b-b6d7-fd86dd0a73a4\") "
Apr 16 18:22:08.622739 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:08.622664 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bw2w\" (UniqueName: \"kubernetes.io/projected/416805d0-d41a-426b-b6d7-fd86dd0a73a4-kube-api-access-2bw2w\") pod \"416805d0-d41a-426b-b6d7-fd86dd0a73a4\" (UID: \"416805d0-d41a-426b-b6d7-fd86dd0a73a4\") "
Apr 16 18:22:08.623341 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:08.623301 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/416805d0-d41a-426b-b6d7-fd86dd0a73a4-bundle" (OuterVolumeSpecName: "bundle") pod "416805d0-d41a-426b-b6d7-fd86dd0a73a4" (UID: "416805d0-d41a-426b-b6d7-fd86dd0a73a4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:22:08.624916 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:08.624894 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/416805d0-d41a-426b-b6d7-fd86dd0a73a4-kube-api-access-2bw2w" (OuterVolumeSpecName: "kube-api-access-2bw2w") pod "416805d0-d41a-426b-b6d7-fd86dd0a73a4" (UID: "416805d0-d41a-426b-b6d7-fd86dd0a73a4"). InnerVolumeSpecName "kube-api-access-2bw2w". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:22:08.627918 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:08.627877 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/416805d0-d41a-426b-b6d7-fd86dd0a73a4-util" (OuterVolumeSpecName: "util") pod "416805d0-d41a-426b-b6d7-fd86dd0a73a4" (UID: "416805d0-d41a-426b-b6d7-fd86dd0a73a4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:22:08.723500 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:08.723403 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/416805d0-d41a-426b-b6d7-fd86dd0a73a4-util\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:22:08.723500 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:08.723439 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/416805d0-d41a-426b-b6d7-fd86dd0a73a4-bundle\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:22:08.723500 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:08.723452 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2bw2w\" (UniqueName: \"kubernetes.io/projected/416805d0-d41a-426b-b6d7-fd86dd0a73a4-kube-api-access-2bw2w\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:22:09.436532 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:09.436478 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv" event={"ID":"416805d0-d41a-426b-b6d7-fd86dd0a73a4","Type":"ContainerDied","Data":"405bb17bb86b5e3f7b0e72924859edfb408d224770d36e50bf164042bce0d3b0"}
Apr 16 18:22:09.436532 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:09.436534 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="405bb17bb86b5e3f7b0e72924859edfb408d224770d36e50bf164042bce0d3b0"
Apr 16 18:22:09.436944 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:09.436487 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52t9fv"
Apr 16 18:22:17.867593 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:17.867556 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl"]
Apr 16 18:22:17.867955 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:17.867874 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="416805d0-d41a-426b-b6d7-fd86dd0a73a4" containerName="util"
Apr 16 18:22:17.867955 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:17.867884 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="416805d0-d41a-426b-b6d7-fd86dd0a73a4" containerName="util"
Apr 16 18:22:17.867955 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:17.867903 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="416805d0-d41a-426b-b6d7-fd86dd0a73a4" containerName="pull"
Apr 16 18:22:17.867955 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:17.867908 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="416805d0-d41a-426b-b6d7-fd86dd0a73a4" containerName="pull"
Apr 16 18:22:17.867955 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:17.867916 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="416805d0-d41a-426b-b6d7-fd86dd0a73a4" containerName="extract"
Apr 16 18:22:17.867955 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:17.867922 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="416805d0-d41a-426b-b6d7-fd86dd0a73a4" containerName="extract"
Apr 16 18:22:17.868130 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:17.867970 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="416805d0-d41a-426b-b6d7-fd86dd0a73a4" containerName="extract"
Apr 16 18:22:17.870579 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:17.870562 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl"
Apr 16 18:22:17.872580 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:17.872559 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 18:22:17.872697 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:17.872609 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-sfxb4\""
Apr 16 18:22:17.873101 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:17.873085 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 18:22:17.881035 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:17.881012 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl"]
Apr 16 18:22:18.007187 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:18.007156 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2daebe9b-0a75-47e7-8d24-09e844eb4583-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl\" (UID: \"2daebe9b-0a75-47e7-8d24-09e844eb4583\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl"
Apr 16 18:22:18.007362 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:18.007215 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2daebe9b-0a75-47e7-8d24-09e844eb4583-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl\" (UID: \"2daebe9b-0a75-47e7-8d24-09e844eb4583\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl"
Apr 16 18:22:18.007362 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:18.007247 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wms79\" (UniqueName: \"kubernetes.io/projected/2daebe9b-0a75-47e7-8d24-09e844eb4583-kube-api-access-wms79\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl\" (UID: \"2daebe9b-0a75-47e7-8d24-09e844eb4583\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl"
Apr 16 18:22:18.107922 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:18.107887 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2daebe9b-0a75-47e7-8d24-09e844eb4583-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl\" (UID: \"2daebe9b-0a75-47e7-8d24-09e844eb4583\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl"
Apr 16 18:22:18.108122 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:18.107930 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2daebe9b-0a75-47e7-8d24-09e844eb4583-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl\" (UID: \"2daebe9b-0a75-47e7-8d24-09e844eb4583\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl"
Apr 16 18:22:18.108122 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:18.107955 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wms79\" (UniqueName: \"kubernetes.io/projected/2daebe9b-0a75-47e7-8d24-09e844eb4583-kube-api-access-wms79\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl\" (UID: \"2daebe9b-0a75-47e7-8d24-09e844eb4583\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl"
Apr 16 18:22:18.108353 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:18.108331 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2daebe9b-0a75-47e7-8d24-09e844eb4583-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl\" (UID: \"2daebe9b-0a75-47e7-8d24-09e844eb4583\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl"
Apr 16 18:22:18.108419 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:18.108346 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2daebe9b-0a75-47e7-8d24-09e844eb4583-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl\" (UID: \"2daebe9b-0a75-47e7-8d24-09e844eb4583\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl"
Apr 16 18:22:18.119827 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:18.119773 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wms79\" (UniqueName: \"kubernetes.io/projected/2daebe9b-0a75-47e7-8d24-09e844eb4583-kube-api-access-wms79\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl\" (UID: \"2daebe9b-0a75-47e7-8d24-09e844eb4583\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl"
Apr 16 18:22:18.179496 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:18.179465 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl"
Apr 16 18:22:18.308195 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:18.308172 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl"]
Apr 16 18:22:18.310269 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:22:18.310229 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2daebe9b_0a75_47e7_8d24_09e844eb4583.slice/crio-2196aaf41c220b94591046d752a07a1712e4e8542c179993f9b55e04baa80b01 WatchSource:0}: Error finding container 2196aaf41c220b94591046d752a07a1712e4e8542c179993f9b55e04baa80b01: Status 404 returned error can't find the container with id 2196aaf41c220b94591046d752a07a1712e4e8542c179993f9b55e04baa80b01
Apr 16 18:22:18.465974 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:18.465940 2573 generic.go:358] "Generic (PLEG): container finished" podID="2daebe9b-0a75-47e7-8d24-09e844eb4583" containerID="dc92a397a09c9fc3de54696af3e7f8ea5189cae8aba7960127d009eddcc2c588" exitCode=0
Apr 16 18:22:18.466131 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:18.465980 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl" event={"ID":"2daebe9b-0a75-47e7-8d24-09e844eb4583","Type":"ContainerDied","Data":"dc92a397a09c9fc3de54696af3e7f8ea5189cae8aba7960127d009eddcc2c588"}
Apr 16 18:22:18.466131 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:18.466000 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl" event={"ID":"2daebe9b-0a75-47e7-8d24-09e844eb4583","Type":"ContainerStarted","Data":"2196aaf41c220b94591046d752a07a1712e4e8542c179993f9b55e04baa80b01"}
Apr 16 18:22:21.476831 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:21.476793 2573 generic.go:358] "Generic (PLEG): container finished" podID="2daebe9b-0a75-47e7-8d24-09e844eb4583" containerID="5e0c9a782bbbf7ad4f74afbef3db0069c0dfd8728fa46fedeb5a88b8bbb7f5bd" exitCode=0
Apr 16 18:22:21.477245 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:21.476844 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl" event={"ID":"2daebe9b-0a75-47e7-8d24-09e844eb4583","Type":"ContainerDied","Data":"5e0c9a782bbbf7ad4f74afbef3db0069c0dfd8728fa46fedeb5a88b8bbb7f5bd"}
Apr 16 18:22:22.481883 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:22.481848 2573 generic.go:358] "Generic (PLEG): container finished" podID="2daebe9b-0a75-47e7-8d24-09e844eb4583" containerID="23da5c9b9da96f0caab8c09462249107a0e5ef6482ee0bec44ca2b3cb435a6ee" exitCode=0
Apr 16 18:22:22.482259 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:22.481902 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl" event={"ID":"2daebe9b-0a75-47e7-8d24-09e844eb4583","Type":"ContainerDied","Data":"23da5c9b9da96f0caab8c09462249107a0e5ef6482ee0bec44ca2b3cb435a6ee"}
Apr 16 18:22:23.616559 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:23.616534 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl"
Apr 16 18:22:23.759754 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:23.759650 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2daebe9b-0a75-47e7-8d24-09e844eb4583-bundle\") pod \"2daebe9b-0a75-47e7-8d24-09e844eb4583\" (UID: \"2daebe9b-0a75-47e7-8d24-09e844eb4583\") "
Apr 16 18:22:23.759912 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:23.759764 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2daebe9b-0a75-47e7-8d24-09e844eb4583-util\") pod \"2daebe9b-0a75-47e7-8d24-09e844eb4583\" (UID: \"2daebe9b-0a75-47e7-8d24-09e844eb4583\") "
Apr 16 18:22:23.759912 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:23.759803 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wms79\" (UniqueName: \"kubernetes.io/projected/2daebe9b-0a75-47e7-8d24-09e844eb4583-kube-api-access-wms79\") pod \"2daebe9b-0a75-47e7-8d24-09e844eb4583\" (UID: \"2daebe9b-0a75-47e7-8d24-09e844eb4583\") "
Apr 16 18:22:23.760152 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:23.760122 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2daebe9b-0a75-47e7-8d24-09e844eb4583-bundle" (OuterVolumeSpecName: "bundle") pod "2daebe9b-0a75-47e7-8d24-09e844eb4583" (UID: "2daebe9b-0a75-47e7-8d24-09e844eb4583"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:22:23.762187 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:23.762159 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2daebe9b-0a75-47e7-8d24-09e844eb4583-kube-api-access-wms79" (OuterVolumeSpecName: "kube-api-access-wms79") pod "2daebe9b-0a75-47e7-8d24-09e844eb4583" (UID: "2daebe9b-0a75-47e7-8d24-09e844eb4583"). InnerVolumeSpecName "kube-api-access-wms79". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:22:23.777224 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:23.777169 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2daebe9b-0a75-47e7-8d24-09e844eb4583-util" (OuterVolumeSpecName: "util") pod "2daebe9b-0a75-47e7-8d24-09e844eb4583" (UID: "2daebe9b-0a75-47e7-8d24-09e844eb4583"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:22:23.860453 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:23.860414 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2daebe9b-0a75-47e7-8d24-09e844eb4583-bundle\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:22:23.860453 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:23.860442 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2daebe9b-0a75-47e7-8d24-09e844eb4583-util\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:22:23.860453 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:23.860451 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wms79\" (UniqueName: \"kubernetes.io/projected/2daebe9b-0a75-47e7-8d24-09e844eb4583-kube-api-access-wms79\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:22:24.490587 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:24.490547 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl" event={"ID":"2daebe9b-0a75-47e7-8d24-09e844eb4583","Type":"ContainerDied","Data":"2196aaf41c220b94591046d752a07a1712e4e8542c179993f9b55e04baa80b01"}
Apr 16 18:22:24.490587 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:24.490581 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2196aaf41c220b94591046d752a07a1712e4e8542c179993f9b55e04baa80b01"
Apr 16 18:22:24.490831 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:24.490622 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg7msl"
Apr 16 18:22:40.621788 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:40.621754 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc"]
Apr 16 18:22:40.622213 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:40.622077 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2daebe9b-0a75-47e7-8d24-09e844eb4583" containerName="pull"
Apr 16 18:22:40.622213 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:40.622087 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2daebe9b-0a75-47e7-8d24-09e844eb4583" containerName="pull"
Apr 16 18:22:40.622213 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:40.622097 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2daebe9b-0a75-47e7-8d24-09e844eb4583" containerName="extract"
Apr 16 18:22:40.622213 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:40.622103 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2daebe9b-0a75-47e7-8d24-09e844eb4583" containerName="extract"
Apr 16 18:22:40.622213 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:40.622112 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2daebe9b-0a75-47e7-8d24-09e844eb4583" containerName="util"
Apr 16 18:22:40.622213 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:40.622117 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2daebe9b-0a75-47e7-8d24-09e844eb4583" containerName="util"
Apr 16 18:22:40.622213 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:40.622171 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="2daebe9b-0a75-47e7-8d24-09e844eb4583" containerName="extract"
Apr 16 18:22:40.626731 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:40.626713 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc"
Apr 16 18:22:40.628915 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:40.628896 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 18:22:40.629391 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:40.629375 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 18:22:40.629445 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:40.629402 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-sfxb4\""
Apr 16 18:22:40.632891 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:40.632864 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc"]
Apr 16 18:22:40.689840 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:40.689802 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttjx7\" (UniqueName: \"kubernetes.io/projected/3f6a761b-d5ef-407a-a7ec-7ca8fa134494-kube-api-access-ttjx7\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc\" (UID: \"3f6a761b-d5ef-407a-a7ec-7ca8fa134494\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc"
Apr 16 18:22:40.690010 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:40.689870 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f6a761b-d5ef-407a-a7ec-7ca8fa134494-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc\" (UID: \"3f6a761b-d5ef-407a-a7ec-7ca8fa134494\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc"
Apr 16 18:22:40.690010 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:40.689889 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f6a761b-d5ef-407a-a7ec-7ca8fa134494-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc\" (UID: \"3f6a761b-d5ef-407a-a7ec-7ca8fa134494\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc"
Apr 16 18:22:40.790879 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:40.790827 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttjx7\" (UniqueName: \"kubernetes.io/projected/3f6a761b-d5ef-407a-a7ec-7ca8fa134494-kube-api-access-ttjx7\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc\" (UID: \"3f6a761b-d5ef-407a-a7ec-7ca8fa134494\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc"
Apr 16 18:22:40.791065 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:40.790942 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f6a761b-d5ef-407a-a7ec-7ca8fa134494-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc\" (UID: \"3f6a761b-d5ef-407a-a7ec-7ca8fa134494\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc"
Apr 16 18:22:40.791065 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:40.790973 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f6a761b-d5ef-407a-a7ec-7ca8fa134494-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc\" (UID: \"3f6a761b-d5ef-407a-a7ec-7ca8fa134494\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc"
Apr 16 18:22:40.791370 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:40.791353 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f6a761b-d5ef-407a-a7ec-7ca8fa134494-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc\" (UID: \"3f6a761b-d5ef-407a-a7ec-7ca8fa134494\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc"
Apr 16 18:22:40.791454 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:40.791386 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f6a761b-d5ef-407a-a7ec-7ca8fa134494-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc\" (UID: \"3f6a761b-d5ef-407a-a7ec-7ca8fa134494\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc"
Apr 16 18:22:40.798257 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:40.798229 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttjx7\" (UniqueName: \"kubernetes.io/projected/3f6a761b-d5ef-407a-a7ec-7ca8fa134494-kube-api-access-ttjx7\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc\" (UID: \"3f6a761b-d5ef-407a-a7ec-7ca8fa134494\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc"
Apr 16 18:22:40.937139 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:40.937101 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc"
Apr 16 18:22:41.063680 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:41.063643 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc"]
Apr 16 18:22:41.067663 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:22:41.067634 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f6a761b_d5ef_407a_a7ec_7ca8fa134494.slice/crio-8f363a6ec02d0784b7354d8e599644e0af5890a325f172773972d4398b6b7b67 WatchSource:0}: Error finding container 8f363a6ec02d0784b7354d8e599644e0af5890a325f172773972d4398b6b7b67: Status 404 returned error can't find the container with id 8f363a6ec02d0784b7354d8e599644e0af5890a325f172773972d4398b6b7b67
Apr 16 18:22:41.551558 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:41.551493 2573 generic.go:358] "Generic (PLEG): container finished" podID="3f6a761b-d5ef-407a-a7ec-7ca8fa134494" containerID="43a43c9ad31e65605c6525c7a615a2fb4045eb0998e275be90aa314b8fb19fb7" exitCode=0
Apr 16 18:22:41.551726 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:41.551559 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc" event={"ID":"3f6a761b-d5ef-407a-a7ec-7ca8fa134494","Type":"ContainerDied","Data":"43a43c9ad31e65605c6525c7a615a2fb4045eb0998e275be90aa314b8fb19fb7"}
Apr 16 18:22:41.551726 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:41.551603 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc" event={"ID":"3f6a761b-d5ef-407a-a7ec-7ca8fa134494","Type":"ContainerStarted","Data":"8f363a6ec02d0784b7354d8e599644e0af5890a325f172773972d4398b6b7b67"}
Apr 16 18:22:51.586371 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:51.586345 2573 generic.go:358] "Generic (PLEG): container finished" podID="3f6a761b-d5ef-407a-a7ec-7ca8fa134494" containerID="241ee6d122767e96d53bf2f4c20d74798169aa87ff6a61269b3b42317cf6d5ed" exitCode=0
Apr 16 18:22:51.586674 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:51.586431 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc" event={"ID":"3f6a761b-d5ef-407a-a7ec-7ca8fa134494","Type":"ContainerDied","Data":"241ee6d122767e96d53bf2f4c20d74798169aa87ff6a61269b3b42317cf6d5ed"}
Apr 16 18:22:52.592941 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:52.592902 2573 generic.go:358] "Generic (PLEG): container finished" podID="3f6a761b-d5ef-407a-a7ec-7ca8fa134494" containerID="6b14e89fd19aea2e480bbe6debbffe6583686d9cfb0e3b6df92939842da2af31" exitCode=0
Apr 16 18:22:52.593351 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:52.592961 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc" event={"ID":"3f6a761b-d5ef-407a-a7ec-7ca8fa134494","Type":"ContainerDied","Data":"6b14e89fd19aea2e480bbe6debbffe6583686d9cfb0e3b6df92939842da2af31"}
Apr 16 18:22:53.723139 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:53.723112 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc"
Apr 16 18:22:53.808319 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:53.808283 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttjx7\" (UniqueName: \"kubernetes.io/projected/3f6a761b-d5ef-407a-a7ec-7ca8fa134494-kube-api-access-ttjx7\") pod \"3f6a761b-d5ef-407a-a7ec-7ca8fa134494\" (UID: \"3f6a761b-d5ef-407a-a7ec-7ca8fa134494\") "
Apr 16 18:22:53.808494 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:53.808357 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f6a761b-d5ef-407a-a7ec-7ca8fa134494-bundle\") pod \"3f6a761b-d5ef-407a-a7ec-7ca8fa134494\" (UID: \"3f6a761b-d5ef-407a-a7ec-7ca8fa134494\") "
Apr 16 18:22:53.808494 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:53.808376 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f6a761b-d5ef-407a-a7ec-7ca8fa134494-util\") pod \"3f6a761b-d5ef-407a-a7ec-7ca8fa134494\" (UID: \"3f6a761b-d5ef-407a-a7ec-7ca8fa134494\") "
Apr 16 18:22:53.809294 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:53.809269 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f6a761b-d5ef-407a-a7ec-7ca8fa134494-bundle" (OuterVolumeSpecName: "bundle") pod "3f6a761b-d5ef-407a-a7ec-7ca8fa134494" (UID: "3f6a761b-d5ef-407a-a7ec-7ca8fa134494"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:22:53.810746 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:53.810715 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f6a761b-d5ef-407a-a7ec-7ca8fa134494-kube-api-access-ttjx7" (OuterVolumeSpecName: "kube-api-access-ttjx7") pod "3f6a761b-d5ef-407a-a7ec-7ca8fa134494" (UID: "3f6a761b-d5ef-407a-a7ec-7ca8fa134494"). InnerVolumeSpecName "kube-api-access-ttjx7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:22:53.813432 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:53.813411 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f6a761b-d5ef-407a-a7ec-7ca8fa134494-util" (OuterVolumeSpecName: "util") pod "3f6a761b-d5ef-407a-a7ec-7ca8fa134494" (UID: "3f6a761b-d5ef-407a-a7ec-7ca8fa134494"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:22:53.909685 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:53.909658 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ttjx7\" (UniqueName: \"kubernetes.io/projected/3f6a761b-d5ef-407a-a7ec-7ca8fa134494-kube-api-access-ttjx7\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:22:53.909685 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:53.909682 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f6a761b-d5ef-407a-a7ec-7ca8fa134494-bundle\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:22:53.909685 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:53.909694 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f6a761b-d5ef-407a-a7ec-7ca8fa134494-util\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:22:54.600775 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:54.600742 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc" event={"ID":"3f6a761b-d5ef-407a-a7ec-7ca8fa134494","Type":"ContainerDied","Data":"8f363a6ec02d0784b7354d8e599644e0af5890a325f172773972d4398b6b7b67"}
Apr 16 18:22:54.600775 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:54.600775 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f363a6ec02d0784b7354d8e599644e0af5890a325f172773972d4398b6b7b67"
Apr 16 18:22:54.600976 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:22:54.600794 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835c89qc"
Apr 16 18:23:00.302534 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:00.302479 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2"]
Apr 16 18:23:00.302963 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:00.302820 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f6a761b-d5ef-407a-a7ec-7ca8fa134494" containerName="extract"
Apr 16 18:23:00.302963 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:00.302832 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f6a761b-d5ef-407a-a7ec-7ca8fa134494" containerName="extract"
Apr 16 18:23:00.302963 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:00.302841 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f6a761b-d5ef-407a-a7ec-7ca8fa134494" containerName="util"
Apr 16 18:23:00.302963 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:00.302846 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f6a761b-d5ef-407a-a7ec-7ca8fa134494" containerName="util"
Apr 16 18:23:00.302963 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:00.302860 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f6a761b-d5ef-407a-a7ec-7ca8fa134494" containerName="pull"
Apr 16 18:23:00.302963 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:00.302866 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f6a761b-d5ef-407a-a7ec-7ca8fa134494" containerName="pull"
Apr 16 18:23:00.302963 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:00.302911 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f6a761b-d5ef-407a-a7ec-7ca8fa134494" containerName="extract"
Apr 16 18:23:00.308113 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:00.308093 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2"
Apr 16 18:23:00.310656 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:00.310624 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 18:23:00.310789 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:00.310711 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-sfxb4\""
Apr 16 18:23:00.310789 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:00.310730 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 18:23:00.315614 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:00.315590 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2"]
Apr 16 18:23:00.362544 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:00.362488 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d74a680c-840a-4bc7-b968-8a376255edbb-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2\" (UID: 
\"d74a680c-840a-4bc7-b968-8a376255edbb\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2" Apr 16 18:23:00.362707 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:00.362558 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zp6b\" (UniqueName: \"kubernetes.io/projected/d74a680c-840a-4bc7-b968-8a376255edbb-kube-api-access-2zp6b\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2\" (UID: \"d74a680c-840a-4bc7-b968-8a376255edbb\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2" Apr 16 18:23:00.362707 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:00.362620 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d74a680c-840a-4bc7-b968-8a376255edbb-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2\" (UID: \"d74a680c-840a-4bc7-b968-8a376255edbb\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2" Apr 16 18:23:00.464105 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:00.464051 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d74a680c-840a-4bc7-b968-8a376255edbb-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2\" (UID: \"d74a680c-840a-4bc7-b968-8a376255edbb\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2" Apr 16 18:23:00.464268 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:00.464114 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2zp6b\" (UniqueName: \"kubernetes.io/projected/d74a680c-840a-4bc7-b968-8a376255edbb-kube-api-access-2zp6b\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2\" (UID: 
\"d74a680c-840a-4bc7-b968-8a376255edbb\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2" Apr 16 18:23:00.464268 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:00.464170 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d74a680c-840a-4bc7-b968-8a376255edbb-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2\" (UID: \"d74a680c-840a-4bc7-b968-8a376255edbb\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2" Apr 16 18:23:00.464480 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:00.464461 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d74a680c-840a-4bc7-b968-8a376255edbb-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2\" (UID: \"d74a680c-840a-4bc7-b968-8a376255edbb\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2" Apr 16 18:23:00.464571 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:00.464553 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d74a680c-840a-4bc7-b968-8a376255edbb-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2\" (UID: \"d74a680c-840a-4bc7-b968-8a376255edbb\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2" Apr 16 18:23:00.479697 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:00.479666 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zp6b\" (UniqueName: \"kubernetes.io/projected/d74a680c-840a-4bc7-b968-8a376255edbb-kube-api-access-2zp6b\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2\" (UID: \"d74a680c-840a-4bc7-b968-8a376255edbb\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2" Apr 16 18:23:00.618791 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:00.618703 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2" Apr 16 18:23:00.757790 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:00.757749 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2"] Apr 16 18:23:00.761121 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:23:00.761096 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd74a680c_840a_4bc7_b968_8a376255edbb.slice/crio-bab47e12606e670c9a0c7d952d0d36fed977205ac33d33600c6a5ce63fa4ff86 WatchSource:0}: Error finding container bab47e12606e670c9a0c7d952d0d36fed977205ac33d33600c6a5ce63fa4ff86: Status 404 returned error can't find the container with id bab47e12606e670c9a0c7d952d0d36fed977205ac33d33600c6a5ce63fa4ff86 Apr 16 18:23:01.626542 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:01.626497 2573 generic.go:358] "Generic (PLEG): container finished" podID="d74a680c-840a-4bc7-b968-8a376255edbb" containerID="a4ff973ba2a43795ebeb4e7f82da78d3561d541bf104b6e4675b3fd707fd65aa" exitCode=0 Apr 16 18:23:01.626907 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:01.626584 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2" event={"ID":"d74a680c-840a-4bc7-b968-8a376255edbb","Type":"ContainerDied","Data":"a4ff973ba2a43795ebeb4e7f82da78d3561d541bf104b6e4675b3fd707fd65aa"} Apr 16 18:23:01.626907 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:01.626629 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2" event={"ID":"d74a680c-840a-4bc7-b968-8a376255edbb","Type":"ContainerStarted","Data":"bab47e12606e670c9a0c7d952d0d36fed977205ac33d33600c6a5ce63fa4ff86"} Apr 16 18:23:01.802582 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:01.802478 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-7hgzq"] Apr 16 18:23:01.805919 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:01.805900 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-7hgzq" Apr 16 18:23:01.808216 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:01.808197 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 16 18:23:01.808448 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:01.808435 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-7qqbz\"" Apr 16 18:23:01.808778 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:01.808762 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 16 18:23:01.836280 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:01.836236 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-7hgzq"] Apr 16 18:23:01.877222 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:01.877188 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/c8ac2fa4-9533-4e70-bf30-163c4ecfbba4-operator-config\") pod \"servicemesh-operator3-55f49c5f94-7hgzq\" (UID: \"c8ac2fa4-9533-4e70-bf30-163c4ecfbba4\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-7hgzq" Apr 16 
18:23:01.877388 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:01.877240 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dhrk\" (UniqueName: \"kubernetes.io/projected/c8ac2fa4-9533-4e70-bf30-163c4ecfbba4-kube-api-access-7dhrk\") pod \"servicemesh-operator3-55f49c5f94-7hgzq\" (UID: \"c8ac2fa4-9533-4e70-bf30-163c4ecfbba4\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-7hgzq" Apr 16 18:23:01.977718 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:01.977685 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7dhrk\" (UniqueName: \"kubernetes.io/projected/c8ac2fa4-9533-4e70-bf30-163c4ecfbba4-kube-api-access-7dhrk\") pod \"servicemesh-operator3-55f49c5f94-7hgzq\" (UID: \"c8ac2fa4-9533-4e70-bf30-163c4ecfbba4\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-7hgzq" Apr 16 18:23:01.977903 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:01.977763 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/c8ac2fa4-9533-4e70-bf30-163c4ecfbba4-operator-config\") pod \"servicemesh-operator3-55f49c5f94-7hgzq\" (UID: \"c8ac2fa4-9533-4e70-bf30-163c4ecfbba4\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-7hgzq" Apr 16 18:23:01.980338 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:01.980315 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/c8ac2fa4-9533-4e70-bf30-163c4ecfbba4-operator-config\") pod \"servicemesh-operator3-55f49c5f94-7hgzq\" (UID: \"c8ac2fa4-9533-4e70-bf30-163c4ecfbba4\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-7hgzq" Apr 16 18:23:01.987625 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:01.987596 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dhrk\" (UniqueName: 
\"kubernetes.io/projected/c8ac2fa4-9533-4e70-bf30-163c4ecfbba4-kube-api-access-7dhrk\") pod \"servicemesh-operator3-55f49c5f94-7hgzq\" (UID: \"c8ac2fa4-9533-4e70-bf30-163c4ecfbba4\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-7hgzq" Apr 16 18:23:02.115108 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:02.115020 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-7hgzq" Apr 16 18:23:02.246533 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:02.246487 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-7hgzq"] Apr 16 18:23:02.247370 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:23:02.247346 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8ac2fa4_9533_4e70_bf30_163c4ecfbba4.slice/crio-b1c65629b4bb792680b4d171eac575c7546244d892f2fcca09a5afe01c38daf8 WatchSource:0}: Error finding container b1c65629b4bb792680b4d171eac575c7546244d892f2fcca09a5afe01c38daf8: Status 404 returned error can't find the container with id b1c65629b4bb792680b4d171eac575c7546244d892f2fcca09a5afe01c38daf8 Apr 16 18:23:02.631805 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:02.631772 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-7hgzq" event={"ID":"c8ac2fa4-9533-4e70-bf30-163c4ecfbba4","Type":"ContainerStarted","Data":"b1c65629b4bb792680b4d171eac575c7546244d892f2fcca09a5afe01c38daf8"} Apr 16 18:23:02.633386 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:02.633363 2573 generic.go:358] "Generic (PLEG): container finished" podID="d74a680c-840a-4bc7-b968-8a376255edbb" containerID="11e96e12cf4a7727f3b4f4b0df5203aeabc4498e2249c002af66032730a50e69" exitCode=0 Apr 16 18:23:02.633509 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:02.633413 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2" event={"ID":"d74a680c-840a-4bc7-b968-8a376255edbb","Type":"ContainerDied","Data":"11e96e12cf4a7727f3b4f4b0df5203aeabc4498e2249c002af66032730a50e69"} Apr 16 18:23:03.639126 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:03.639094 2573 generic.go:358] "Generic (PLEG): container finished" podID="d74a680c-840a-4bc7-b968-8a376255edbb" containerID="602ee72b9feff54533851acebd6d28619410998e69762844715b8b5c11808cc0" exitCode=0 Apr 16 18:23:03.639485 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:03.639155 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2" event={"ID":"d74a680c-840a-4bc7-b968-8a376255edbb","Type":"ContainerDied","Data":"602ee72b9feff54533851acebd6d28619410998e69762844715b8b5c11808cc0"} Apr 16 18:23:04.773184 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:04.773156 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2" Apr 16 18:23:04.902612 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:04.902581 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d74a680c-840a-4bc7-b968-8a376255edbb-util\") pod \"d74a680c-840a-4bc7-b968-8a376255edbb\" (UID: \"d74a680c-840a-4bc7-b968-8a376255edbb\") " Apr 16 18:23:04.902821 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:04.902676 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d74a680c-840a-4bc7-b968-8a376255edbb-bundle\") pod \"d74a680c-840a-4bc7-b968-8a376255edbb\" (UID: \"d74a680c-840a-4bc7-b968-8a376255edbb\") " Apr 16 18:23:04.902821 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:04.902717 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zp6b\" (UniqueName: \"kubernetes.io/projected/d74a680c-840a-4bc7-b968-8a376255edbb-kube-api-access-2zp6b\") pod \"d74a680c-840a-4bc7-b968-8a376255edbb\" (UID: \"d74a680c-840a-4bc7-b968-8a376255edbb\") " Apr 16 18:23:04.903632 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:04.903596 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d74a680c-840a-4bc7-b968-8a376255edbb-bundle" (OuterVolumeSpecName: "bundle") pod "d74a680c-840a-4bc7-b968-8a376255edbb" (UID: "d74a680c-840a-4bc7-b968-8a376255edbb"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:23:04.905445 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:04.905408 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d74a680c-840a-4bc7-b968-8a376255edbb-kube-api-access-2zp6b" (OuterVolumeSpecName: "kube-api-access-2zp6b") pod "d74a680c-840a-4bc7-b968-8a376255edbb" (UID: "d74a680c-840a-4bc7-b968-8a376255edbb"). InnerVolumeSpecName "kube-api-access-2zp6b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:23:04.908368 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:04.908342 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d74a680c-840a-4bc7-b968-8a376255edbb-util" (OuterVolumeSpecName: "util") pod "d74a680c-840a-4bc7-b968-8a376255edbb" (UID: "d74a680c-840a-4bc7-b968-8a376255edbb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:23:05.003579 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.003536 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2zp6b\" (UniqueName: \"kubernetes.io/projected/d74a680c-840a-4bc7-b968-8a376255edbb-kube-api-access-2zp6b\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\"" Apr 16 18:23:05.003579 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.003570 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d74a680c-840a-4bc7-b968-8a376255edbb-util\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\"" Apr 16 18:23:05.003579 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.003581 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d74a680c-840a-4bc7-b968-8a376255edbb-bundle\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\"" Apr 16 18:23:05.648449 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.648412 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-7hgzq" event={"ID":"c8ac2fa4-9533-4e70-bf30-163c4ecfbba4","Type":"ContainerStarted","Data":"760da8232daee5ae46c9f180826e3b2d900b88491654cc6e8f2a8f8d1155d5db"} Apr 16 18:23:05.648682 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.648618 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-7hgzq" Apr 16 18:23:05.650414 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.650369 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2" event={"ID":"d74a680c-840a-4bc7-b968-8a376255edbb","Type":"ContainerDied","Data":"bab47e12606e670c9a0c7d952d0d36fed977205ac33d33600c6a5ce63fa4ff86"} Apr 16 18:23:05.650414 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.650403 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bab47e12606e670c9a0c7d952d0d36fed977205ac33d33600c6a5ce63fa4ff86" Apr 16 18:23:05.650414 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.650406 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c29g8g2" Apr 16 18:23:05.676587 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.675674 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-7hgzq" podStartSLOduration=2.3370095539999998 podStartE2EDuration="4.67565105s" podCreationTimestamp="2026-04-16 18:23:01 +0000 UTC" firstStartedPulling="2026-04-16 18:23:02.249926814 +0000 UTC m=+453.606068849" lastFinishedPulling="2026-04-16 18:23:04.588568298 +0000 UTC m=+455.944710345" observedRunningTime="2026-04-16 18:23:05.671670159 +0000 UTC m=+457.027812215" watchObservedRunningTime="2026-04-16 18:23:05.67565105 +0000 UTC m=+457.031793108" Apr 16 18:23:05.695785 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.695746 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l"] Apr 16 18:23:05.696115 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.696103 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d74a680c-840a-4bc7-b968-8a376255edbb" containerName="pull" Apr 16 18:23:05.696159 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.696118 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d74a680c-840a-4bc7-b968-8a376255edbb" containerName="pull" Apr 16 18:23:05.696159 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.696128 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d74a680c-840a-4bc7-b968-8a376255edbb" containerName="extract" Apr 16 18:23:05.696159 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.696133 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d74a680c-840a-4bc7-b968-8a376255edbb" containerName="extract" Apr 16 18:23:05.696159 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.696144 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="d74a680c-840a-4bc7-b968-8a376255edbb" containerName="util" Apr 16 18:23:05.696159 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.696150 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d74a680c-840a-4bc7-b968-8a376255edbb" containerName="util" Apr 16 18:23:05.696309 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.696208 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d74a680c-840a-4bc7-b968-8a376255edbb" containerName="extract" Apr 16 18:23:05.700408 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.700385 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l" Apr 16 18:23:05.702434 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.702406 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 16 18:23:05.702578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.702470 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-5fqs7\"" Apr 16 18:23:05.702578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.702406 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 16 18:23:05.702578 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.702494 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 16 18:23:05.702886 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.702871 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 16 18:23:05.713602 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.713572 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l"] Apr 16 18:23:05.811071 ip-10-0-130-205 
kubenswrapper[2573]: I0416 18:23:05.811037 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f90a552b-6669-4d7f-9d2a-6ed675ff2a01-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-6pj7l\" (UID: \"f90a552b-6669-4d7f-9d2a-6ed675ff2a01\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l" Apr 16 18:23:05.811456 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.811079 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/f90a552b-6669-4d7f-9d2a-6ed675ff2a01-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-6pj7l\" (UID: \"f90a552b-6669-4d7f-9d2a-6ed675ff2a01\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l" Apr 16 18:23:05.811456 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.811141 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/f90a552b-6669-4d7f-9d2a-6ed675ff2a01-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-6pj7l\" (UID: \"f90a552b-6669-4d7f-9d2a-6ed675ff2a01\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l" Apr 16 18:23:05.811456 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.811192 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/f90a552b-6669-4d7f-9d2a-6ed675ff2a01-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-6pj7l\" (UID: \"f90a552b-6669-4d7f-9d2a-6ed675ff2a01\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l" Apr 16 18:23:05.811456 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.811230 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xzvfs\" (UniqueName: \"kubernetes.io/projected/f90a552b-6669-4d7f-9d2a-6ed675ff2a01-kube-api-access-xzvfs\") pod \"istiod-openshift-gateway-7cd77c7ffd-6pj7l\" (UID: \"f90a552b-6669-4d7f-9d2a-6ed675ff2a01\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l" Apr 16 18:23:05.811456 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.811274 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/f90a552b-6669-4d7f-9d2a-6ed675ff2a01-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-6pj7l\" (UID: \"f90a552b-6669-4d7f-9d2a-6ed675ff2a01\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l" Apr 16 18:23:05.811456 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.811325 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f90a552b-6669-4d7f-9d2a-6ed675ff2a01-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-6pj7l\" (UID: \"f90a552b-6669-4d7f-9d2a-6ed675ff2a01\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l" Apr 16 18:23:05.912660 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.912547 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/f90a552b-6669-4d7f-9d2a-6ed675ff2a01-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-6pj7l\" (UID: \"f90a552b-6669-4d7f-9d2a-6ed675ff2a01\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l" Apr 16 18:23:05.912660 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.912598 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f90a552b-6669-4d7f-9d2a-6ed675ff2a01-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-6pj7l\" (UID: 
\"f90a552b-6669-4d7f-9d2a-6ed675ff2a01\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l" Apr 16 18:23:05.912660 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.912641 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f90a552b-6669-4d7f-9d2a-6ed675ff2a01-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-6pj7l\" (UID: \"f90a552b-6669-4d7f-9d2a-6ed675ff2a01\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l" Apr 16 18:23:05.912660 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.912664 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/f90a552b-6669-4d7f-9d2a-6ed675ff2a01-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-6pj7l\" (UID: \"f90a552b-6669-4d7f-9d2a-6ed675ff2a01\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l" Apr 16 18:23:05.913055 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.912699 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/f90a552b-6669-4d7f-9d2a-6ed675ff2a01-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-6pj7l\" (UID: \"f90a552b-6669-4d7f-9d2a-6ed675ff2a01\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l" Apr 16 18:23:05.913055 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.912727 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/f90a552b-6669-4d7f-9d2a-6ed675ff2a01-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-6pj7l\" (UID: \"f90a552b-6669-4d7f-9d2a-6ed675ff2a01\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l" Apr 16 18:23:05.913055 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.912783 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xzvfs\" (UniqueName: \"kubernetes.io/projected/f90a552b-6669-4d7f-9d2a-6ed675ff2a01-kube-api-access-xzvfs\") pod \"istiod-openshift-gateway-7cd77c7ffd-6pj7l\" (UID: \"f90a552b-6669-4d7f-9d2a-6ed675ff2a01\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l" Apr 16 18:23:05.913535 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.913470 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/f90a552b-6669-4d7f-9d2a-6ed675ff2a01-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-6pj7l\" (UID: \"f90a552b-6669-4d7f-9d2a-6ed675ff2a01\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l" Apr 16 18:23:05.915576 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.915554 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/f90a552b-6669-4d7f-9d2a-6ed675ff2a01-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-6pj7l\" (UID: \"f90a552b-6669-4d7f-9d2a-6ed675ff2a01\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l" Apr 16 18:23:05.915764 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.915747 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/f90a552b-6669-4d7f-9d2a-6ed675ff2a01-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-6pj7l\" (UID: \"f90a552b-6669-4d7f-9d2a-6ed675ff2a01\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l" Apr 16 18:23:05.915828 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.915807 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f90a552b-6669-4d7f-9d2a-6ed675ff2a01-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-6pj7l\" (UID: 
\"f90a552b-6669-4d7f-9d2a-6ed675ff2a01\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l" Apr 16 18:23:05.915867 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.915833 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/f90a552b-6669-4d7f-9d2a-6ed675ff2a01-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-6pj7l\" (UID: \"f90a552b-6669-4d7f-9d2a-6ed675ff2a01\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l" Apr 16 18:23:05.925290 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.925252 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzvfs\" (UniqueName: \"kubernetes.io/projected/f90a552b-6669-4d7f-9d2a-6ed675ff2a01-kube-api-access-xzvfs\") pod \"istiod-openshift-gateway-7cd77c7ffd-6pj7l\" (UID: \"f90a552b-6669-4d7f-9d2a-6ed675ff2a01\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l" Apr 16 18:23:05.925471 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:05.925452 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f90a552b-6669-4d7f-9d2a-6ed675ff2a01-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-6pj7l\" (UID: \"f90a552b-6669-4d7f-9d2a-6ed675ff2a01\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l" Apr 16 18:23:06.010721 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:06.010679 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l" Apr 16 18:23:06.144901 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:06.144866 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l"] Apr 16 18:23:06.147895 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:23:06.147859 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf90a552b_6669_4d7f_9d2a_6ed675ff2a01.slice/crio-bd76ec412d45c7346846efe78c9fba793c7150e25ac3be9ccf69e8b4b91aa8d6 WatchSource:0}: Error finding container bd76ec412d45c7346846efe78c9fba793c7150e25ac3be9ccf69e8b4b91aa8d6: Status 404 returned error can't find the container with id bd76ec412d45c7346846efe78c9fba793c7150e25ac3be9ccf69e8b4b91aa8d6 Apr 16 18:23:06.655588 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:06.655549 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l" event={"ID":"f90a552b-6669-4d7f-9d2a-6ed675ff2a01","Type":"ContainerStarted","Data":"bd76ec412d45c7346846efe78c9fba793c7150e25ac3be9ccf69e8b4b91aa8d6"} Apr 16 18:23:08.586311 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:08.586267 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892148Ki","pods":"250"} Apr 16 18:23:08.586608 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:08.586357 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892148Ki","pods":"250"} Apr 16 18:23:09.669697 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:09.669655 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l" 
event={"ID":"f90a552b-6669-4d7f-9d2a-6ed675ff2a01","Type":"ContainerStarted","Data":"156751ca73a022836b43c2842b67c7a9a2181adf40ae81a917491564c2061d4f"} Apr 16 18:23:09.670110 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:09.669871 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l" Apr 16 18:23:09.671545 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:09.671508 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l" Apr 16 18:23:09.691382 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:09.691319 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-6pj7l" podStartSLOduration=2.255380961 podStartE2EDuration="4.691300641s" podCreationTimestamp="2026-04-16 18:23:05 +0000 UTC" firstStartedPulling="2026-04-16 18:23:06.150065061 +0000 UTC m=+457.506207095" lastFinishedPulling="2026-04-16 18:23:08.585984729 +0000 UTC m=+459.942126775" observedRunningTime="2026-04-16 18:23:09.690966505 +0000 UTC m=+461.047108563" watchObservedRunningTime="2026-04-16 18:23:09.691300641 +0000 UTC m=+461.047442700" Apr 16 18:23:13.306291 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.306250 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn"] Apr 16 18:23:13.310551 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.310509 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:13.313174 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.313142 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-rrwbw\"" Apr 16 18:23:13.331584 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.331554 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn"] Apr 16 18:23:13.487035 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.486987 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/1b515d6e-9256-4415-9c2b-b201c18d1744-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-btdrn\" (UID: \"1b515d6e-9256-4415-9c2b-b201c18d1744\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:13.487233 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.487045 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/1b515d6e-9256-4415-9c2b-b201c18d1744-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-btdrn\" (UID: \"1b515d6e-9256-4415-9c2b-b201c18d1744\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:13.487233 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.487149 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/1b515d6e-9256-4415-9c2b-b201c18d1744-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-btdrn\" (UID: \"1b515d6e-9256-4415-9c2b-b201c18d1744\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:13.487233 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.487192 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/1b515d6e-9256-4415-9c2b-b201c18d1744-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-btdrn\" (UID: \"1b515d6e-9256-4415-9c2b-b201c18d1744\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:13.487233 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.487218 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkrtn\" (UniqueName: \"kubernetes.io/projected/1b515d6e-9256-4415-9c2b-b201c18d1744-kube-api-access-xkrtn\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-btdrn\" (UID: \"1b515d6e-9256-4415-9c2b-b201c18d1744\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:13.487409 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.487252 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/1b515d6e-9256-4415-9c2b-b201c18d1744-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-btdrn\" (UID: \"1b515d6e-9256-4415-9c2b-b201c18d1744\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:13.487409 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.487330 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1b515d6e-9256-4415-9c2b-b201c18d1744-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-btdrn\" (UID: \"1b515d6e-9256-4415-9c2b-b201c18d1744\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:13.487409 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.487376 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/1b515d6e-9256-4415-9c2b-b201c18d1744-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-btdrn\" (UID: \"1b515d6e-9256-4415-9c2b-b201c18d1744\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:13.487550 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.487411 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/1b515d6e-9256-4415-9c2b-b201c18d1744-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-btdrn\" (UID: \"1b515d6e-9256-4415-9c2b-b201c18d1744\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:13.588440 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.588348 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/1b515d6e-9256-4415-9c2b-b201c18d1744-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-btdrn\" (UID: \"1b515d6e-9256-4415-9c2b-b201c18d1744\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:13.588440 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.588406 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/1b515d6e-9256-4415-9c2b-b201c18d1744-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-btdrn\" (UID: \"1b515d6e-9256-4415-9c2b-b201c18d1744\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:13.588440 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.588438 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/1b515d6e-9256-4415-9c2b-b201c18d1744-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-btdrn\" (UID: \"1b515d6e-9256-4415-9c2b-b201c18d1744\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:13.588784 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.588457 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/1b515d6e-9256-4415-9c2b-b201c18d1744-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-btdrn\" (UID: \"1b515d6e-9256-4415-9c2b-b201c18d1744\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:13.588784 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.588474 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xkrtn\" (UniqueName: \"kubernetes.io/projected/1b515d6e-9256-4415-9c2b-b201c18d1744-kube-api-access-xkrtn\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-btdrn\" (UID: \"1b515d6e-9256-4415-9c2b-b201c18d1744\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:13.588784 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.588494 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/1b515d6e-9256-4415-9c2b-b201c18d1744-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-btdrn\" (UID: \"1b515d6e-9256-4415-9c2b-b201c18d1744\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 
18:23:13.588784 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.588541 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1b515d6e-9256-4415-9c2b-b201c18d1744-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-btdrn\" (UID: \"1b515d6e-9256-4415-9c2b-b201c18d1744\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:13.588784 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.588762 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/1b515d6e-9256-4415-9c2b-b201c18d1744-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-btdrn\" (UID: \"1b515d6e-9256-4415-9c2b-b201c18d1744\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:13.589025 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.588818 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/1b515d6e-9256-4415-9c2b-b201c18d1744-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-btdrn\" (UID: \"1b515d6e-9256-4415-9c2b-b201c18d1744\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:13.589025 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.588845 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/1b515d6e-9256-4415-9c2b-b201c18d1744-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-btdrn\" (UID: \"1b515d6e-9256-4415-9c2b-b201c18d1744\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:13.589025 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.588877 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/1b515d6e-9256-4415-9c2b-b201c18d1744-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-btdrn\" (UID: \"1b515d6e-9256-4415-9c2b-b201c18d1744\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:13.589025 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.588918 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/1b515d6e-9256-4415-9c2b-b201c18d1744-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-btdrn\" (UID: \"1b515d6e-9256-4415-9c2b-b201c18d1744\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:13.589214 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.589045 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/1b515d6e-9256-4415-9c2b-b201c18d1744-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-btdrn\" (UID: \"1b515d6e-9256-4415-9c2b-b201c18d1744\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:13.589537 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.589485 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/1b515d6e-9256-4415-9c2b-b201c18d1744-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-btdrn\" (UID: \"1b515d6e-9256-4415-9c2b-b201c18d1744\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:13.591052 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.591031 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: 
\"kubernetes.io/empty-dir/1b515d6e-9256-4415-9c2b-b201c18d1744-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-btdrn\" (UID: \"1b515d6e-9256-4415-9c2b-b201c18d1744\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:13.591141 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.591099 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/1b515d6e-9256-4415-9c2b-b201c18d1744-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-btdrn\" (UID: \"1b515d6e-9256-4415-9c2b-b201c18d1744\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:13.596426 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.596398 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkrtn\" (UniqueName: \"kubernetes.io/projected/1b515d6e-9256-4415-9c2b-b201c18d1744-kube-api-access-xkrtn\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-btdrn\" (UID: \"1b515d6e-9256-4415-9c2b-b201c18d1744\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:13.596503 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.596452 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1b515d6e-9256-4415-9c2b-b201c18d1744-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-btdrn\" (UID: \"1b515d6e-9256-4415-9c2b-b201c18d1744\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:13.629106 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.629071 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:13.767474 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:13.767432 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn"] Apr 16 18:23:13.773337 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:23:13.773302 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b515d6e_9256_4415_9c2b_b201c18d1744.slice/crio-58bf307a4503a9025626ead52c26cbfc3a4468ef3f3138ec82519fcf5bc2dcae WatchSource:0}: Error finding container 58bf307a4503a9025626ead52c26cbfc3a4468ef3f3138ec82519fcf5bc2dcae: Status 404 returned error can't find the container with id 58bf307a4503a9025626ead52c26cbfc3a4468ef3f3138ec82519fcf5bc2dcae Apr 16 18:23:14.690602 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:14.690555 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" event={"ID":"1b515d6e-9256-4415-9c2b-b201c18d1744","Type":"ContainerStarted","Data":"58bf307a4503a9025626ead52c26cbfc3a4468ef3f3138ec82519fcf5bc2dcae"} Apr 16 18:23:16.208989 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:16.208948 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892148Ki","pods":"250"} Apr 16 18:23:16.209269 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:16.209026 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892148Ki","pods":"250"} Apr 16 18:23:16.209269 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:16.209075 2573 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892148Ki","pods":"250"} Apr 16 18:23:16.658274 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:16.658237 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-7hgzq" Apr 16 18:23:16.700016 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:16.699985 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" event={"ID":"1b515d6e-9256-4415-9c2b-b201c18d1744","Type":"ContainerStarted","Data":"9e715be8000db2b5ed486b4002271a556928bc97284fd0c41aa0cab20c83f272"} Apr 16 18:23:16.729631 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:16.729580 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" podStartSLOduration=1.296187275 podStartE2EDuration="3.729564009s" podCreationTimestamp="2026-04-16 18:23:13 +0000 UTC" firstStartedPulling="2026-04-16 18:23:13.775270027 +0000 UTC m=+465.131412063" lastFinishedPulling="2026-04-16 18:23:16.208646753 +0000 UTC m=+467.564788797" observedRunningTime="2026-04-16 18:23:16.728147357 +0000 UTC m=+468.084289415" watchObservedRunningTime="2026-04-16 18:23:16.729564009 +0000 UTC m=+468.085706056" Apr 16 18:23:17.629781 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:17.629748 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:17.634500 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:17.634478 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:17.703832 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:17.703801 2573 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:17.704869 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:17.704848 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-btdrn" Apr 16 18:23:23.313281 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.313240 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc"] Apr 16 18:23:23.317480 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.317457 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc" Apr 16 18:23:23.319618 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.319597 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 18:23:23.319978 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.319957 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 18:23:23.319978 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.319977 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-sfxb4\"" Apr 16 18:23:23.325311 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.325287 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc"] Apr 16 18:23:23.374168 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.374132 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dce716a3-e7d9-403d-a13e-f5d329c0d21a-util\") pod 
\"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc\" (UID: \"dce716a3-e7d9-403d-a13e-f5d329c0d21a\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc" Apr 16 18:23:23.374342 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.374183 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dce716a3-e7d9-403d-a13e-f5d329c0d21a-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc\" (UID: \"dce716a3-e7d9-403d-a13e-f5d329c0d21a\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc" Apr 16 18:23:23.374342 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.374210 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz85n\" (UniqueName: \"kubernetes.io/projected/dce716a3-e7d9-403d-a13e-f5d329c0d21a-kube-api-access-nz85n\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc\" (UID: \"dce716a3-e7d9-403d-a13e-f5d329c0d21a\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc" Apr 16 18:23:23.410885 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.410845 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr"] Apr 16 18:23:23.414797 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.414775 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr" Apr 16 18:23:23.421695 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.421657 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr"] Apr 16 18:23:23.475265 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.475227 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dce716a3-e7d9-403d-a13e-f5d329c0d21a-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc\" (UID: \"dce716a3-e7d9-403d-a13e-f5d329c0d21a\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc" Apr 16 18:23:23.475265 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.475271 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8086120-d977-4efd-b29d-cdf897bddd1c-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr\" (UID: \"a8086120-d977-4efd-b29d-cdf897bddd1c\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr" Apr 16 18:23:23.475496 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.475312 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dce716a3-e7d9-403d-a13e-f5d329c0d21a-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc\" (UID: \"dce716a3-e7d9-403d-a13e-f5d329c0d21a\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc" Apr 16 18:23:23.475496 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.475333 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nz85n\" (UniqueName: 
\"kubernetes.io/projected/dce716a3-e7d9-403d-a13e-f5d329c0d21a-kube-api-access-nz85n\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc\" (UID: \"dce716a3-e7d9-403d-a13e-f5d329c0d21a\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc" Apr 16 18:23:23.475496 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.475388 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v2rr\" (UniqueName: \"kubernetes.io/projected/a8086120-d977-4efd-b29d-cdf897bddd1c-kube-api-access-8v2rr\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr\" (UID: \"a8086120-d977-4efd-b29d-cdf897bddd1c\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr" Apr 16 18:23:23.475496 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.475407 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8086120-d977-4efd-b29d-cdf897bddd1c-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr\" (UID: \"a8086120-d977-4efd-b29d-cdf897bddd1c\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr" Apr 16 18:23:23.475676 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.475623 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dce716a3-e7d9-403d-a13e-f5d329c0d21a-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc\" (UID: \"dce716a3-e7d9-403d-a13e-f5d329c0d21a\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc" Apr 16 18:23:23.475676 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.475666 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/dce716a3-e7d9-403d-a13e-f5d329c0d21a-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc\" (UID: \"dce716a3-e7d9-403d-a13e-f5d329c0d21a\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc" Apr 16 18:23:23.483749 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.483723 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz85n\" (UniqueName: \"kubernetes.io/projected/dce716a3-e7d9-403d-a13e-f5d329c0d21a-kube-api-access-nz85n\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc\" (UID: \"dce716a3-e7d9-403d-a13e-f5d329c0d21a\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc" Apr 16 18:23:23.516084 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.516041 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5"] Apr 16 18:23:23.520678 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.520652 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5" Apr 16 18:23:23.530319 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.530278 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5"] Apr 16 18:23:23.576479 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.576394 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/930915a3-a064-49e9-9979-63cbebb3fa05-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5\" (UID: \"930915a3-a064-49e9-9979-63cbebb3fa05\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5" Apr 16 18:23:23.576680 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.576476 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8v2rr\" (UniqueName: \"kubernetes.io/projected/a8086120-d977-4efd-b29d-cdf897bddd1c-kube-api-access-8v2rr\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr\" (UID: \"a8086120-d977-4efd-b29d-cdf897bddd1c\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr" Apr 16 18:23:23.576680 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.576505 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8086120-d977-4efd-b29d-cdf897bddd1c-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr\" (UID: \"a8086120-d977-4efd-b29d-cdf897bddd1c\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr" Apr 16 18:23:23.576680 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.576560 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"util\" (UniqueName: \"kubernetes.io/empty-dir/930915a3-a064-49e9-9979-63cbebb3fa05-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5\" (UID: \"930915a3-a064-49e9-9979-63cbebb3fa05\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5" Apr 16 18:23:23.576680 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.576604 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97bgd\" (UniqueName: \"kubernetes.io/projected/930915a3-a064-49e9-9979-63cbebb3fa05-kube-api-access-97bgd\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5\" (UID: \"930915a3-a064-49e9-9979-63cbebb3fa05\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5" Apr 16 18:23:23.576680 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.576641 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8086120-d977-4efd-b29d-cdf897bddd1c-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr\" (UID: \"a8086120-d977-4efd-b29d-cdf897bddd1c\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr" Apr 16 18:23:23.576995 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.576974 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8086120-d977-4efd-b29d-cdf897bddd1c-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr\" (UID: \"a8086120-d977-4efd-b29d-cdf897bddd1c\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr" Apr 16 18:23:23.577055 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.577002 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/a8086120-d977-4efd-b29d-cdf897bddd1c-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr\" (UID: \"a8086120-d977-4efd-b29d-cdf897bddd1c\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr" Apr 16 18:23:23.584702 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.584680 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v2rr\" (UniqueName: \"kubernetes.io/projected/a8086120-d977-4efd-b29d-cdf897bddd1c-kube-api-access-8v2rr\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr\" (UID: \"a8086120-d977-4efd-b29d-cdf897bddd1c\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr" Apr 16 18:23:23.618098 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.618062 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc"] Apr 16 18:23:23.621894 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.621878 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc" Apr 16 18:23:23.627572 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.627547 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc" Apr 16 18:23:23.630667 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.630645 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc"] Apr 16 18:23:23.677868 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.677834 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/930915a3-a064-49e9-9979-63cbebb3fa05-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5\" (UID: \"930915a3-a064-49e9-9979-63cbebb3fa05\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5" Apr 16 18:23:23.678060 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.677900 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1331095-4b25-4606-8b82-fdd6f9fa5395-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc\" (UID: \"b1331095-4b25-4606-8b82-fdd6f9fa5395\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc" Apr 16 18:23:23.678060 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.677935 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/930915a3-a064-49e9-9979-63cbebb3fa05-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5\" (UID: \"930915a3-a064-49e9-9979-63cbebb3fa05\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5" Apr 16 18:23:23.678060 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.677969 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97bgd\" (UniqueName: 
\"kubernetes.io/projected/930915a3-a064-49e9-9979-63cbebb3fa05-kube-api-access-97bgd\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5\" (UID: \"930915a3-a064-49e9-9979-63cbebb3fa05\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5" Apr 16 18:23:23.678060 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.678030 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhlgf\" (UniqueName: \"kubernetes.io/projected/b1331095-4b25-4606-8b82-fdd6f9fa5395-kube-api-access-vhlgf\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc\" (UID: \"b1331095-4b25-4606-8b82-fdd6f9fa5395\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc" Apr 16 18:23:23.678278 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.678072 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1331095-4b25-4606-8b82-fdd6f9fa5395-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc\" (UID: \"b1331095-4b25-4606-8b82-fdd6f9fa5395\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc" Apr 16 18:23:23.678382 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.678359 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/930915a3-a064-49e9-9979-63cbebb3fa05-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5\" (UID: \"930915a3-a064-49e9-9979-63cbebb3fa05\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5" Apr 16 18:23:23.678447 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.678385 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/930915a3-a064-49e9-9979-63cbebb3fa05-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5\" (UID: \"930915a3-a064-49e9-9979-63cbebb3fa05\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5" Apr 16 18:23:23.687208 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.687149 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97bgd\" (UniqueName: \"kubernetes.io/projected/930915a3-a064-49e9-9979-63cbebb3fa05-kube-api-access-97bgd\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5\" (UID: \"930915a3-a064-49e9-9979-63cbebb3fa05\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5" Apr 16 18:23:23.725805 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.725769 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr" Apr 16 18:23:23.770500 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.770410 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc"] Apr 16 18:23:23.778456 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.778433 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1331095-4b25-4606-8b82-fdd6f9fa5395-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc\" (UID: \"b1331095-4b25-4606-8b82-fdd6f9fa5395\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc" Apr 16 18:23:23.778587 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.778488 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhlgf\" (UniqueName: 
\"kubernetes.io/projected/b1331095-4b25-4606-8b82-fdd6f9fa5395-kube-api-access-vhlgf\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc\" (UID: \"b1331095-4b25-4606-8b82-fdd6f9fa5395\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc" Apr 16 18:23:23.778688 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.778662 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1331095-4b25-4606-8b82-fdd6f9fa5395-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc\" (UID: \"b1331095-4b25-4606-8b82-fdd6f9fa5395\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc" Apr 16 18:23:23.778862 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.778842 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1331095-4b25-4606-8b82-fdd6f9fa5395-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc\" (UID: \"b1331095-4b25-4606-8b82-fdd6f9fa5395\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc" Apr 16 18:23:23.779015 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.778996 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1331095-4b25-4606-8b82-fdd6f9fa5395-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc\" (UID: \"b1331095-4b25-4606-8b82-fdd6f9fa5395\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc" Apr 16 18:23:23.790027 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.790006 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhlgf\" (UniqueName: \"kubernetes.io/projected/b1331095-4b25-4606-8b82-fdd6f9fa5395-kube-api-access-vhlgf\") pod 
\"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc\" (UID: \"b1331095-4b25-4606-8b82-fdd6f9fa5395\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc" Apr 16 18:23:23.832414 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.832386 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5" Apr 16 18:23:23.871788 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.871758 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr"] Apr 16 18:23:23.873748 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:23:23.873716 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8086120_d977_4efd_b29d_cdf897bddd1c.slice/crio-a3d08aff6dc82caeda846cd3cfc60fd217f51c9708c1ea17449182d67b87f6ff WatchSource:0}: Error finding container a3d08aff6dc82caeda846cd3cfc60fd217f51c9708c1ea17449182d67b87f6ff: Status 404 returned error can't find the container with id a3d08aff6dc82caeda846cd3cfc60fd217f51c9708c1ea17449182d67b87f6ff Apr 16 18:23:23.932899 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.932865 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc" Apr 16 18:23:23.974953 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:23.974932 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5"] Apr 16 18:23:23.977324 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:23:23.977299 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod930915a3_a064_49e9_9979_63cbebb3fa05.slice/crio-59013fc6f55cd17907f1e5b713182636871e04fda48f85fecf774cb2be8d161a WatchSource:0}: Error finding container 59013fc6f55cd17907f1e5b713182636871e04fda48f85fecf774cb2be8d161a: Status 404 returned error can't find the container with id 59013fc6f55cd17907f1e5b713182636871e04fda48f85fecf774cb2be8d161a Apr 16 18:23:24.077821 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:24.077793 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc"] Apr 16 18:23:24.079588 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:23:24.079558 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1331095_4b25_4606_8b82_fdd6f9fa5395.slice/crio-dc5ebefdf1352d67e0c0c9c0bf3f8fb72a1057bafe5e497f1ca652170887d266 WatchSource:0}: Error finding container dc5ebefdf1352d67e0c0c9c0bf3f8fb72a1057bafe5e497f1ca652170887d266: Status 404 returned error can't find the container with id dc5ebefdf1352d67e0c0c9c0bf3f8fb72a1057bafe5e497f1ca652170887d266 Apr 16 18:23:24.731218 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:24.731188 2573 generic.go:358] "Generic (PLEG): container finished" podID="930915a3-a064-49e9-9979-63cbebb3fa05" containerID="4b55d50acdfd404d005e5f744f88ff9a58fdd46da78b3e0928c2b039f8770ef8" exitCode=0 Apr 16 18:23:24.731673 
ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:24.731252 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5" event={"ID":"930915a3-a064-49e9-9979-63cbebb3fa05","Type":"ContainerDied","Data":"4b55d50acdfd404d005e5f744f88ff9a58fdd46da78b3e0928c2b039f8770ef8"} Apr 16 18:23:24.731673 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:24.731283 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5" event={"ID":"930915a3-a064-49e9-9979-63cbebb3fa05","Type":"ContainerStarted","Data":"59013fc6f55cd17907f1e5b713182636871e04fda48f85fecf774cb2be8d161a"} Apr 16 18:23:24.732756 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:24.732626 2573 generic.go:358] "Generic (PLEG): container finished" podID="a8086120-d977-4efd-b29d-cdf897bddd1c" containerID="59c0cd59d740170d98a627ca73613f1d632172fb6f09829b1fb43c7819a36885" exitCode=0 Apr 16 18:23:24.732756 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:24.732651 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr" event={"ID":"a8086120-d977-4efd-b29d-cdf897bddd1c","Type":"ContainerDied","Data":"59c0cd59d740170d98a627ca73613f1d632172fb6f09829b1fb43c7819a36885"} Apr 16 18:23:24.732756 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:24.732684 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr" event={"ID":"a8086120-d977-4efd-b29d-cdf897bddd1c","Type":"ContainerStarted","Data":"a3d08aff6dc82caeda846cd3cfc60fd217f51c9708c1ea17449182d67b87f6ff"} Apr 16 18:23:24.734050 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:24.734034 2573 generic.go:358] "Generic (PLEG): container finished" podID="b1331095-4b25-4606-8b82-fdd6f9fa5395" 
containerID="d50b64630c16cc9e1c442c3fc365b163170b8e9a0f187f759dd113fcb35dfd79" exitCode=0 Apr 16 18:23:24.734100 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:24.734063 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc" event={"ID":"b1331095-4b25-4606-8b82-fdd6f9fa5395","Type":"ContainerDied","Data":"d50b64630c16cc9e1c442c3fc365b163170b8e9a0f187f759dd113fcb35dfd79"} Apr 16 18:23:24.734100 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:24.734088 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc" event={"ID":"b1331095-4b25-4606-8b82-fdd6f9fa5395","Type":"ContainerStarted","Data":"dc5ebefdf1352d67e0c0c9c0bf3f8fb72a1057bafe5e497f1ca652170887d266"} Apr 16 18:23:24.735626 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:24.735609 2573 generic.go:358] "Generic (PLEG): container finished" podID="dce716a3-e7d9-403d-a13e-f5d329c0d21a" containerID="6334e91f4b55a037f01a9a3548781318a87f3954922c6a3391ddcbf1d83d2024" exitCode=0 Apr 16 18:23:24.735701 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:24.735666 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc" event={"ID":"dce716a3-e7d9-403d-a13e-f5d329c0d21a","Type":"ContainerDied","Data":"6334e91f4b55a037f01a9a3548781318a87f3954922c6a3391ddcbf1d83d2024"} Apr 16 18:23:24.735701 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:24.735682 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc" event={"ID":"dce716a3-e7d9-403d-a13e-f5d329c0d21a","Type":"ContainerStarted","Data":"723e9522abf885fa6d3f908eace7c539baaa1faa4279e8f6d5b70cedf1568634"} Apr 16 18:23:25.740960 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:25.740928 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc" event={"ID":"b1331095-4b25-4606-8b82-fdd6f9fa5395","Type":"ContainerStarted","Data":"ce6acc06eeacb8f03b37404502b9995302d90374067a2c940622a56e1c4be6af"} Apr 16 18:23:25.742577 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:25.742554 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc" event={"ID":"dce716a3-e7d9-403d-a13e-f5d329c0d21a","Type":"ContainerStarted","Data":"d041fc0d6c86ab1c3439f6b7b5013e4b422f135fe8981b7c0aa705238cd25322"} Apr 16 18:23:25.744201 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:25.744178 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr" event={"ID":"a8086120-d977-4efd-b29d-cdf897bddd1c","Type":"ContainerStarted","Data":"1eb0a739f5ec5cc0fdfd6b51fb163ca28e324234fbb6c4e166aa5e79bc6c5f82"} Apr 16 18:23:26.749666 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:26.749631 2573 generic.go:358] "Generic (PLEG): container finished" podID="930915a3-a064-49e9-9979-63cbebb3fa05" containerID="7e8423aa63619cdb297425343b60d5b6fccea6a928a8079b8d5b988743735d27" exitCode=0 Apr 16 18:23:26.750103 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:26.749704 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5" event={"ID":"930915a3-a064-49e9-9979-63cbebb3fa05","Type":"ContainerDied","Data":"7e8423aa63619cdb297425343b60d5b6fccea6a928a8079b8d5b988743735d27"} Apr 16 18:23:26.751252 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:26.751231 2573 generic.go:358] "Generic (PLEG): container finished" podID="a8086120-d977-4efd-b29d-cdf897bddd1c" containerID="1eb0a739f5ec5cc0fdfd6b51fb163ca28e324234fbb6c4e166aa5e79bc6c5f82" exitCode=0 Apr 16 18:23:26.751352 ip-10-0-130-205 
kubenswrapper[2573]: I0416 18:23:26.751279 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr" event={"ID":"a8086120-d977-4efd-b29d-cdf897bddd1c","Type":"ContainerDied","Data":"1eb0a739f5ec5cc0fdfd6b51fb163ca28e324234fbb6c4e166aa5e79bc6c5f82"} Apr 16 18:23:26.753111 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:26.753092 2573 generic.go:358] "Generic (PLEG): container finished" podID="b1331095-4b25-4606-8b82-fdd6f9fa5395" containerID="ce6acc06eeacb8f03b37404502b9995302d90374067a2c940622a56e1c4be6af" exitCode=0 Apr 16 18:23:26.753229 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:26.753154 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc" event={"ID":"b1331095-4b25-4606-8b82-fdd6f9fa5395","Type":"ContainerDied","Data":"ce6acc06eeacb8f03b37404502b9995302d90374067a2c940622a56e1c4be6af"} Apr 16 18:23:26.754774 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:26.754753 2573 generic.go:358] "Generic (PLEG): container finished" podID="dce716a3-e7d9-403d-a13e-f5d329c0d21a" containerID="d041fc0d6c86ab1c3439f6b7b5013e4b422f135fe8981b7c0aa705238cd25322" exitCode=0 Apr 16 18:23:26.754871 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:26.754825 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc" event={"ID":"dce716a3-e7d9-403d-a13e-f5d329c0d21a","Type":"ContainerDied","Data":"d041fc0d6c86ab1c3439f6b7b5013e4b422f135fe8981b7c0aa705238cd25322"} Apr 16 18:23:27.760440 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:27.760406 2573 generic.go:358] "Generic (PLEG): container finished" podID="930915a3-a064-49e9-9979-63cbebb3fa05" containerID="3e43c6d40798507ef6326b92fb1e8e292bc5e80be30ad75ea26f9a7d9bbb7ad9" exitCode=0 Apr 16 18:23:27.760908 ip-10-0-130-205 kubenswrapper[2573]: I0416 
18:23:27.760472 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5" event={"ID":"930915a3-a064-49e9-9979-63cbebb3fa05","Type":"ContainerDied","Data":"3e43c6d40798507ef6326b92fb1e8e292bc5e80be30ad75ea26f9a7d9bbb7ad9"} Apr 16 18:23:27.762247 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:27.762225 2573 generic.go:358] "Generic (PLEG): container finished" podID="a8086120-d977-4efd-b29d-cdf897bddd1c" containerID="f37224ca90cce4c8a37799d7ffa4869c5ad7e8f0d35891354b5ce228c65458ee" exitCode=0 Apr 16 18:23:27.762348 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:27.762321 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr" event={"ID":"a8086120-d977-4efd-b29d-cdf897bddd1c","Type":"ContainerDied","Data":"f37224ca90cce4c8a37799d7ffa4869c5ad7e8f0d35891354b5ce228c65458ee"} Apr 16 18:23:27.764136 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:27.764110 2573 generic.go:358] "Generic (PLEG): container finished" podID="b1331095-4b25-4606-8b82-fdd6f9fa5395" containerID="083bc3543e1b3d4cd64b87b27fc1dd7a447ac83ef67ad6bf6c04ff8ec2e032dd" exitCode=0 Apr 16 18:23:27.764239 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:27.764170 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc" event={"ID":"b1331095-4b25-4606-8b82-fdd6f9fa5395","Type":"ContainerDied","Data":"083bc3543e1b3d4cd64b87b27fc1dd7a447ac83ef67ad6bf6c04ff8ec2e032dd"} Apr 16 18:23:27.765936 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:27.765918 2573 generic.go:358] "Generic (PLEG): container finished" podID="dce716a3-e7d9-403d-a13e-f5d329c0d21a" containerID="199fb7cf3ee38098fdfa1d5e8f852764084a083f69707301662b459b8ba43fb7" exitCode=0 Apr 16 18:23:27.766011 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:27.765957 2573 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc" event={"ID":"dce716a3-e7d9-403d-a13e-f5d329c0d21a","Type":"ContainerDied","Data":"199fb7cf3ee38098fdfa1d5e8f852764084a083f69707301662b459b8ba43fb7"} Apr 16 18:23:28.903559 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:28.903537 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc" Apr 16 18:23:28.929074 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:28.929050 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dce716a3-e7d9-403d-a13e-f5d329c0d21a-util\") pod \"dce716a3-e7d9-403d-a13e-f5d329c0d21a\" (UID: \"dce716a3-e7d9-403d-a13e-f5d329c0d21a\") " Apr 16 18:23:28.929247 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:28.929168 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dce716a3-e7d9-403d-a13e-f5d329c0d21a-bundle\") pod \"dce716a3-e7d9-403d-a13e-f5d329c0d21a\" (UID: \"dce716a3-e7d9-403d-a13e-f5d329c0d21a\") " Apr 16 18:23:28.929247 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:28.929214 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nz85n\" (UniqueName: \"kubernetes.io/projected/dce716a3-e7d9-403d-a13e-f5d329c0d21a-kube-api-access-nz85n\") pod \"dce716a3-e7d9-403d-a13e-f5d329c0d21a\" (UID: \"dce716a3-e7d9-403d-a13e-f5d329c0d21a\") " Apr 16 18:23:28.930187 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:28.930159 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dce716a3-e7d9-403d-a13e-f5d329c0d21a-bundle" (OuterVolumeSpecName: "bundle") pod "dce716a3-e7d9-403d-a13e-f5d329c0d21a" (UID: "dce716a3-e7d9-403d-a13e-f5d329c0d21a"). 
InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:23:28.932356 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:28.932319 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dce716a3-e7d9-403d-a13e-f5d329c0d21a-kube-api-access-nz85n" (OuterVolumeSpecName: "kube-api-access-nz85n") pod "dce716a3-e7d9-403d-a13e-f5d329c0d21a" (UID: "dce716a3-e7d9-403d-a13e-f5d329c0d21a"). InnerVolumeSpecName "kube-api-access-nz85n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:23:28.938094 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:28.938062 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dce716a3-e7d9-403d-a13e-f5d329c0d21a-util" (OuterVolumeSpecName: "util") pod "dce716a3-e7d9-403d-a13e-f5d329c0d21a" (UID: "dce716a3-e7d9-403d-a13e-f5d329c0d21a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:23:28.949631 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:28.949611 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc" Apr 16 18:23:28.978060 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:28.978040 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr" Apr 16 18:23:28.981470 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:28.981450 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5" Apr 16 18:23:29.030031 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.029956 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/930915a3-a064-49e9-9979-63cbebb3fa05-bundle\") pod \"930915a3-a064-49e9-9979-63cbebb3fa05\" (UID: \"930915a3-a064-49e9-9979-63cbebb3fa05\") " Apr 16 18:23:29.030167 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.030062 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1331095-4b25-4606-8b82-fdd6f9fa5395-util\") pod \"b1331095-4b25-4606-8b82-fdd6f9fa5395\" (UID: \"b1331095-4b25-4606-8b82-fdd6f9fa5395\") " Apr 16 18:23:29.030167 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.030113 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97bgd\" (UniqueName: \"kubernetes.io/projected/930915a3-a064-49e9-9979-63cbebb3fa05-kube-api-access-97bgd\") pod \"930915a3-a064-49e9-9979-63cbebb3fa05\" (UID: \"930915a3-a064-49e9-9979-63cbebb3fa05\") " Apr 16 18:23:29.030167 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.030145 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v2rr\" (UniqueName: \"kubernetes.io/projected/a8086120-d977-4efd-b29d-cdf897bddd1c-kube-api-access-8v2rr\") pod \"a8086120-d977-4efd-b29d-cdf897bddd1c\" (UID: \"a8086120-d977-4efd-b29d-cdf897bddd1c\") " Apr 16 18:23:29.030291 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.030200 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8086120-d977-4efd-b29d-cdf897bddd1c-bundle\") pod \"a8086120-d977-4efd-b29d-cdf897bddd1c\" (UID: \"a8086120-d977-4efd-b29d-cdf897bddd1c\") " Apr 16 18:23:29.030291 ip-10-0-130-205 
kubenswrapper[2573]: I0416 18:23:29.030227 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8086120-d977-4efd-b29d-cdf897bddd1c-util\") pod \"a8086120-d977-4efd-b29d-cdf897bddd1c\" (UID: \"a8086120-d977-4efd-b29d-cdf897bddd1c\") "
Apr 16 18:23:29.030291 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.030249 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhlgf\" (UniqueName: \"kubernetes.io/projected/b1331095-4b25-4606-8b82-fdd6f9fa5395-kube-api-access-vhlgf\") pod \"b1331095-4b25-4606-8b82-fdd6f9fa5395\" (UID: \"b1331095-4b25-4606-8b82-fdd6f9fa5395\") "
Apr 16 18:23:29.030291 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.030277 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/930915a3-a064-49e9-9979-63cbebb3fa05-util\") pod \"930915a3-a064-49e9-9979-63cbebb3fa05\" (UID: \"930915a3-a064-49e9-9979-63cbebb3fa05\") "
Apr 16 18:23:29.030408 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.030309 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1331095-4b25-4606-8b82-fdd6f9fa5395-bundle\") pod \"b1331095-4b25-4606-8b82-fdd6f9fa5395\" (UID: \"b1331095-4b25-4606-8b82-fdd6f9fa5395\") "
Apr 16 18:23:29.030856 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.030550 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/930915a3-a064-49e9-9979-63cbebb3fa05-bundle" (OuterVolumeSpecName: "bundle") pod "930915a3-a064-49e9-9979-63cbebb3fa05" (UID: "930915a3-a064-49e9-9979-63cbebb3fa05"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:23:29.030856 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.030630 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dce716a3-e7d9-403d-a13e-f5d329c0d21a-util\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:23:29.030856 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.030650 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dce716a3-e7d9-403d-a13e-f5d329c0d21a-bundle\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:23:29.030856 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.030664 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/930915a3-a064-49e9-9979-63cbebb3fa05-bundle\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:23:29.030856 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.030678 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nz85n\" (UniqueName: \"kubernetes.io/projected/dce716a3-e7d9-403d-a13e-f5d329c0d21a-kube-api-access-nz85n\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:23:29.031225 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.031179 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1331095-4b25-4606-8b82-fdd6f9fa5395-bundle" (OuterVolumeSpecName: "bundle") pod "b1331095-4b25-4606-8b82-fdd6f9fa5395" (UID: "b1331095-4b25-4606-8b82-fdd6f9fa5395"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:23:29.031359 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.031334 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8086120-d977-4efd-b29d-cdf897bddd1c-bundle" (OuterVolumeSpecName: "bundle") pod "a8086120-d977-4efd-b29d-cdf897bddd1c" (UID: "a8086120-d977-4efd-b29d-cdf897bddd1c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:23:29.033190 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.033160 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/930915a3-a064-49e9-9979-63cbebb3fa05-kube-api-access-97bgd" (OuterVolumeSpecName: "kube-api-access-97bgd") pod "930915a3-a064-49e9-9979-63cbebb3fa05" (UID: "930915a3-a064-49e9-9979-63cbebb3fa05"). InnerVolumeSpecName "kube-api-access-97bgd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:23:29.033823 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.033792 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8086120-d977-4efd-b29d-cdf897bddd1c-kube-api-access-8v2rr" (OuterVolumeSpecName: "kube-api-access-8v2rr") pod "a8086120-d977-4efd-b29d-cdf897bddd1c" (UID: "a8086120-d977-4efd-b29d-cdf897bddd1c"). InnerVolumeSpecName "kube-api-access-8v2rr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:23:29.033925 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.033836 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1331095-4b25-4606-8b82-fdd6f9fa5395-kube-api-access-vhlgf" (OuterVolumeSpecName: "kube-api-access-vhlgf") pod "b1331095-4b25-4606-8b82-fdd6f9fa5395" (UID: "b1331095-4b25-4606-8b82-fdd6f9fa5395"). InnerVolumeSpecName "kube-api-access-vhlgf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:23:29.035889 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.035867 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8086120-d977-4efd-b29d-cdf897bddd1c-util" (OuterVolumeSpecName: "util") pod "a8086120-d977-4efd-b29d-cdf897bddd1c" (UID: "a8086120-d977-4efd-b29d-cdf897bddd1c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:23:29.038351 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.038326 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1331095-4b25-4606-8b82-fdd6f9fa5395-util" (OuterVolumeSpecName: "util") pod "b1331095-4b25-4606-8b82-fdd6f9fa5395" (UID: "b1331095-4b25-4606-8b82-fdd6f9fa5395"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:23:29.039091 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.039073 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/930915a3-a064-49e9-9979-63cbebb3fa05-util" (OuterVolumeSpecName: "util") pod "930915a3-a064-49e9-9979-63cbebb3fa05" (UID: "930915a3-a064-49e9-9979-63cbebb3fa05"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:23:29.131401 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.131374 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1331095-4b25-4606-8b82-fdd6f9fa5395-util\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:23:29.131401 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.131398 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-97bgd\" (UniqueName: \"kubernetes.io/projected/930915a3-a064-49e9-9979-63cbebb3fa05-kube-api-access-97bgd\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:23:29.131593 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.131410 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8v2rr\" (UniqueName: \"kubernetes.io/projected/a8086120-d977-4efd-b29d-cdf897bddd1c-kube-api-access-8v2rr\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:23:29.131593 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.131420 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8086120-d977-4efd-b29d-cdf897bddd1c-bundle\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:23:29.131593 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.131429 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8086120-d977-4efd-b29d-cdf897bddd1c-util\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:23:29.131593 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.131437 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vhlgf\" (UniqueName: \"kubernetes.io/projected/b1331095-4b25-4606-8b82-fdd6f9fa5395-kube-api-access-vhlgf\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:23:29.131593 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.131445 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/930915a3-a064-49e9-9979-63cbebb3fa05-util\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:23:29.131593 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.131454 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1331095-4b25-4606-8b82-fdd6f9fa5395-bundle\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:23:29.775415 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.775378 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc" event={"ID":"b1331095-4b25-4606-8b82-fdd6f9fa5395","Type":"ContainerDied","Data":"dc5ebefdf1352d67e0c0c9c0bf3f8fb72a1057bafe5e497f1ca652170887d266"}
Apr 16 18:23:29.775415 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.775411 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc5ebefdf1352d67e0c0c9c0bf3f8fb72a1057bafe5e497f1ca652170887d266"
Apr 16 18:23:29.775655 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.775435 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88vvfpc"
Apr 16 18:23:29.777119 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.777093 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc" event={"ID":"dce716a3-e7d9-403d-a13e-f5d329c0d21a","Type":"ContainerDied","Data":"723e9522abf885fa6d3f908eace7c539baaa1faa4279e8f6d5b70cedf1568634"}
Apr 16 18:23:29.777119 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.777111 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503zrtxc"
Apr 16 18:23:29.777119 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.777122 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="723e9522abf885fa6d3f908eace7c539baaa1faa4279e8f6d5b70cedf1568634"
Apr 16 18:23:29.778966 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.778907 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5" event={"ID":"930915a3-a064-49e9-9979-63cbebb3fa05","Type":"ContainerDied","Data":"59013fc6f55cd17907f1e5b713182636871e04fda48f85fecf774cb2be8d161a"}
Apr 16 18:23:29.778966 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.778938 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59013fc6f55cd17907f1e5b713182636871e04fda48f85fecf774cb2be8d161a"
Apr 16 18:23:29.778966 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.778944 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpcbp5"
Apr 16 18:23:29.781012 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.780990 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr" event={"ID":"a8086120-d977-4efd-b29d-cdf897bddd1c","Type":"ContainerDied","Data":"a3d08aff6dc82caeda846cd3cfc60fd217f51c9708c1ea17449182d67b87f6ff"}
Apr 16 18:23:29.781137 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.781119 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3d08aff6dc82caeda846cd3cfc60fd217f51c9708c1ea17449182d67b87f6ff"
Apr 16 18:23:29.781137 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:29.781072 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30l7hjr"
Apr 16 18:23:34.942156 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.942118 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-p5wd8"]
Apr 16 18:23:34.942591 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.942451 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1331095-4b25-4606-8b82-fdd6f9fa5395" containerName="util"
Apr 16 18:23:34.942591 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.942463 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1331095-4b25-4606-8b82-fdd6f9fa5395" containerName="util"
Apr 16 18:23:34.942591 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.942473 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="930915a3-a064-49e9-9979-63cbebb3fa05" containerName="util"
Apr 16 18:23:34.942591 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.942479 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="930915a3-a064-49e9-9979-63cbebb3fa05" containerName="util"
Apr 16 18:23:34.942591 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.942486 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dce716a3-e7d9-403d-a13e-f5d329c0d21a" containerName="util"
Apr 16 18:23:34.942591 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.942493 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="dce716a3-e7d9-403d-a13e-f5d329c0d21a" containerName="util"
Apr 16 18:23:34.942591 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.942501 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="930915a3-a064-49e9-9979-63cbebb3fa05" containerName="extract"
Apr 16 18:23:34.942591 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.942506 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="930915a3-a064-49e9-9979-63cbebb3fa05" containerName="extract"
Apr 16 18:23:34.942591 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.942534 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dce716a3-e7d9-403d-a13e-f5d329c0d21a" containerName="extract"
Apr 16 18:23:34.942591 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.942539 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="dce716a3-e7d9-403d-a13e-f5d329c0d21a" containerName="extract"
Apr 16 18:23:34.942591 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.942545 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1331095-4b25-4606-8b82-fdd6f9fa5395" containerName="extract"
Apr 16 18:23:34.942591 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.942550 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1331095-4b25-4606-8b82-fdd6f9fa5395" containerName="extract"
Apr 16 18:23:34.942591 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.942559 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dce716a3-e7d9-403d-a13e-f5d329c0d21a" containerName="pull"
Apr 16 18:23:34.942591 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.942564 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="dce716a3-e7d9-403d-a13e-f5d329c0d21a" containerName="pull"
Apr 16 18:23:34.942591 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.942570 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8086120-d977-4efd-b29d-cdf897bddd1c" containerName="util"
Apr 16 18:23:34.942591 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.942575 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8086120-d977-4efd-b29d-cdf897bddd1c" containerName="util"
Apr 16 18:23:34.942591 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.942581 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1331095-4b25-4606-8b82-fdd6f9fa5395" containerName="pull"
Apr 16 18:23:34.942591 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.942585 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1331095-4b25-4606-8b82-fdd6f9fa5395" containerName="pull"
Apr 16 18:23:34.942591 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.942591 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="930915a3-a064-49e9-9979-63cbebb3fa05" containerName="pull"
Apr 16 18:23:34.942591 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.942596 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="930915a3-a064-49e9-9979-63cbebb3fa05" containerName="pull"
Apr 16 18:23:34.943206 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.942609 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8086120-d977-4efd-b29d-cdf897bddd1c" containerName="pull"
Apr 16 18:23:34.943206 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.942614 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8086120-d977-4efd-b29d-cdf897bddd1c" containerName="pull"
Apr 16 18:23:34.943206 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.942620 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8086120-d977-4efd-b29d-cdf897bddd1c" containerName="extract"
Apr 16 18:23:34.943206 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.942625 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8086120-d977-4efd-b29d-cdf897bddd1c" containerName="extract"
Apr 16 18:23:34.943206 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.942675 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b1331095-4b25-4606-8b82-fdd6f9fa5395" containerName="extract"
Apr 16 18:23:34.943206 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.942683 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="930915a3-a064-49e9-9979-63cbebb3fa05" containerName="extract"
Apr 16 18:23:34.943206 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.942690 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8086120-d977-4efd-b29d-cdf897bddd1c" containerName="extract"
Apr 16 18:23:34.943206 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.942696 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="dce716a3-e7d9-403d-a13e-f5d329c0d21a" containerName="extract"
Apr 16 18:23:34.947059 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.947037 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-p5wd8"
Apr 16 18:23:34.949227 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.949205 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 16 18:23:34.949709 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.949694 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 16 18:23:34.949799 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.949726 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-tkfsj\""
Apr 16 18:23:34.957652 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.957632 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-p5wd8"]
Apr 16 18:23:34.982577 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:34.982552 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zcn4\" (UniqueName: \"kubernetes.io/projected/11b5abef-e75d-4508-8c33-fce114ff5916-kube-api-access-2zcn4\") pod \"limitador-operator-controller-manager-c7fb4c8d5-p5wd8\" (UID: \"11b5abef-e75d-4508-8c33-fce114ff5916\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-p5wd8"
Apr 16 18:23:35.083687 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:35.083655 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2zcn4\" (UniqueName: \"kubernetes.io/projected/11b5abef-e75d-4508-8c33-fce114ff5916-kube-api-access-2zcn4\") pod \"limitador-operator-controller-manager-c7fb4c8d5-p5wd8\" (UID: \"11b5abef-e75d-4508-8c33-fce114ff5916\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-p5wd8"
Apr 16 18:23:35.095974 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:35.095941 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zcn4\" (UniqueName: \"kubernetes.io/projected/11b5abef-e75d-4508-8c33-fce114ff5916-kube-api-access-2zcn4\") pod \"limitador-operator-controller-manager-c7fb4c8d5-p5wd8\" (UID: \"11b5abef-e75d-4508-8c33-fce114ff5916\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-p5wd8"
Apr 16 18:23:35.258311 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:35.258224 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-p5wd8"
Apr 16 18:23:35.412669 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:35.412646 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-p5wd8"]
Apr 16 18:23:35.414933 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:23:35.414904 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11b5abef_e75d_4508_8c33_fce114ff5916.slice/crio-700afeb2a45f48dcbc8899be0954fea3b6055448b04ab447901951dfed230ec5 WatchSource:0}: Error finding container 700afeb2a45f48dcbc8899be0954fea3b6055448b04ab447901951dfed230ec5: Status 404 returned error can't find the container with id 700afeb2a45f48dcbc8899be0954fea3b6055448b04ab447901951dfed230ec5
Apr 16 18:23:35.808569 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:35.808510 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-p5wd8" event={"ID":"11b5abef-e75d-4508-8c33-fce114ff5916","Type":"ContainerStarted","Data":"700afeb2a45f48dcbc8899be0954fea3b6055448b04ab447901951dfed230ec5"}
Apr 16 18:23:37.469058 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:37.469019 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-vwt8c"]
Apr 16 18:23:37.481168 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:37.481142 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-vwt8c"
Apr 16 18:23:37.485083 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:37.484694 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-7bm6f\""
Apr 16 18:23:37.489485 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:37.489391 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-vwt8c"]
Apr 16 18:23:37.607408 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:37.607371 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvkzn\" (UniqueName: \"kubernetes.io/projected/9efd324f-c783-4dd9-82a6-85d4a9fa8b88-kube-api-access-fvkzn\") pod \"authorino-operator-7587b89b76-vwt8c\" (UID: \"9efd324f-c783-4dd9-82a6-85d4a9fa8b88\") " pod="kuadrant-system/authorino-operator-7587b89b76-vwt8c"
Apr 16 18:23:37.708805 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:37.708764 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fvkzn\" (UniqueName: \"kubernetes.io/projected/9efd324f-c783-4dd9-82a6-85d4a9fa8b88-kube-api-access-fvkzn\") pod \"authorino-operator-7587b89b76-vwt8c\" (UID: \"9efd324f-c783-4dd9-82a6-85d4a9fa8b88\") " pod="kuadrant-system/authorino-operator-7587b89b76-vwt8c"
Apr 16 18:23:37.719901 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:37.719820 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvkzn\" (UniqueName: \"kubernetes.io/projected/9efd324f-c783-4dd9-82a6-85d4a9fa8b88-kube-api-access-fvkzn\") pod \"authorino-operator-7587b89b76-vwt8c\" (UID: \"9efd324f-c783-4dd9-82a6-85d4a9fa8b88\") " pod="kuadrant-system/authorino-operator-7587b89b76-vwt8c"
Apr 16 18:23:37.798458 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:37.798265 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-vwt8c"
Apr 16 18:23:37.960277 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:37.960244 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-vwt8c"]
Apr 16 18:23:37.963977 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:23:37.963942 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9efd324f_c783_4dd9_82a6_85d4a9fa8b88.slice/crio-c0269b6f5059246ea166707bfba6703409812f89c11b6719753cac8996f10a4f WatchSource:0}: Error finding container c0269b6f5059246ea166707bfba6703409812f89c11b6719753cac8996f10a4f: Status 404 returned error can't find the container with id c0269b6f5059246ea166707bfba6703409812f89c11b6719753cac8996f10a4f
Apr 16 18:23:38.822365 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:38.822322 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-vwt8c" event={"ID":"9efd324f-c783-4dd9-82a6-85d4a9fa8b88","Type":"ContainerStarted","Data":"c0269b6f5059246ea166707bfba6703409812f89c11b6719753cac8996f10a4f"}
Apr 16 18:23:38.823817 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:38.823789 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-p5wd8" event={"ID":"11b5abef-e75d-4508-8c33-fce114ff5916","Type":"ContainerStarted","Data":"d98d67d819c3c6017c8c35643edc140b0c43f4ece5144bee758babf5123ff8bd"}
Apr 16 18:23:38.823979 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:38.823962 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-p5wd8"
Apr 16 18:23:38.842581 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:38.842509 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-p5wd8" podStartSLOduration=2.408427301 podStartE2EDuration="4.842494252s" podCreationTimestamp="2026-04-16 18:23:34 +0000 UTC" firstStartedPulling="2026-04-16 18:23:35.41736072 +0000 UTC m=+486.773502756" lastFinishedPulling="2026-04-16 18:23:37.851427658 +0000 UTC m=+489.207569707" observedRunningTime="2026-04-16 18:23:38.840353423 +0000 UTC m=+490.196495480" watchObservedRunningTime="2026-04-16 18:23:38.842494252 +0000 UTC m=+490.198636309"
Apr 16 18:23:39.829875 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:39.829842 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-vwt8c" event={"ID":"9efd324f-c783-4dd9-82a6-85d4a9fa8b88","Type":"ContainerStarted","Data":"8c604381a7ac3e19ef563f39594abdb8e6079023235a17c86d08e01855526141"}
Apr 16 18:23:39.830329 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:39.830116 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-vwt8c"
Apr 16 18:23:39.847674 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:39.847627 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-vwt8c" podStartSLOduration=1.09978322 podStartE2EDuration="2.847611242s" podCreationTimestamp="2026-04-16 18:23:37 +0000 UTC" firstStartedPulling="2026-04-16 18:23:37.966879665 +0000 UTC m=+489.323021700" lastFinishedPulling="2026-04-16 18:23:39.714707671 +0000 UTC m=+491.070849722" observedRunningTime="2026-04-16 18:23:39.845348629 +0000 UTC m=+491.201490685" watchObservedRunningTime="2026-04-16 18:23:39.847611242 +0000 UTC m=+491.203753298"
Apr 16 18:23:49.832896 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:49.832864 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-p5wd8"
Apr 16 18:23:50.836210 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:50.836177 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-vwt8c"
Apr 16 18:23:58.077841 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:23:58.077795 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f75b9848f-g9vnt"]
Apr 16 18:24:23.100058 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.099953 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-f75b9848f-g9vnt" podUID="84e18b78-2c41-49e4-8c8a-f6bb9b134d7a" containerName="console" containerID="cri-o://6be520dfb054b6775e8e826777acac91d63b0961ee3d3be7f271edfff0f88769" gracePeriod=15
Apr 16 18:24:23.338900 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.338870 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f75b9848f-g9vnt_84e18b78-2c41-49e4-8c8a-f6bb9b134d7a/console/0.log"
Apr 16 18:24:23.339021 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.338938 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f75b9848f-g9vnt"
Apr 16 18:24:23.401440 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.401410 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn6kj\" (UniqueName: \"kubernetes.io/projected/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-kube-api-access-jn6kj\") pod \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\" (UID: \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\") "
Apr 16 18:24:23.401629 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.401448 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-service-ca\") pod \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\" (UID: \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\") "
Apr 16 18:24:23.401629 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.401576 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-console-oauth-config\") pod \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\" (UID: \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\") "
Apr 16 18:24:23.401629 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.401619 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-console-serving-cert\") pod \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\" (UID: \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\") "
Apr 16 18:24:23.401797 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.401659 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-console-config\") pod \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\" (UID: \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\") "
Apr 16 18:24:23.401797 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.401705 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-trusted-ca-bundle\") pod \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\" (UID: \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\") "
Apr 16 18:24:23.401797 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.401728 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-oauth-serving-cert\") pod \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\" (UID: \"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a\") "
Apr 16 18:24:23.401950 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.401867 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-service-ca" (OuterVolumeSpecName: "service-ca") pod "84e18b78-2c41-49e4-8c8a-f6bb9b134d7a" (UID: "84e18b78-2c41-49e4-8c8a-f6bb9b134d7a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:24:23.402171 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.402107 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-service-ca\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:24:23.402171 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.402116 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-console-config" (OuterVolumeSpecName: "console-config") pod "84e18b78-2c41-49e4-8c8a-f6bb9b134d7a" (UID: "84e18b78-2c41-49e4-8c8a-f6bb9b134d7a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:24:23.402171 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.402157 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "84e18b78-2c41-49e4-8c8a-f6bb9b134d7a" (UID: "84e18b78-2c41-49e4-8c8a-f6bb9b134d7a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:24:23.402579 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.402174 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "84e18b78-2c41-49e4-8c8a-f6bb9b134d7a" (UID: "84e18b78-2c41-49e4-8c8a-f6bb9b134d7a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:24:23.403941 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.403910 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "84e18b78-2c41-49e4-8c8a-f6bb9b134d7a" (UID: "84e18b78-2c41-49e4-8c8a-f6bb9b134d7a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:24:23.404199 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.404178 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "84e18b78-2c41-49e4-8c8a-f6bb9b134d7a" (UID: "84e18b78-2c41-49e4-8c8a-f6bb9b134d7a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:24:23.404261 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.404245 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-kube-api-access-jn6kj" (OuterVolumeSpecName: "kube-api-access-jn6kj") pod "84e18b78-2c41-49e4-8c8a-f6bb9b134d7a" (UID: "84e18b78-2c41-49e4-8c8a-f6bb9b134d7a"). InnerVolumeSpecName "kube-api-access-jn6kj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:24:23.503040 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.503000 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-trusted-ca-bundle\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:24:23.503040 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.503033 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-oauth-serving-cert\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:24:23.503040 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.503045 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jn6kj\" (UniqueName: \"kubernetes.io/projected/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-kube-api-access-jn6kj\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:24:23.503286 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.503059 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-console-oauth-config\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\""
Apr 16 18:24:23.503286 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.503071 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName:
\"kubernetes.io/secret/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-console-serving-cert\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\"" Apr 16 18:24:23.503286 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.503087 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a-console-config\") on node \"ip-10-0-130-205.ec2.internal\" DevicePath \"\"" Apr 16 18:24:23.990557 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.990504 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f75b9848f-g9vnt_84e18b78-2c41-49e4-8c8a-f6bb9b134d7a/console/0.log" Apr 16 18:24:23.990721 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.990571 2573 generic.go:358] "Generic (PLEG): container finished" podID="84e18b78-2c41-49e4-8c8a-f6bb9b134d7a" containerID="6be520dfb054b6775e8e826777acac91d63b0961ee3d3be7f271edfff0f88769" exitCode=2 Apr 16 18:24:23.990721 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.990603 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f75b9848f-g9vnt" event={"ID":"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a","Type":"ContainerDied","Data":"6be520dfb054b6775e8e826777acac91d63b0961ee3d3be7f271edfff0f88769"} Apr 16 18:24:23.990721 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.990637 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f75b9848f-g9vnt" Apr 16 18:24:23.990721 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.990651 2573 scope.go:117] "RemoveContainer" containerID="6be520dfb054b6775e8e826777acac91d63b0961ee3d3be7f271edfff0f88769" Apr 16 18:24:23.990881 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.990641 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f75b9848f-g9vnt" event={"ID":"84e18b78-2c41-49e4-8c8a-f6bb9b134d7a","Type":"ContainerDied","Data":"c0ce560a996676839c29825b1b5097a6140a24cad8a1327d7ad3f995b892d9d2"} Apr 16 18:24:23.999747 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.999732 2573 scope.go:117] "RemoveContainer" containerID="6be520dfb054b6775e8e826777acac91d63b0961ee3d3be7f271edfff0f88769" Apr 16 18:24:23.999975 ip-10-0-130-205 kubenswrapper[2573]: E0416 18:24:23.999960 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6be520dfb054b6775e8e826777acac91d63b0961ee3d3be7f271edfff0f88769\": container with ID starting with 6be520dfb054b6775e8e826777acac91d63b0961ee3d3be7f271edfff0f88769 not found: ID does not exist" containerID="6be520dfb054b6775e8e826777acac91d63b0961ee3d3be7f271edfff0f88769" Apr 16 18:24:24.000036 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:23.999981 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6be520dfb054b6775e8e826777acac91d63b0961ee3d3be7f271edfff0f88769"} err="failed to get container status \"6be520dfb054b6775e8e826777acac91d63b0961ee3d3be7f271edfff0f88769\": rpc error: code = NotFound desc = could not find container \"6be520dfb054b6775e8e826777acac91d63b0961ee3d3be7f271edfff0f88769\": container with ID starting with 6be520dfb054b6775e8e826777acac91d63b0961ee3d3be7f271edfff0f88769 not found: ID does not exist" Apr 16 18:24:24.016211 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:24.016153 2573 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f75b9848f-g9vnt"] Apr 16 18:24:24.019028 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:24.019005 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f75b9848f-g9vnt"] Apr 16 18:24:25.222194 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:24:25.222158 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84e18b78-2c41-49e4-8c8a-f6bb9b134d7a" path="/var/lib/kubelet/pods/84e18b78-2c41-49e4-8c8a-f6bb9b134d7a/volumes" Apr 16 18:33:57.146153 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:33:57.146122 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-7cd77c7ffd-6pj7l_f90a552b-6669-4d7f-9d2a-6ed675ff2a01/discovery/0.log" Apr 16 18:33:57.159375 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:33:57.159341 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-btdrn_1b515d6e-9256-4415-9c2b-b201c18d1744/istio-proxy/0.log" Apr 16 18:33:57.176447 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:33:57.176407 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-596bbb7b7d-nk24r_23253796-d542-44e0-93b4-6b1d65c09948/router/0.log" Apr 16 18:34:04.199105 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:04.199073 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-tphmr_5366b941-9d0a-4457-8229-086d574fc5ab/global-pull-secret-syncer/0.log" Apr 16 18:34:04.239752 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:04.239712 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-7v7nr_aec539b7-c282-46ac-8eff-3bb0c203088a/konnectivity-agent/0.log" Apr 16 18:34:04.328851 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:04.328822 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-205.ec2.internal_f9f7c92f985c41749a276f7d4f6cddbd/haproxy/0.log" Apr 16 18:34:08.700827 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:08.700791 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-4szbw_eff34aa9-7480-480d-b76a-e58afdd3fc46/cluster-monitoring-operator/0.log" Apr 16 18:34:08.726857 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:08.726826 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-fmlhz_44ea6a5b-fa62-492e-8885-5836fde6aae9/kube-state-metrics/0.log" Apr 16 18:34:08.753546 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:08.753499 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-fmlhz_44ea6a5b-fa62-492e-8885-5836fde6aae9/kube-rbac-proxy-main/0.log" Apr 16 18:34:08.781462 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:08.781429 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-fmlhz_44ea6a5b-fa62-492e-8885-5836fde6aae9/kube-rbac-proxy-self/0.log" Apr 16 18:34:08.996800 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:08.996717 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4lmm7_ae6e84d4-e669-49d1-8b06-5404b460d95f/node-exporter/0.log" Apr 16 18:34:09.022095 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:09.022056 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4lmm7_ae6e84d4-e669-49d1-8b06-5404b460d95f/kube-rbac-proxy/0.log" Apr 16 18:34:09.044704 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:09.044673 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4lmm7_ae6e84d4-e669-49d1-8b06-5404b460d95f/init-textfile/0.log" Apr 16 18:34:09.197527 ip-10-0-130-205 
kubenswrapper[2573]: I0416 18:34:09.197478 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-89djw_a10b83fc-c6ce-49cc-9a26-8aa7bc948c27/kube-rbac-proxy-main/0.log" Apr 16 18:34:09.220855 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:09.220822 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-89djw_a10b83fc-c6ce-49cc-9a26-8aa7bc948c27/kube-rbac-proxy-self/0.log" Apr 16 18:34:09.246417 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:09.246365 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-89djw_a10b83fc-c6ce-49cc-9a26-8aa7bc948c27/openshift-state-metrics/0.log" Apr 16 18:34:09.511112 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:09.511051 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-45887_cb136e76-6bdc-4e39-b76b-d994ee40f867/prometheus-operator/0.log" Apr 16 18:34:09.530797 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:09.530766 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-45887_cb136e76-6bdc-4e39-b76b-d994ee40f867/kube-rbac-proxy/0.log" Apr 16 18:34:09.718871 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:09.718840 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6d9f96dc59-snn89_fd636b06-9e7a-4f60-ad21-00cd7e60a593/thanos-query/0.log" Apr 16 18:34:09.745074 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:09.745046 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6d9f96dc59-snn89_fd636b06-9e7a-4f60-ad21-00cd7e60a593/kube-rbac-proxy-web/0.log" Apr 16 18:34:09.771157 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:09.771065 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-6d9f96dc59-snn89_fd636b06-9e7a-4f60-ad21-00cd7e60a593/kube-rbac-proxy/0.log" Apr 16 18:34:09.799305 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:09.799276 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6d9f96dc59-snn89_fd636b06-9e7a-4f60-ad21-00cd7e60a593/prom-label-proxy/0.log" Apr 16 18:34:09.821721 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:09.821689 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6d9f96dc59-snn89_fd636b06-9e7a-4f60-ad21-00cd7e60a593/kube-rbac-proxy-rules/0.log" Apr 16 18:34:09.844885 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:09.844855 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6d9f96dc59-snn89_fd636b06-9e7a-4f60-ad21-00cd7e60a593/kube-rbac-proxy-metrics/0.log" Apr 16 18:34:11.387193 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:11.387148 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-h9xxg_71df2187-c914-4b58-8d61-6fcaacaefd11/networking-console-plugin/0.log" Apr 16 18:34:12.964331 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:12.964301 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-ht7hf_194cfdef-87c9-4fe6-b52f-1a569ef6e306/volume-data-source-validator/0.log" Apr 16 18:34:13.166554 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:13.166506 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5b6zt/perf-node-gather-daemonset-2kf4b"] Apr 16 18:34:13.166893 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:13.166881 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84e18b78-2c41-49e4-8c8a-f6bb9b134d7a" containerName="console" Apr 16 18:34:13.166941 ip-10-0-130-205 
kubenswrapper[2573]: I0416 18:34:13.166894 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="84e18b78-2c41-49e4-8c8a-f6bb9b134d7a" containerName="console" Apr 16 18:34:13.166975 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:13.166948 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="84e18b78-2c41-49e4-8c8a-f6bb9b134d7a" containerName="console" Apr 16 18:34:13.170192 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:13.170168 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-2kf4b" Apr 16 18:34:13.172218 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:13.172200 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-5b6zt\"/\"default-dockercfg-nlsgc\"" Apr 16 18:34:13.172702 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:13.172683 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5b6zt\"/\"kube-root-ca.crt\"" Apr 16 18:34:13.172801 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:13.172734 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5b6zt\"/\"openshift-service-ca.crt\"" Apr 16 18:34:13.182731 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:13.182706 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5b6zt/perf-node-gather-daemonset-2kf4b"] Apr 16 18:34:13.306555 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:13.306442 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e126741a-5b92-47f0-bd52-f07ae014e7fc-proc\") pod \"perf-node-gather-daemonset-2kf4b\" (UID: \"e126741a-5b92-47f0-bd52-f07ae014e7fc\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-2kf4b" Apr 16 18:34:13.306555 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:13.306498 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e126741a-5b92-47f0-bd52-f07ae014e7fc-sys\") pod \"perf-node-gather-daemonset-2kf4b\" (UID: \"e126741a-5b92-47f0-bd52-f07ae014e7fc\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-2kf4b" Apr 16 18:34:13.306743 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:13.306609 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c88dw\" (UniqueName: \"kubernetes.io/projected/e126741a-5b92-47f0-bd52-f07ae014e7fc-kube-api-access-c88dw\") pod \"perf-node-gather-daemonset-2kf4b\" (UID: \"e126741a-5b92-47f0-bd52-f07ae014e7fc\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-2kf4b" Apr 16 18:34:13.306743 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:13.306644 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e126741a-5b92-47f0-bd52-f07ae014e7fc-lib-modules\") pod \"perf-node-gather-daemonset-2kf4b\" (UID: \"e126741a-5b92-47f0-bd52-f07ae014e7fc\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-2kf4b" Apr 16 18:34:13.306743 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:13.306661 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e126741a-5b92-47f0-bd52-f07ae014e7fc-podres\") pod \"perf-node-gather-daemonset-2kf4b\" (UID: \"e126741a-5b92-47f0-bd52-f07ae014e7fc\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-2kf4b" Apr 16 18:34:13.407442 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:13.407400 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c88dw\" (UniqueName: \"kubernetes.io/projected/e126741a-5b92-47f0-bd52-f07ae014e7fc-kube-api-access-c88dw\") pod 
\"perf-node-gather-daemonset-2kf4b\" (UID: \"e126741a-5b92-47f0-bd52-f07ae014e7fc\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-2kf4b" Apr 16 18:34:13.407442 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:13.407444 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e126741a-5b92-47f0-bd52-f07ae014e7fc-lib-modules\") pod \"perf-node-gather-daemonset-2kf4b\" (UID: \"e126741a-5b92-47f0-bd52-f07ae014e7fc\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-2kf4b" Apr 16 18:34:13.407682 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:13.407468 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e126741a-5b92-47f0-bd52-f07ae014e7fc-podres\") pod \"perf-node-gather-daemonset-2kf4b\" (UID: \"e126741a-5b92-47f0-bd52-f07ae014e7fc\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-2kf4b" Apr 16 18:34:13.407682 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:13.407553 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e126741a-5b92-47f0-bd52-f07ae014e7fc-proc\") pod \"perf-node-gather-daemonset-2kf4b\" (UID: \"e126741a-5b92-47f0-bd52-f07ae014e7fc\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-2kf4b" Apr 16 18:34:13.407682 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:13.407602 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e126741a-5b92-47f0-bd52-f07ae014e7fc-sys\") pod \"perf-node-gather-daemonset-2kf4b\" (UID: \"e126741a-5b92-47f0-bd52-f07ae014e7fc\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-2kf4b" Apr 16 18:34:13.407682 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:13.407643 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" 
(UniqueName: \"kubernetes.io/host-path/e126741a-5b92-47f0-bd52-f07ae014e7fc-podres\") pod \"perf-node-gather-daemonset-2kf4b\" (UID: \"e126741a-5b92-47f0-bd52-f07ae014e7fc\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-2kf4b" Apr 16 18:34:13.407682 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:13.407646 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e126741a-5b92-47f0-bd52-f07ae014e7fc-lib-modules\") pod \"perf-node-gather-daemonset-2kf4b\" (UID: \"e126741a-5b92-47f0-bd52-f07ae014e7fc\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-2kf4b" Apr 16 18:34:13.407849 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:13.407686 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e126741a-5b92-47f0-bd52-f07ae014e7fc-sys\") pod \"perf-node-gather-daemonset-2kf4b\" (UID: \"e126741a-5b92-47f0-bd52-f07ae014e7fc\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-2kf4b" Apr 16 18:34:13.407849 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:13.407689 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e126741a-5b92-47f0-bd52-f07ae014e7fc-proc\") pod \"perf-node-gather-daemonset-2kf4b\" (UID: \"e126741a-5b92-47f0-bd52-f07ae014e7fc\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-2kf4b" Apr 16 18:34:13.415835 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:13.415813 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c88dw\" (UniqueName: \"kubernetes.io/projected/e126741a-5b92-47f0-bd52-f07ae014e7fc-kube-api-access-c88dw\") pod \"perf-node-gather-daemonset-2kf4b\" (UID: \"e126741a-5b92-47f0-bd52-f07ae014e7fc\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-2kf4b" Apr 16 18:34:13.480971 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:13.480938 
2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-2kf4b" Apr 16 18:34:13.610474 ip-10-0-130-205 kubenswrapper[2573]: W0416 18:34:13.610437 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode126741a_5b92_47f0_bd52_f07ae014e7fc.slice/crio-8bfcb4d32c2e5d939d73433bb7013e4175c3831dc99c6f17be2823331ab3afac WatchSource:0}: Error finding container 8bfcb4d32c2e5d939d73433bb7013e4175c3831dc99c6f17be2823331ab3afac: Status 404 returned error can't find the container with id 8bfcb4d32c2e5d939d73433bb7013e4175c3831dc99c6f17be2823331ab3afac Apr 16 18:34:13.612126 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:13.612110 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:34:13.612542 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:13.612495 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5b6zt/perf-node-gather-daemonset-2kf4b"] Apr 16 18:34:13.847406 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:13.847311 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bndvz_0913fe98-7bbc-41d3-9144-086892d07104/dns/0.log" Apr 16 18:34:13.872766 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:13.872743 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bndvz_0913fe98-7bbc-41d3-9144-086892d07104/kube-rbac-proxy/0.log" Apr 16 18:34:13.994396 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:13.994362 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jl2lc_3d37a2e5-3988-4400-95f4-1baaf11b42a8/dns-node-resolver/0.log" Apr 16 18:34:14.192232 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:14.192197 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-2kf4b" 
event={"ID":"e126741a-5b92-47f0-bd52-f07ae014e7fc","Type":"ContainerStarted","Data":"9bbd3fac8405a74da5d6c6d03f98b14259c1f63198c73464040d9b51b9ba9e90"} Apr 16 18:34:14.192232 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:14.192237 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-2kf4b" event={"ID":"e126741a-5b92-47f0-bd52-f07ae014e7fc","Type":"ContainerStarted","Data":"8bfcb4d32c2e5d939d73433bb7013e4175c3831dc99c6f17be2823331ab3afac"} Apr 16 18:34:14.192438 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:14.192317 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-2kf4b" Apr 16 18:34:14.211078 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:14.211003 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-2kf4b" podStartSLOduration=1.210987253 podStartE2EDuration="1.210987253s" podCreationTimestamp="2026-04-16 18:34:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:34:14.209351164 +0000 UTC m=+1125.565493221" watchObservedRunningTime="2026-04-16 18:34:14.210987253 +0000 UTC m=+1125.567129309" Apr 16 18:34:14.564758 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:14.564673 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kd96x_e4ae6d82-301f-44be-85ce-8d3b88e0d6e1/node-ca/0.log" Apr 16 18:34:15.445788 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:15.445757 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-7cd77c7ffd-6pj7l_f90a552b-6669-4d7f-9d2a-6ed675ff2a01/discovery/0.log" Apr 16 18:34:15.481491 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:15.481461 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-btdrn_1b515d6e-9256-4415-9c2b-b201c18d1744/istio-proxy/0.log" Apr 16 18:34:15.514269 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:15.514245 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-596bbb7b7d-nk24r_23253796-d542-44e0-93b4-6b1d65c09948/router/0.log" Apr 16 18:34:15.997646 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:15.997613 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-7vbzm_8d155535-fa59-4777-80cf-fdba34134958/serve-healthcheck-canary/0.log" Apr 16 18:34:16.474474 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:16.474438 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-kjrt4_3901ff32-acaf-4296-9b6e-811ec88ce688/insights-operator/1.log" Apr 16 18:34:16.474871 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:16.474486 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-kjrt4_3901ff32-acaf-4296-9b6e-811ec88ce688/insights-operator/0.log" Apr 16 18:34:16.676693 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:16.676661 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v76fn_af10a82c-41b9-4f94-83b7-46e390179f35/kube-rbac-proxy/0.log" Apr 16 18:34:16.699812 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:16.699784 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v76fn_af10a82c-41b9-4f94-83b7-46e390179f35/exporter/0.log" Apr 16 18:34:16.724262 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:16.724233 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v76fn_af10a82c-41b9-4f94-83b7-46e390179f35/extractor/0.log" Apr 16 18:34:20.206212 ip-10-0-130-205 
kubenswrapper[2573]: I0416 18:34:20.206185 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-2kf4b" Apr 16 18:34:24.731137 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:24.731054 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6sjm7_37634bfc-74ef-4ed7-916d-20e219934bbf/kube-multus/0.log" Apr 16 18:34:24.963727 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:24.963696 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5xfmp_d3825245-f2b4-4372-9135-56f1b2145871/kube-multus-additional-cni-plugins/0.log" Apr 16 18:34:24.985343 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:24.985269 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5xfmp_d3825245-f2b4-4372-9135-56f1b2145871/egress-router-binary-copy/0.log" Apr 16 18:34:25.007659 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:25.007634 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5xfmp_d3825245-f2b4-4372-9135-56f1b2145871/cni-plugins/0.log" Apr 16 18:34:25.029613 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:25.029584 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5xfmp_d3825245-f2b4-4372-9135-56f1b2145871/bond-cni-plugin/0.log" Apr 16 18:34:25.054035 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:25.054001 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5xfmp_d3825245-f2b4-4372-9135-56f1b2145871/routeoverride-cni/0.log" Apr 16 18:34:25.075199 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:25.075170 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5xfmp_d3825245-f2b4-4372-9135-56f1b2145871/whereabouts-cni-bincopy/0.log" Apr 16 18:34:25.096241 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:25.096213 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5xfmp_d3825245-f2b4-4372-9135-56f1b2145871/whereabouts-cni/0.log" Apr 16 18:34:25.418586 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:25.418552 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-j9gkk_3fd8f3ce-1a67-4a38-99ec-e368aea03088/network-metrics-daemon/0.log" Apr 16 18:34:25.445254 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:25.445224 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-j9gkk_3fd8f3ce-1a67-4a38-99ec-e368aea03088/kube-rbac-proxy/0.log" Apr 16 18:34:26.682077 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:26.682048 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-slsjs_2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7/ovn-controller/0.log" Apr 16 18:34:26.710919 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:26.710890 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-slsjs_2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7/ovn-acl-logging/0.log" Apr 16 18:34:26.736939 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:26.736896 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-slsjs_2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7/kube-rbac-proxy-node/0.log" Apr 16 18:34:26.764818 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:26.764784 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-slsjs_2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 18:34:26.784562 ip-10-0-130-205 kubenswrapper[2573]: I0416 
18:34:26.784528 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-slsjs_2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7/northd/0.log" Apr 16 18:34:26.806464 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:26.806431 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-slsjs_2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7/nbdb/0.log" Apr 16 18:34:26.828290 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:26.828262 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-slsjs_2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7/sbdb/0.log" Apr 16 18:34:27.023907 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:27.023827 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-slsjs_2689c9d7-7f3a-4fba-ae4a-9b98a2133fb7/ovnkube-controller/0.log" Apr 16 18:34:28.495833 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:28.495802 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-4cbj8_619bc9de-2915-4bce-b443-702d489e89af/network-check-target-container/0.log" Apr 16 18:34:29.656375 ip-10-0-130-205 kubenswrapper[2573]: I0416 18:34:29.656347 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-9rvl2_5e12a06f-d736-4229-bb5a-3066805a1732/iptables-alerter/0.log"