Apr 16 18:02:53.522865 ip-10-0-131-193 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:02:53.931566 ip-10-0-131-193 kubenswrapper[2564]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:02:53.931566 ip-10-0-131-193 kubenswrapper[2564]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:02:53.931566 ip-10-0-131-193 kubenswrapper[2564]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:02:53.931566 ip-10-0-131-193 kubenswrapper[2564]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:02:53.931566 ip-10-0-131-193 kubenswrapper[2564]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:02:53.933125 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.933031 2564 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:02:53.935301 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935284 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:02:53.935301 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935301 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:02:53.935362 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935306 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:02:53.935362 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935311 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:02:53.935362 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935314 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:02:53.935362 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935317 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:02:53.935362 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935320 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:02:53.935362 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935324 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:02:53.935362 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935326 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:02:53.935362 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935330 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:02:53.935362 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935332 2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:02:53.935362 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935336 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:02:53.935362 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935339 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:02:53.935362 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935342 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:02:53.935362 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935345 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:02:53.935362 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935347 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:02:53.935362 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935350 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:02:53.935362 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935353 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:02:53.935362 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935355 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:02:53.935362 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935358 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:02:53.935362 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935360 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:02:53.935840 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935364 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:02:53.935840 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935366 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:02:53.935840 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935369 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:02:53.935840 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935372 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:02:53.935840 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935375 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:02:53.935840 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935377 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:02:53.935840 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935380 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:02:53.935840 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935382 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:02:53.935840 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935385 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:02:53.935840 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935388 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:02:53.935840 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935390 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:02:53.935840 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935393 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:02:53.935840 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935396 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:02:53.935840 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935398 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:02:53.935840 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935401 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:02:53.935840 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935404 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:02:53.935840 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935406 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:02:53.935840 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935417 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:02:53.935840 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935420 2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:02:53.935840 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935422 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:02:53.936315 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935425 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:02:53.936315 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935427 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:02:53.936315 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935430 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:02:53.936315 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935432 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:02:53.936315 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935434 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:02:53.936315 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935437 2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:02:53.936315 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935439 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:02:53.936315 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935442 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:02:53.936315 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935445 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:02:53.936315 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935448 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:02:53.936315 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935450 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:02:53.936315 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935453 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:02:53.936315 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935455 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:02:53.936315 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935459 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:02:53.936315 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935462 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:02:53.936315 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935465 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:02:53.936315 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935468 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:02:53.936315 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935470 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:02:53.936315 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935473 2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:02:53.936315 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935476 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:02:53.937086 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935479 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:02:53.937086 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935481 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:02:53.937086 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935484 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:02:53.937086 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935486 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:02:53.937086 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935489 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:02:53.937086 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935493 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:02:53.937086 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935495 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:02:53.937086 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935500 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:02:53.937086 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935504 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:02:53.937086 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935506 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:02:53.937086 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935509 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:02:53.937086 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935511 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:02:53.937086 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935514 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:02:53.937086 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935517 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:02:53.937086 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935519 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:02:53.937086 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935521 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:02:53.937086 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935524 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:02:53.937086 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935527 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:02:53.937086 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935530 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:02:53.937626 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935532 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:02:53.937626 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935535 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:02:53.937626 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935537 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:02:53.937626 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935540 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:02:53.937626 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935542 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:02:53.937626 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935545 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:02:53.937626 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935953 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:02:53.937626 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935958 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:02:53.937626 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935961 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:02:53.937626 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935964 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:02:53.937626 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935967 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:02:53.937626 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935970 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:02:53.937626 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935973 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:02:53.937626 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935975 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:02:53.937626 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935979 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:02:53.937626 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935983 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:02:53.937626 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935985 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:02:53.937626 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935988 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:02:53.937626 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935991 2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:02:53.938160 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935994 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:02:53.938160 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935997 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:02:53.938160 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.935999 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:02:53.938160 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936002 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:02:53.938160 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936005 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:02:53.938160 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936007 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:02:53.938160 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936010 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:02:53.938160 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936012 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:02:53.938160 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936015 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:02:53.938160 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936017 2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:02:53.938160 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936020 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:02:53.938160 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936022 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:02:53.938160 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936025 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:02:53.938160 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936028 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:02:53.938160 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936030 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:02:53.938160 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936032 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:02:53.938160 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936035 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:02:53.938160 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936038 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:02:53.938160 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936041 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:02:53.938160 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936044 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:02:53.938706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936047 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:02:53.938706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936049 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:02:53.938706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936052 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:02:53.938706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936054 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:02:53.938706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936057 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:02:53.938706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936059 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:02:53.938706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936062 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:02:53.938706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936066 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:02:53.938706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936068 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:02:53.938706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936071 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:02:53.938706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936073 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:02:53.938706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936076 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:02:53.938706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936079 2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:02:53.938706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936081 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:02:53.938706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936084 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:02:53.938706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936086 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:02:53.938706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936089 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:02:53.938706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936092 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:02:53.938706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936094 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:02:53.938706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936097 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:02:53.939219 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936099 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:02:53.939219 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936101 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:02:53.939219 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936104 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:02:53.939219 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936106 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:02:53.939219 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936109 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:02:53.939219 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936111 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:02:53.939219 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936113 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:02:53.939219 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936116 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:02:53.939219 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936120 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:02:53.939219 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936124 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:02:53.939219 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936127 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:02:53.939219 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936130 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:02:53.939219 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936133 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:02:53.939219 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936136 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:02:53.939219 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936138 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:02:53.939219 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936141 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:02:53.939219 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936143 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:02:53.939219 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936146 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:02:53.939219 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936148 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:02:53.939706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936151 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:02:53.939706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936154 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:02:53.939706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936157 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:02:53.939706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936159 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:02:53.939706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936162 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:02:53.939706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936164 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:02:53.939706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936167 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:02:53.939706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936170 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:02:53.939706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936172 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:02:53.939706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936174 2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:02:53.939706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936177 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:02:53.939706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936179 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:02:53.939706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936182 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:02:53.939706 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.936184 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:02:53.939706 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.936951 2564 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:02:53.939706 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.936965 2564 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:02:53.939706 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.936973 2564 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:02:53.939706 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.936978 2564 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:02:53.939706 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.936982 2564 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:02:53.939706 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.936986 2564 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:02:53.939706 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.936990 2564 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:02:53.940217 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.936995 2564 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:02:53.940217 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.936999 2564 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:02:53.940217 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937003 2564 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:02:53.940217 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937006 2564 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:02:53.940217 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937010 2564 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:02:53.940217 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937013 2564 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:02:53.940217 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937016 2564 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:02:53.940217 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937019 2564 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:02:53.940217 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937022 2564 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:02:53.940217 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937025 2564 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:02:53.940217 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937027 2564 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:02:53.940217 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937030 2564 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:02:53.940217 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937034 2564 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:02:53.940217 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937037 2564 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:02:53.940217 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937040 2564 flags.go:64] FLAG: --config-dir=""
Apr 16 18:02:53.940217 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937043 2564 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:02:53.940217 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937046 2564 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:02:53.940217 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937050 2564 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:02:53.940217 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937054 2564 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:02:53.940217 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937057 2564 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:02:53.940217 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937061 2564 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:02:53.940217 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937064 2564 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:02:53.940217 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937067 2564 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:02:53.940217 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937070 2564 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:02:53.940801 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937073 2564 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:02:53.940801 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937076 2564 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:02:53.940801 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937080 2564 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:02:53.940801 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937083 2564 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:02:53.940801 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937086 2564 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:02:53.940801 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937089 2564 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:02:53.940801 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937092 2564 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:02:53.940801 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937095 2564 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:02:53.940801 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937100 2564 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:02:53.940801 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937104 2564 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:02:53.940801 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937107 2564 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:02:53.940801 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937111 2564 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:02:53.940801 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937114 2564 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:02:53.940801 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937118 2564 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 18:02:53.940801 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937121 2564 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 18:02:53.940801 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937124 2564 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 18:02:53.940801 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937127 2564 flags.go:64] FLAG: --eviction-soft=""
Apr 16 18:02:53.940801 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937130 2564 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 18:02:53.940801 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937133 2564 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 18:02:53.940801 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937136 2564 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 18:02:53.940801 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937140 2564 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 18:02:53.940801 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937143 2564 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 18:02:53.940801 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937146 2564 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 18:02:53.940801 ip-10-0-131-193
kubenswrapper[2564]: I0416 18:02:53.937149 2564 flags.go:64] FLAG: --feature-gates="" Apr 16 18:02:53.940801 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937153 2564 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 18:02:53.941399 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937156 2564 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 18:02:53.941399 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937159 2564 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 18:02:53.941399 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937165 2564 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 18:02:53.941399 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937168 2564 flags.go:64] FLAG: --healthz-port="10248" Apr 16 18:02:53.941399 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937171 2564 flags.go:64] FLAG: --help="false" Apr 16 18:02:53.941399 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937174 2564 flags.go:64] FLAG: --hostname-override="ip-10-0-131-193.ec2.internal" Apr 16 18:02:53.941399 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937177 2564 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 18:02:53.941399 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937180 2564 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 18:02:53.941399 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937183 2564 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 18:02:53.941399 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937186 2564 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 18:02:53.941399 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937190 2564 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 18:02:53.941399 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937193 2564 flags.go:64] FLAG: 
--image-gc-low-threshold="80" Apr 16 18:02:53.941399 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937196 2564 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 18:02:53.941399 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937199 2564 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 18:02:53.941399 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937201 2564 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 18:02:53.941399 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937204 2564 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 18:02:53.941399 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937208 2564 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 18:02:53.941399 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937211 2564 flags.go:64] FLAG: --kube-reserved="" Apr 16 18:02:53.941399 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937214 2564 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 18:02:53.941399 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937217 2564 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 18:02:53.941399 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937220 2564 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 18:02:53.941399 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937223 2564 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 18:02:53.941399 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937226 2564 flags.go:64] FLAG: --lock-file="" Apr 16 18:02:53.941399 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937229 2564 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 18:02:53.941999 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937232 2564 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 18:02:53.941999 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937235 2564 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 18:02:53.941999 ip-10-0-131-193 kubenswrapper[2564]: 
I0416 18:02:53.937241 2564 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 18:02:53.941999 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937244 2564 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 18:02:53.941999 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937246 2564 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 18:02:53.941999 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937249 2564 flags.go:64] FLAG: --logging-format="text" Apr 16 18:02:53.941999 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937252 2564 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 18:02:53.941999 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937256 2564 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 18:02:53.941999 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937258 2564 flags.go:64] FLAG: --manifest-url="" Apr 16 18:02:53.941999 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937261 2564 flags.go:64] FLAG: --manifest-url-header="" Apr 16 18:02:53.941999 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937271 2564 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 18:02:53.941999 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937274 2564 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 18:02:53.941999 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937279 2564 flags.go:64] FLAG: --max-pods="110" Apr 16 18:02:53.941999 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937282 2564 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 18:02:53.941999 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937285 2564 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 18:02:53.941999 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937288 2564 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 18:02:53.941999 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937291 2564 flags.go:64] FLAG: 
--minimum-container-ttl-duration="6m0s" Apr 16 18:02:53.941999 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937294 2564 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 18:02:53.941999 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937296 2564 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 18:02:53.941999 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937299 2564 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 18:02:53.941999 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937307 2564 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 18:02:53.941999 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937311 2564 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 18:02:53.941999 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937314 2564 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 18:02:53.941999 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937317 2564 flags.go:64] FLAG: --pod-cidr="" Apr 16 18:02:53.942634 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937320 2564 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 18:02:53.942634 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937325 2564 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 18:02:53.942634 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937328 2564 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 18:02:53.942634 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937331 2564 flags.go:64] FLAG: --pods-per-core="0" Apr 16 18:02:53.942634 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937335 2564 flags.go:64] FLAG: --port="10250" Apr 16 18:02:53.942634 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937338 2564 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 18:02:53.942634 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937341 
2564 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0290de475268b6d9a" Apr 16 18:02:53.942634 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937344 2564 flags.go:64] FLAG: --qos-reserved="" Apr 16 18:02:53.942634 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937347 2564 flags.go:64] FLAG: --read-only-port="10255" Apr 16 18:02:53.942634 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937349 2564 flags.go:64] FLAG: --register-node="true" Apr 16 18:02:53.942634 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937352 2564 flags.go:64] FLAG: --register-schedulable="true" Apr 16 18:02:53.942634 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937355 2564 flags.go:64] FLAG: --register-with-taints="" Apr 16 18:02:53.942634 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937359 2564 flags.go:64] FLAG: --registry-burst="10" Apr 16 18:02:53.942634 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937362 2564 flags.go:64] FLAG: --registry-qps="5" Apr 16 18:02:53.942634 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937365 2564 flags.go:64] FLAG: --reserved-cpus="" Apr 16 18:02:53.942634 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937367 2564 flags.go:64] FLAG: --reserved-memory="" Apr 16 18:02:53.942634 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937371 2564 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 18:02:53.942634 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937374 2564 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 18:02:53.942634 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937378 2564 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 18:02:53.942634 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937381 2564 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 18:02:53.942634 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937384 2564 flags.go:64] FLAG: --runonce="false" Apr 16 18:02:53.942634 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937387 2564 
flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 18:02:53.942634 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937390 2564 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 18:02:53.942634 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937393 2564 flags.go:64] FLAG: --seccomp-default="false" Apr 16 18:02:53.942634 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937396 2564 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 18:02:53.943263 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937399 2564 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 18:02:53.943263 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937402 2564 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 18:02:53.943263 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937405 2564 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 18:02:53.943263 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937409 2564 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 18:02:53.943263 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937412 2564 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 18:02:53.943263 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937415 2564 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 18:02:53.943263 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937418 2564 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 18:02:53.943263 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937421 2564 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 18:02:53.943263 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937427 2564 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 18:02:53.943263 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937430 2564 flags.go:64] FLAG: --system-cgroups="" Apr 16 18:02:53.943263 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937433 2564 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 18:02:53.943263 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937439 2564 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 18:02:53.943263 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937442 2564 flags.go:64] FLAG: --tls-cert-file="" Apr 16 18:02:53.943263 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937444 2564 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 18:02:53.943263 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937448 2564 flags.go:64] FLAG: --tls-min-version="" Apr 16 18:02:53.943263 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937451 2564 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 18:02:53.943263 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937454 2564 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 18:02:53.943263 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937457 2564 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 18:02:53.943263 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937460 2564 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 18:02:53.943263 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937463 2564 flags.go:64] FLAG: --v="2" Apr 16 18:02:53.943263 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937467 2564 flags.go:64] FLAG: --version="false" Apr 16 18:02:53.943263 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937471 2564 flags.go:64] FLAG: --vmodule="" Apr 16 18:02:53.943263 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937475 2564 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 18:02:53.943263 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937478 2564 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 18:02:53.943263 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937574 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:02:53.943901 
ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937579 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:02:53.943901 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937582 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:02:53.943901 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937585 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:02:53.943901 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937588 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:02:53.943901 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937591 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:02:53.943901 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937594 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:02:53.943901 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937597 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:02:53.943901 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937600 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:02:53.943901 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937602 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:02:53.943901 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937619 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:02:53.943901 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937622 2564 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:02:53.943901 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937625 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:02:53.943901 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937627 2564 feature_gate.go:328] unrecognized feature gate: 
GCPCustomAPIEndpointsInstall Apr 16 18:02:53.943901 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937631 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:02:53.943901 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937633 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:02:53.943901 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937637 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:02:53.943901 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937641 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:02:53.943901 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937643 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:02:53.943901 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937646 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:02:53.943901 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937649 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:02:53.944489 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937652 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:02:53.944489 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937654 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:02:53.944489 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937657 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:02:53.944489 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937659 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:02:53.944489 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937662 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:02:53.944489 ip-10-0-131-193 kubenswrapper[2564]: 
W0416 18:02:53.937664 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:02:53.944489 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937667 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:02:53.944489 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937669 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:02:53.944489 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937672 2564 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:02:53.944489 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937674 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:02:53.944489 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937677 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:02:53.944489 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937680 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:02:53.944489 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937683 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:02:53.944489 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937686 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:02:53.944489 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937688 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:02:53.944489 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937691 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:02:53.944489 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937694 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:02:53.944489 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937696 2564 feature_gate.go:328] unrecognized 
feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:02:53.944489 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937699 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:02:53.944980 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937702 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:02:53.944980 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937704 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:02:53.944980 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937707 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:02:53.944980 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937709 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:02:53.944980 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937712 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:02:53.944980 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937714 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:02:53.944980 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937717 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:02:53.944980 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937719 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:02:53.944980 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937723 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:02:53.944980 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937726 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:02:53.944980 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937728 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:02:53.944980 
ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937731 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:02:53.944980 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937733 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:02:53.944980 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937736 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:02:53.944980 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937738 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:02:53.944980 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937741 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:02:53.944980 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937743 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:02:53.944980 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937746 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:02:53.944980 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937748 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:02:53.944980 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937751 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:02:53.945481 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937754 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:02:53.945481 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937756 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:02:53.945481 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937759 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:02:53.945481 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937761 2564 feature_gate.go:328] unrecognized feature 
gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:02:53.945481 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937764 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:02:53.945481 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937767 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:02:53.945481 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937771 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 18:02:53.945481 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937775 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:02:53.945481 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937778 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:02:53.945481 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937781 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:02:53.945481 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937784 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:02:53.945481 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937787 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:02:53.945481 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937790 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:02:53.945481 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937792 2564 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:02:53.945481 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937795 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:02:53.945481 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937797 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:02:53.945481 ip-10-0-131-193 
kubenswrapper[2564]: W0416 18:02:53.937800 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:02:53.945481 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937803 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:02:53.945481 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937805 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:02:53.945481 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937807 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:02:53.945995 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937812 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:02:53.945995 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937814 2564 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:02:53.945995 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937817 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:02:53.945995 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937819 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:02:53.945995 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937822 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:02:53.945995 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.937826 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 18:02:53.945995 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.937831 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:02:53.945995 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.945919 2564 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 18:02:53.945995 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.945937 2564 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 18:02:53.945995 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.945994 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:02:53.945995 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946000 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:02:53.946313 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946004 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:02:53.946313 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946007 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:02:53.946313 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946010 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:02:53.946313 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946014 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 18:02:53.946313 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946018 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:02:53.946313 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946021 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:02:53.946313 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946024 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:02:53.946313 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946027 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:02:53.946313 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946030 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:02:53.946313 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946033 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:02:53.946313 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946035 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:02:53.946313 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946039 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:02:53.946313 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946042 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:02:53.946313 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946044 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:02:53.946313 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946047 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:02:53.946313 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946050 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:02:53.946313 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946052 2564 feature_gate.go:328] unrecognized 
feature gate: GCPClusterHostedDNS Apr 16 18:02:53.946313 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946055 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:02:53.946313 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946058 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:02:53.946880 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946060 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:02:53.946880 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946063 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:02:53.946880 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946065 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:02:53.946880 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946068 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:02:53.946880 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946071 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:02:53.946880 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946075 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:02:53.946880 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946077 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:02:53.946880 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946080 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:02:53.946880 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946083 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:02:53.946880 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946085 2564 feature_gate.go:328] unrecognized feature gate: 
MultiArchInstallAzure Apr 16 18:02:53.946880 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946088 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:02:53.946880 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946091 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:02:53.946880 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946093 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:02:53.946880 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946096 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:02:53.946880 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946100 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:02:53.946880 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946103 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:02:53.946880 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946105 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:02:53.946880 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946108 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:02:53.946880 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946110 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:02:53.947363 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946113 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:02:53.947363 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946115 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:02:53.947363 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946117 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:02:53.947363 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946120 2564 feature_gate.go:328] 
unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:02:53.947363 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946122 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:02:53.947363 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946125 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:02:53.947363 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946128 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:02:53.947363 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946130 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:02:53.947363 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946133 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:02:53.947363 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946135 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:02:53.947363 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946138 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:02:53.947363 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946140 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:02:53.947363 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946143 2564 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:02:53.947363 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946145 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:02:53.947363 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946148 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:02:53.947363 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946150 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:02:53.947363 ip-10-0-131-193 kubenswrapper[2564]: W0416 
18:02:53.946153 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:02:53.947363 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946155 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:02:53.947363 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946158 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:02:53.947363 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946161 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 18:02:53.947866 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946165 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:02:53.947866 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946167 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:02:53.947866 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946170 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:02:53.947866 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946172 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:02:53.947866 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946175 2564 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:02:53.947866 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946178 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:02:53.947866 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946180 2564 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:02:53.947866 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946183 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:02:53.947866 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946185 2564 feature_gate.go:328] unrecognized feature gate: 
GatewayAPI Apr 16 18:02:53.947866 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946188 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:02:53.947866 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946191 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:02:53.947866 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946193 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:02:53.947866 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946196 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:02:53.947866 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946198 2564 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:02:53.947866 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946201 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:02:53.947866 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946204 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:02:53.947866 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946206 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:02:53.947866 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946209 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:02:53.947866 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946211 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:02:53.947866 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946214 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:02:53.948337 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946217 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:02:53.948337 ip-10-0-131-193 kubenswrapper[2564]: W0416 
18:02:53.946220 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:02:53.948337 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946222 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:02:53.948337 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946224 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:02:53.948337 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946227 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:02:53.948337 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946230 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:02:53.948337 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.946235 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:02:53.948337 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946335 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:02:53.948337 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946340 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:02:53.948337 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946342 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:02:53.948337 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946345 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:02:53.948337 ip-10-0-131-193 kubenswrapper[2564]: 
W0416 18:02:53.946348 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:02:53.948337 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946351 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:02:53.948337 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946354 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:02:53.948337 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946356 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:02:53.948337 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946359 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:02:53.948751 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946361 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:02:53.948751 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946364 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:02:53.948751 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946366 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:02:53.948751 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946369 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:02:53.948751 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946372 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:02:53.948751 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946374 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:02:53.948751 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946377 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:02:53.948751 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946379 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 
18:02:53.948751 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946382 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:02:53.948751 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946385 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:02:53.948751 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946387 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:02:53.948751 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946390 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:02:53.948751 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946392 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:02:53.948751 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946395 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:02:53.948751 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946397 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:02:53.948751 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946400 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:02:53.948751 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946403 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:02:53.948751 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946406 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:02:53.948751 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946408 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:02:53.948751 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946411 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:02:53.949240 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946413 2564 feature_gate.go:328] unrecognized 
feature gate: SigstoreImageVerificationPKI Apr 16 18:02:53.949240 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946417 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 18:02:53.949240 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946421 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:02:53.949240 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946424 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:02:53.949240 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946428 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 18:02:53.949240 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946432 2564 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:02:53.949240 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946435 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:02:53.949240 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946437 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:02:53.949240 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946440 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:02:53.949240 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946442 2564 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:02:53.949240 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946445 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:02:53.949240 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946447 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:02:53.949240 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946450 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 
18:02:53.949240 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946452 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:02:53.949240 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946454 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:02:53.949240 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946457 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:02:53.949240 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946460 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:02:53.949240 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946462 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:02:53.949240 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946465 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:02:53.949729 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946467 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:02:53.949729 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946470 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:02:53.949729 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946472 2564 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:02:53.949729 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946475 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:02:53.949729 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946477 2564 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:02:53.949729 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946480 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:02:53.949729 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946482 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 
18:02:53.949729 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946485 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:02:53.949729 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946487 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:02:53.949729 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946489 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:02:53.949729 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946492 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:02:53.949729 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946495 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:02:53.949729 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946497 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:02:53.949729 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946500 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:02:53.949729 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946502 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:02:53.949729 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946504 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:02:53.949729 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946507 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:02:53.949729 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946509 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:02:53.949729 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946512 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:02:53.950179 
ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946514 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:02:53.950179 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946517 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:02:53.950179 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946519 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:02:53.950179 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946522 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:02:53.950179 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946524 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:02:53.950179 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946527 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:02:53.950179 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946529 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:02:53.950179 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946531 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:02:53.950179 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946534 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:02:53.950179 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946537 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:02:53.950179 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946539 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:02:53.950179 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946542 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:02:53.950179 ip-10-0-131-193 kubenswrapper[2564]: W0416 
18:02:53.946545 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:02:53.950179 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946547 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:02:53.950179 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946549 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:02:53.950179 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946552 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:02:53.950179 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946555 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:02:53.950179 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946557 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:02:53.950179 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:53.946560 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:02:53.950649 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.946564 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:02:53.950649 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.947159 2564 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 18:02:53.950649 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.949027 2564 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 
18:02:53.950649 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.950068 2564 server.go:1019] "Starting client certificate rotation" Apr 16 18:02:53.950649 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.950187 2564 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 18:02:53.950649 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.950223 2564 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 18:02:53.980067 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.980029 2564 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 18:02:53.982390 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:53.982366 2564 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 18:02:54.000931 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.000908 2564 log.go:25] "Validated CRI v1 runtime API" Apr 16 18:02:54.005748 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.005731 2564 log.go:25] "Validated CRI v1 image API" Apr 16 18:02:54.006989 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.006975 2564 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 18:02:54.012441 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.012415 2564 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 ca8bdbc2-0f5c-4379-85e1-725514664f56:/dev/nvme0n1p4 dfc46d37-a009-43b3-8734-41cba957f4fa:/dev/nvme0n1p3] Apr 16 18:02:54.012489 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.012442 2564 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run 
major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 18:02:54.018110 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.017993 2564 manager.go:217] Machine: {Timestamp:2026-04-16 18:02:54.01629503 +0000 UTC m=+0.386968928 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3108696 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b729c4802bcfc028ab3c29c43eaa4 SystemUUID:ec2b729c-4802-bcfc-028a-b3c29c43eaa4 BootID:43897626-4916-4956-9641-b2132dd34e7f Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:b5:26:a7:c3:1f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:b5:26:a7:c3:1f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:fe:25:6e:cc:60:bd Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified 
Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 18:02:54.018110 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.018106 2564 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 16 18:02:54.018243 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.018231 2564 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 18:02:54.020014 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.019958 2564 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 18:02:54.020415 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.020018 2564 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-131-193.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 18:02:54.020484 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.020423 2564 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 18:02:54.020484 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.020433 2564 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 18:02:54.020484 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.020446 
2564 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:02:54.022134 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.022123 2564 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:02:54.023508 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.023495 2564 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:02:54.023815 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.023805 2564 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 18:02:54.026329 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.026318 2564 kubelet.go:491] "Attempting to sync node with API server" Apr 16 18:02:54.026367 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.026333 2564 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 18:02:54.026367 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.026351 2564 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 18:02:54.026367 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.026366 2564 kubelet.go:397] "Adding apiserver pod source" Apr 16 18:02:54.026468 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.026379 2564 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 18:02:54.027602 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.027589 2564 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:02:54.027668 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.027621 2564 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:02:54.027758 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.027741 2564 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" 
reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 18:02:54.031065 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.031048 2564 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 18:02:54.032863 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.032848 2564 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 18:02:54.034394 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.034377 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 18:02:54.034451 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.034401 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 18:02:54.034451 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.034413 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 18:02:54.034451 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.034426 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 18:02:54.034451 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.034435 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 18:02:54.034451 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.034443 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 18:02:54.034451 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.034452 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 18:02:54.034736 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.034458 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 18:02:54.034736 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.034465 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 
16 18:02:54.034736 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.034471 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 18:02:54.034736 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.034480 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 18:02:54.034736 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.034489 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 18:02:54.035399 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.035385 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 18:02:54.035399 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.035396 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 18:02:54.039435 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.039421 2564 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 18:02:54.039508 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.039458 2564 server.go:1295] "Started kubelet" Apr 16 18:02:54.039621 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.039550 2564 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 18:02:54.039665 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.039589 2564 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 18:02:54.039695 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.039681 2564 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 18:02:54.040151 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.040127 2564 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-193.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 18:02:54.040287 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:54.040235 2564 
reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-193.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 18:02:54.040421 ip-10-0-131-193 systemd[1]: Started Kubernetes Kubelet. Apr 16 18:02:54.040894 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:54.040416 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 18:02:54.041071 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.040885 2564 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 18:02:54.041339 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.041323 2564 server.go:317] "Adding debug handlers to kubelet server" Apr 16 18:02:54.046198 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:54.045269 2564 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-193.ec2.internal.18a6e85979eb3f8f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-193.ec2.internal,UID:ip-10-0-131-193.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-193.ec2.internal,},FirstTimestamp:2026-04-16 18:02:54.039433103 +0000 UTC m=+0.410107002,LastTimestamp:2026-04-16 18:02:54.039433103 +0000 UTC m=+0.410107002,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-193.ec2.internal,}" Apr 16 18:02:54.047778 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.047765 2564 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 18:02:54.048252 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.048238 2564 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 18:02:54.049057 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.049037 2564 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 18:02:54.049057 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.049056 2564 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 18:02:54.049359 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.049133 2564 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 18:02:54.049359 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.049163 2564 factory.go:55] Registering systemd factory Apr 16 18:02:54.049359 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.049174 2564 factory.go:223] Registration of the systemd container factory successfully Apr 16 18:02:54.049359 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.049200 2564 reconstruct.go:97] "Volume reconstruction finished" Apr 16 18:02:54.049359 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.049213 2564 reconciler.go:26] "Reconciler: start to sync state" Apr 16 18:02:54.049359 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.049151 2564 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 18:02:54.049632 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.049446 2564 factory.go:153] Registering CRI-O factory Apr 16 18:02:54.049632 ip-10-0-131-193 
kubenswrapper[2564]: I0416 18:02:54.049462 2564 factory.go:223] Registration of the crio container factory successfully Apr 16 18:02:54.049632 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.049488 2564 factory.go:103] Registering Raw factory Apr 16 18:02:54.049632 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.049502 2564 manager.go:1196] Started watching for new ooms in manager Apr 16 18:02:54.049831 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:54.049811 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-193.ec2.internal\" not found" Apr 16 18:02:54.049934 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.049915 2564 manager.go:319] Starting recovery of all containers Apr 16 18:02:54.050025 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:54.050002 2564 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 18:02:54.061529 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:54.061457 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 18:02:54.061661 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:54.061518 2564 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-193.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 18:02:54.061661 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.061564 2564 manager.go:324] Recovery completed Apr 16 18:02:54.063624 ip-10-0-131-193 kubenswrapper[2564]: I0416 
18:02:54.063543 2564 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9n8db" Apr 16 18:02:54.065800 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.065787 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:02:54.068290 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.068274 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-193.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:02:54.068342 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.068303 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-193.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:02:54.068342 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.068313 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-193.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:02:54.068802 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.068787 2564 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 18:02:54.068802 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.068797 2564 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 18:02:54.068907 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.068817 2564 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:02:54.071535 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.071522 2564 policy_none.go:49] "None policy: Start" Apr 16 18:02:54.071596 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.071539 2564 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 18:02:54.071596 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.071550 2564 state_mem.go:35] "Initializing new in-memory state store" Apr 16 18:02:54.073678 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.073661 2564 csr.go:270] "Certificate signing request is issued" 
logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9n8db" Apr 16 18:02:54.112225 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.112208 2564 manager.go:341] "Starting Device Plugin manager" Apr 16 18:02:54.116832 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:54.112318 2564 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 18:02:54.116832 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.112332 2564 server.go:85] "Starting device plugin registration server" Apr 16 18:02:54.116832 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.112598 2564 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 18:02:54.116832 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.112624 2564 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 18:02:54.116832 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.112717 2564 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 18:02:54.116832 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.112794 2564 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 18:02:54.116832 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.112803 2564 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 18:02:54.116832 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:54.113348 2564 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 16 18:02:54.116832 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:54.113379 2564 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-193.ec2.internal\" not found" Apr 16 18:02:54.186457 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.186370 2564 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 18:02:54.187626 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.187594 2564 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 18:02:54.187735 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.187633 2564 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 18:02:54.187735 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.187657 2564 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 18:02:54.187735 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.187664 2564 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 18:02:54.187735 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:54.187696 2564 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 18:02:54.195243 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.195219 2564 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:02:54.212937 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.212921 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:02:54.215041 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.215024 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-193.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:02:54.215110 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.215056 2564 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-193.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:02:54.215110 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.215069 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-193.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:02:54.215110 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.215096 2564 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-193.ec2.internal" Apr 16 18:02:54.222575 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.222560 2564 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-193.ec2.internal" Apr 16 18:02:54.222645 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:54.222583 2564 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-193.ec2.internal\": node \"ip-10-0-131-193.ec2.internal\" not found" Apr 16 18:02:54.251644 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:54.251603 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-193.ec2.internal\" not found" Apr 16 18:02:54.288278 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.288246 2564 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-193.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-193.ec2.internal"] Apr 16 18:02:54.288421 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.288332 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:02:54.289282 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.289266 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-193.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:02:54.289370 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.289300 2564 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-131-193.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:02:54.289370 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.289314 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-193.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:02:54.291486 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.291471 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:02:54.291649 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.291633 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-193.ec2.internal" Apr 16 18:02:54.291700 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.291666 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:02:54.292248 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.292234 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-193.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:02:54.292313 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.292247 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-193.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:02:54.292313 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.292260 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-193.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:02:54.292313 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.292270 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-193.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:02:54.292313 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.292274 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-193.ec2.internal" 
event="NodeHasNoDiskPressure" Apr 16 18:02:54.292313 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.292284 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-193.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:02:54.294772 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.294757 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-193.ec2.internal" Apr 16 18:02:54.294847 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.294783 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:02:54.295438 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.295422 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-193.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:02:54.295502 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.295459 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-193.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:02:54.295502 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.295471 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-193.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:02:54.317246 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:54.317226 2564 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-193.ec2.internal\" not found" node="ip-10-0-131-193.ec2.internal" Apr 16 18:02:54.321698 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:54.321684 2564 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-193.ec2.internal\" not found" node="ip-10-0-131-193.ec2.internal" Apr 16 18:02:54.350902 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.350880 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5c150e8aee5bd9091901cfeb87f90560-config\") pod \"kube-apiserver-proxy-ip-10-0-131-193.ec2.internal\" (UID: \"5c150e8aee5bd9091901cfeb87f90560\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-193.ec2.internal" Apr 16 18:02:54.350963 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.350910 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3369dae444aea437eceaaa1a41477d3d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-193.ec2.internal\" (UID: \"3369dae444aea437eceaaa1a41477d3d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-193.ec2.internal" Apr 16 18:02:54.350963 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.350932 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3369dae444aea437eceaaa1a41477d3d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-193.ec2.internal\" (UID: \"3369dae444aea437eceaaa1a41477d3d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-193.ec2.internal" Apr 16 18:02:54.352257 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:54.352243 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-193.ec2.internal\" not found" Apr 16 18:02:54.451100 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.451036 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3369dae444aea437eceaaa1a41477d3d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-193.ec2.internal\" (UID: \"3369dae444aea437eceaaa1a41477d3d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-193.ec2.internal" Apr 16 18:02:54.451100 
ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.451071 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3369dae444aea437eceaaa1a41477d3d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-193.ec2.internal\" (UID: \"3369dae444aea437eceaaa1a41477d3d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-193.ec2.internal" Apr 16 18:02:54.451100 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.451087 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5c150e8aee5bd9091901cfeb87f90560-config\") pod \"kube-apiserver-proxy-ip-10-0-131-193.ec2.internal\" (UID: \"5c150e8aee5bd9091901cfeb87f90560\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-193.ec2.internal" Apr 16 18:02:54.451235 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.451126 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3369dae444aea437eceaaa1a41477d3d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-193.ec2.internal\" (UID: \"3369dae444aea437eceaaa1a41477d3d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-193.ec2.internal" Apr 16 18:02:54.451235 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.451156 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3369dae444aea437eceaaa1a41477d3d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-193.ec2.internal\" (UID: \"3369dae444aea437eceaaa1a41477d3d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-193.ec2.internal" Apr 16 18:02:54.451235 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.451129 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/5c150e8aee5bd9091901cfeb87f90560-config\") pod \"kube-apiserver-proxy-ip-10-0-131-193.ec2.internal\" (UID: \"5c150e8aee5bd9091901cfeb87f90560\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-193.ec2.internal" Apr 16 18:02:54.453136 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:54.453123 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-193.ec2.internal\" not found" Apr 16 18:02:54.553755 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:54.553716 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-193.ec2.internal\" not found" Apr 16 18:02:54.620946 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.620916 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-193.ec2.internal" Apr 16 18:02:54.624498 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.624484 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-193.ec2.internal" Apr 16 18:02:54.654646 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:54.654622 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-193.ec2.internal\" not found" Apr 16 18:02:54.755292 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:54.755193 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-193.ec2.internal\" not found" Apr 16 18:02:54.855706 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:54.855657 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-193.ec2.internal\" not found" Apr 16 18:02:54.950011 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.949979 2564 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 18:02:54.950823 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:54.950114 2564 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 18:02:54.956267 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:54.956247 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-193.ec2.internal\" not found" Apr 16 18:02:55.009187 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.009139 2564 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:02:55.026649 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.026627 2564 apiserver.go:52] "Watching apiserver" Apr 16 18:02:55.036916 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.036893 2564 reflector.go:430] "Caches populated" type="*v1.Pod" 
reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 18:02:55.037915 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.037893 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-jlhzz","openshift-multus/multus-additional-cni-plugins-lf54m","openshift-multus/multus-hwxs7","openshift-multus/network-metrics-daemon-2xjws","openshift-network-operator/iptables-alerter-q97cg","openshift-ovn-kubernetes/ovnkube-node-gxh6d","kube-system/konnectivity-agent-xgk8q","openshift-image-registry/node-ca-q8z42","openshift-network-diagnostics/network-check-target-xddxf","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fd6s8"] Apr 16 18:02:55.040873 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.040859 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" Apr 16 18:02:55.042849 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.042831 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lf54m" Apr 16 18:02:55.043535 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.043514 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 18:02:55.043635 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.043579 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:02:55.043846 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.043833 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-269wx\"" Apr 16 18:02:55.044996 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.044979 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-hwxs7" Apr 16 18:02:55.045304 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.045278 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 18:02:55.045426 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.045393 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 18:02:55.045426 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.045396 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 18:02:55.045513 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.045419 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 18:02:55.045513 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.045492 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-mkd72\"" Apr 16 18:02:55.045871 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.045855 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 18:02:55.047119 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.047098 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2xjws" Apr 16 18:02:55.047198 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:55.047180 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2xjws" podUID="47436224-f5c2-417d-bfdc-61bfb9b8939a" Apr 16 18:02:55.047264 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.047248 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-nh8xf\"" Apr 16 18:02:55.047322 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.047311 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 18:02:55.048347 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.048333 2564 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 18:02:55.048801 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.048787 2564 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-193.ec2.internal" Apr 16 18:02:55.049239 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.049224 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-q97cg" Apr 16 18:02:55.051512 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.051496 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.052715 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.052699 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:02:55.052797 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.052753 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-kslbq\"" Apr 16 18:02:55.052945 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.052930 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 18:02:55.052983 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.052935 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 18:02:55.053560 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.053548 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-xgk8q" Apr 16 18:02:55.054449 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.054433 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-run-openvswitch\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.054515 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.054455 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47436224-f5c2-417d-bfdc-61bfb9b8939a-metrics-certs\") pod \"network-metrics-daemon-2xjws\" (UID: \"47436224-f5c2-417d-bfdc-61bfb9b8939a\") " pod="openshift-multus/network-metrics-daemon-2xjws" Apr 16 18:02:55.054515 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.054470 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-host-run-netns\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.054515 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.054485 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-sys\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" Apr 16 18:02:55.054515 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.054506 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-host-var-lib-kubelet\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7" Apr 16 18:02:55.054664 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.054538 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-multus-daemon-config\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7" Apr 16 18:02:55.054664 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.054587 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-host-run-multus-certs\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7" Apr 16 18:02:55.054664 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.054642 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn77j\" (UniqueName: \"kubernetes.io/projected/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-kube-api-access-pn77j\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7" Apr 16 18:02:55.054763 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.054667 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-multus-socket-dir-parent\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7" Apr 16 18:02:55.054763 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.054687 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-env-overrides\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.054763 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.054711 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-etc-modprobe-d\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" Apr 16 18:02:55.054763 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.054724 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d775120d-67fa-413e-b39f-64f7ae31665d-etc-tuned\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" Apr 16 18:02:55.054763 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.054741 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4f6f788f-55c3-4a55-8674-eb821c1fc81e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lf54m\" (UID: \"4f6f788f-55c3-4a55-8674-eb821c1fc81e\") " pod="openshift-multus/multus-additional-cni-plugins-lf54m" Apr 16 18:02:55.054763 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.054754 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-system-cni-dir\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " 
pod="openshift-multus/multus-hwxs7" Apr 16 18:02:55.055042 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.054778 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-host-run-k8s-cni-cncf-io\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7" Apr 16 18:02:55.055042 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.054791 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-etc-openvswitch\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.055042 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.054806 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-host-slash\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.055042 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.054822 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4f6f788f-55c3-4a55-8674-eb821c1fc81e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lf54m\" (UID: \"4f6f788f-55c3-4a55-8674-eb821c1fc81e\") " pod="openshift-multus/multus-additional-cni-plugins-lf54m" Apr 16 18:02:55.055042 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.054860 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-node-log\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.055042 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.054918 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knlb4\" (UniqueName: \"kubernetes.io/projected/d775120d-67fa-413e-b39f-64f7ae31665d-kube-api-access-knlb4\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" Apr 16 18:02:55.055042 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.054952 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sr9z\" (UniqueName: \"kubernetes.io/projected/4f6f788f-55c3-4a55-8674-eb821c1fc81e-kube-api-access-9sr9z\") pod \"multus-additional-cni-plugins-lf54m\" (UID: \"4f6f788f-55c3-4a55-8674-eb821c1fc81e\") " pod="openshift-multus/multus-additional-cni-plugins-lf54m" Apr 16 18:02:55.055042 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.054976 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-hostroot\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7" Apr 16 18:02:55.055042 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055000 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-host-kubelet\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.055409 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055047 2564 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-host\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" Apr 16 18:02:55.055409 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055068 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4f6f788f-55c3-4a55-8674-eb821c1fc81e-os-release\") pod \"multus-additional-cni-plugins-lf54m\" (UID: \"4f6f788f-55c3-4a55-8674-eb821c1fc81e\") " pod="openshift-multus/multus-additional-cni-plugins-lf54m" Apr 16 18:02:55.055409 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055117 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdm6z\" (UniqueName: \"kubernetes.io/projected/06425cb3-b57a-4b2a-80fc-58a19bb8366e-kube-api-access-wdm6z\") pod \"iptables-alerter-q97cg\" (UID: \"06425cb3-b57a-4b2a-80fc-58a19bb8366e\") " pod="openshift-network-operator/iptables-alerter-q97cg" Apr 16 18:02:55.055409 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055157 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-run-ovn\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.055409 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055180 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-ovnkube-config\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.055409 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055219 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-etc-kubernetes\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" Apr 16 18:02:55.055409 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055273 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-etc-systemd\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" Apr 16 18:02:55.055409 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055293 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-multus-conf-dir\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7" Apr 16 18:02:55.055409 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055309 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrcx2\" (UniqueName: \"kubernetes.io/projected/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-kube-api-access-hrcx2\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.055409 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055322 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-lib-modules\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" Apr 16 18:02:55.055409 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055338 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-os-release\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7" Apr 16 18:02:55.055409 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055354 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-systemd-units\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.055409 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055367 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-host-cni-netd\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.055409 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055381 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-etc-sysctl-conf\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" Apr 16 18:02:55.055409 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055410 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-run\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" Apr 16 18:02:55.056068 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055435 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d775120d-67fa-413e-b39f-64f7ae31665d-tmp\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" Apr 16 18:02:55.056068 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055461 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4f6f788f-55c3-4a55-8674-eb821c1fc81e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lf54m\" (UID: \"4f6f788f-55c3-4a55-8674-eb821c1fc81e\") " pod="openshift-multus/multus-additional-cni-plugins-lf54m" Apr 16 18:02:55.056068 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055489 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-cni-binary-copy\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7" Apr 16 18:02:55.056068 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055544 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/06425cb3-b57a-4b2a-80fc-58a19bb8366e-iptables-alerter-script\") pod \"iptables-alerter-q97cg\" (UID: \"06425cb3-b57a-4b2a-80fc-58a19bb8366e\") " pod="openshift-network-operator/iptables-alerter-q97cg" Apr 16 
18:02:55.056068 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055566 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-run-systemd\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.056068 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055587 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-etc-sysconfig\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" Apr 16 18:02:55.056068 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055628 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4f6f788f-55c3-4a55-8674-eb821c1fc81e-cnibin\") pod \"multus-additional-cni-plugins-lf54m\" (UID: \"4f6f788f-55c3-4a55-8674-eb821c1fc81e\") " pod="openshift-multus/multus-additional-cni-plugins-lf54m" Apr 16 18:02:55.056068 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055651 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/06425cb3-b57a-4b2a-80fc-58a19bb8366e-host-slash\") pod \"iptables-alerter-q97cg\" (UID: \"06425cb3-b57a-4b2a-80fc-58a19bb8366e\") " pod="openshift-network-operator/iptables-alerter-q97cg" Apr 16 18:02:55.056068 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055675 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-log-socket\") pod \"ovnkube-node-gxh6d\" 
(UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.056068 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055697 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.056068 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055721 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 18:02:55.056068 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055723 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-var-lib-kubelet\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" Apr 16 18:02:55.056068 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055775 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f6f788f-55c3-4a55-8674-eb821c1fc81e-system-cni-dir\") pod \"multus-additional-cni-plugins-lf54m\" (UID: \"4f6f788f-55c3-4a55-8674-eb821c1fc81e\") " pod="openshift-multus/multus-additional-cni-plugins-lf54m" Apr 16 18:02:55.056068 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055703 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-q8z42" Apr 16 18:02:55.056068 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055804 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-host-run-netns\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7" Apr 16 18:02:55.056068 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055840 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-ovnkube-script-lib\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.056068 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055867 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4f6f788f-55c3-4a55-8674-eb821c1fc81e-cni-binary-copy\") pod \"multus-additional-cni-plugins-lf54m\" (UID: \"4f6f788f-55c3-4a55-8674-eb821c1fc81e\") " pod="openshift-multus/multus-additional-cni-plugins-lf54m" Apr 16 18:02:55.056818 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055887 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-cnibin\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7" Apr 16 18:02:55.056818 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055928 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qr4q\" (UniqueName: 
\"kubernetes.io/projected/47436224-f5c2-417d-bfdc-61bfb9b8939a-kube-api-access-7qr4q\") pod \"network-metrics-daemon-2xjws\" (UID: \"47436224-f5c2-417d-bfdc-61bfb9b8939a\") " pod="openshift-multus/network-metrics-daemon-2xjws"
Apr 16 18:02:55.056818 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.055973 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-etc-sysctl-d\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz"
Apr 16 18:02:55.056818 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.056000 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-host-var-lib-cni-bin\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7"
Apr 16 18:02:55.056818 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.056024 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-host-var-lib-cni-multus\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7"
Apr 16 18:02:55.056818 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.056055 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-var-lib-openvswitch\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:02:55.056818 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.056066 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 18:02:55.056818 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.056071 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-multus-cni-dir\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7"
Apr 16 18:02:55.056818 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.056126 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-etc-kubernetes\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7"
Apr 16 18:02:55.056818 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.056136 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 18:02:55.056818 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.056161 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-host-run-ovn-kubernetes\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:02:55.056818 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.056199 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-host-cni-bin\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:02:55.056818 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.056238 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-ovn-node-metrics-cert\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:02:55.056818 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.056357 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 18:02:55.056818 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.056572 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 18:02:55.058050 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.058032 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xddxf"
Apr 16 18:02:55.058126 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:55.058080 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xddxf" podUID="fff8e593-e247-4d63-b01b-84beef483739"
Apr 16 18:02:55.058473 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.058461 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 18:02:55.059048 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.059033 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-kfgnr\""
Apr 16 18:02:55.059624 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.059589 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 18:02:55.059717 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.059595 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 18:02:55.059717 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.059634 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-h2kvz\""
Apr 16 18:02:55.060065 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.060019 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-pmbd2\""
Apr 16 18:02:55.060065 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.060040 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 18:02:55.060065 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.059629 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 18:02:55.061671 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.060667 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 18:02:55.061671 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.060844 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fd6s8"
Apr 16 18:02:55.063970 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.063696 2564 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:02:55.063970 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.063788 2564 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-193.ec2.internal"
Apr 16 18:02:55.063970 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.063847 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-193.ec2.internal"]
Apr 16 18:02:55.063970 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.063912 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 18:02:55.064220 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.063861 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 18:02:55.065230 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.064946 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-nf4sf\""
Apr 16 18:02:55.065230 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.065161 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 18:02:55.069069 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.069053 2564 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:02:55.073494 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.073480 2564 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:02:55.073574 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.073560 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-131-193.ec2.internal"]
Apr 16 18:02:55.075052 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.075027 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 17:57:54 +0000 UTC" deadline="2027-09-10 16:13:43.378627496 +0000 UTC"
Apr 16 18:02:55.075119 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.075052 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12286h10m48.303578251s"
Apr 16 18:02:55.093086 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.093068 2564 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-tlfhw"
Apr 16 18:02:55.109180 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.109163 2564 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-tlfhw"
Apr 16 18:02:55.149958 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.149938 2564 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 18:02:55.156679 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.156654 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-host-cni-bin\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:02:55.156780 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.156694 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-ovn-node-metrics-cert\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:02:55.156780 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.156727 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l296h\" (UniqueName: \"kubernetes.io/projected/ab0cc5c6-ec31-4ed7-a4fa-9968260fc075-kube-api-access-l296h\") pod \"node-ca-q8z42\" (UID: \"ab0cc5c6-ec31-4ed7-a4fa-9968260fc075\") " pod="openshift-image-registry/node-ca-q8z42"
Apr 16 18:02:55.156780 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.156756 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/99d0082a-e1ec-41dc-a5de-2c8445917ba6-registration-dir\") pod \"aws-ebs-csi-driver-node-fd6s8\" (UID: \"99d0082a-e1ec-41dc-a5de-2c8445917ba6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fd6s8"
Apr 16 18:02:55.156920 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.156781 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-run-openvswitch\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:02:55.156920 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.156824 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47436224-f5c2-417d-bfdc-61bfb9b8939a-metrics-certs\") pod \"network-metrics-daemon-2xjws\" (UID: \"47436224-f5c2-417d-bfdc-61bfb9b8939a\") " pod="openshift-multus/network-metrics-daemon-2xjws"
Apr 16 18:02:55.156920 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.156827 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-host-cni-bin\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:02:55.156920 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.156869 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-run-openvswitch\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:02:55.157096 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.156926 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-host-run-netns\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:02:55.157096 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:55.157007 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:55.157096 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157032 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-sys\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz"
Apr 16 18:02:55.157287 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:55.157116 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47436224-f5c2-417d-bfdc-61bfb9b8939a-metrics-certs podName:47436224-f5c2-417d-bfdc-61bfb9b8939a nodeName:}" failed. No retries permitted until 2026-04-16 18:02:55.657069645 +0000 UTC m=+2.027743550 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/47436224-f5c2-417d-bfdc-61bfb9b8939a-metrics-certs") pod "network-metrics-daemon-2xjws" (UID: "47436224-f5c2-417d-bfdc-61bfb9b8939a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:55.157287 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157108 2564 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 18:02:55.157287 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157130 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-host-run-netns\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:02:55.157287 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157168 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-host-var-lib-kubelet\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7"
Apr 16 18:02:55.157287 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157207 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-sys\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz"
Apr 16 18:02:55.157287 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157239 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-multus-daemon-config\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7"
Apr 16 18:02:55.157287 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157261 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-host-run-multus-certs\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7"
Apr 16 18:02:55.157595 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157298 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-host-run-multus-certs\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7"
Apr 16 18:02:55.157595 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157304 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-host-var-lib-kubelet\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7"
Apr 16 18:02:55.157595 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157326 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pn77j\" (UniqueName: \"kubernetes.io/projected/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-kube-api-access-pn77j\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7"
Apr 16 18:02:55.157595 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157367 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6d32ccfb-d1d4-4972-9eae-bd16e530ce56-konnectivity-ca\") pod \"konnectivity-agent-xgk8q\" (UID: \"6d32ccfb-d1d4-4972-9eae-bd16e530ce56\") " pod="kube-system/konnectivity-agent-xgk8q"
Apr 16 18:02:55.157595 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157394 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-multus-socket-dir-parent\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7"
Apr 16 18:02:55.157595 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157417 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-env-overrides\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:02:55.157595 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157440 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-etc-modprobe-d\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz"
Apr 16 18:02:55.157595 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157459 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d775120d-67fa-413e-b39f-64f7ae31665d-etc-tuned\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz"
Apr 16 18:02:55.157595 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157482 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4f6f788f-55c3-4a55-8674-eb821c1fc81e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lf54m\" (UID: \"4f6f788f-55c3-4a55-8674-eb821c1fc81e\") " pod="openshift-multus/multus-additional-cni-plugins-lf54m"
Apr 16 18:02:55.157595 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157502 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-system-cni-dir\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7"
Apr 16 18:02:55.157595 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157523 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-host-run-k8s-cni-cncf-io\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7"
Apr 16 18:02:55.157595 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157546 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-etc-openvswitch\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:02:55.157595 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157567 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab0cc5c6-ec31-4ed7-a4fa-9968260fc075-host\") pod \"node-ca-q8z42\" (UID: \"ab0cc5c6-ec31-4ed7-a4fa-9968260fc075\") " pod="openshift-image-registry/node-ca-q8z42"
Apr 16 18:02:55.157595 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157580 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-etc-modprobe-d\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz"
Apr 16 18:02:55.158261 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157653 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-host-slash\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:02:55.158261 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157586 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-host-slash\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:02:55.158261 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157718 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4f6f788f-55c3-4a55-8674-eb821c1fc81e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lf54m\" (UID: \"4f6f788f-55c3-4a55-8674-eb821c1fc81e\") " pod="openshift-multus/multus-additional-cni-plugins-lf54m"
Apr 16 18:02:55.158261 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157749 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-node-log\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:02:55.158261 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157775 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-knlb4\" (UniqueName: \"kubernetes.io/projected/d775120d-67fa-413e-b39f-64f7ae31665d-kube-api-access-knlb4\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz"
Apr 16 18:02:55.158261 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157803 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9sr9z\" (UniqueName: \"kubernetes.io/projected/4f6f788f-55c3-4a55-8674-eb821c1fc81e-kube-api-access-9sr9z\") pod \"multus-additional-cni-plugins-lf54m\" (UID: \"4f6f788f-55c3-4a55-8674-eb821c1fc81e\") " pod="openshift-multus/multus-additional-cni-plugins-lf54m"
Apr 16 18:02:55.158261 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157829 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-hostroot\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7"
Apr 16 18:02:55.158261 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157842 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-multus-daemon-config\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7"
Apr 16 18:02:55.158261 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157862 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ab0cc5c6-ec31-4ed7-a4fa-9968260fc075-serviceca\") pod \"node-ca-q8z42\" (UID: \"ab0cc5c6-ec31-4ed7-a4fa-9968260fc075\") " pod="openshift-image-registry/node-ca-q8z42"
Apr 16 18:02:55.158261 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157894 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-host-kubelet\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:02:55.158261 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157907 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-env-overrides\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:02:55.158261 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157920 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-host\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz"
Apr 16 18:02:55.158261 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157958 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4f6f788f-55c3-4a55-8674-eb821c1fc81e-os-release\") pod \"multus-additional-cni-plugins-lf54m\" (UID: \"4f6f788f-55c3-4a55-8674-eb821c1fc81e\") " pod="openshift-multus/multus-additional-cni-plugins-lf54m"
Apr 16 18:02:55.158261 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.157992 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/99d0082a-e1ec-41dc-a5de-2c8445917ba6-sys-fs\") pod \"aws-ebs-csi-driver-node-fd6s8\" (UID: \"99d0082a-e1ec-41dc-a5de-2c8445917ba6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fd6s8"
Apr 16 18:02:55.158261 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158020 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wdm6z\" (UniqueName: \"kubernetes.io/projected/06425cb3-b57a-4b2a-80fc-58a19bb8366e-kube-api-access-wdm6z\") pod \"iptables-alerter-q97cg\" (UID: \"06425cb3-b57a-4b2a-80fc-58a19bb8366e\") " pod="openshift-network-operator/iptables-alerter-q97cg"
Apr 16 18:02:55.158261 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158027 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-node-log\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:02:55.158261 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158049 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-run-ovn\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:02:55.159055 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158053 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-host-run-k8s-cni-cncf-io\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7"
Apr 16 18:02:55.159055 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158075 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-ovnkube-config\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:02:55.159055 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158099 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-hostroot\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7"
Apr 16 18:02:55.159055 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158145 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-etc-kubernetes\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz"
Apr 16 18:02:55.159055 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158102 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-etc-kubernetes\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz"
Apr 16 18:02:55.159055 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158159 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-multus-socket-dir-parent\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7"
Apr 16 18:02:55.159055 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158185 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-etc-systemd\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz"
Apr 16 18:02:55.159055 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158212 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-multus-conf-dir\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7"
Apr 16 18:02:55.159055 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158216 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4f6f788f-55c3-4a55-8674-eb821c1fc81e-os-release\") pod \"multus-additional-cni-plugins-lf54m\" (UID: \"4f6f788f-55c3-4a55-8674-eb821c1fc81e\") " pod="openshift-multus/multus-additional-cni-plugins-lf54m"
Apr 16 18:02:55.159055 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158239 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrcx2\" (UniqueName: \"kubernetes.io/projected/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-kube-api-access-hrcx2\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:02:55.159055 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158291 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-etc-systemd\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz"
Apr 16 18:02:55.159055 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158333 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-multus-conf-dir\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7"
Apr 16 18:02:55.159055 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158390 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-lib-modules\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz"
Apr 16 18:02:55.159055 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158419 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-os-release\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7"
Apr 16 18:02:55.159055 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158447 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-systemd-units\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:02:55.159055 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158473 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-host-cni-netd\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:02:55.159055 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158499 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-etc-sysctl-conf\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz"
Apr 16 18:02:55.159055 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158505 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4f6f788f-55c3-4a55-8674-eb821c1fc81e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lf54m\" (UID: \"4f6f788f-55c3-4a55-8674-eb821c1fc81e\") " pod="openshift-multus/multus-additional-cni-plugins-lf54m"
Apr 16 18:02:55.159948 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158556 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-host-cni-netd\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:02:55.159948 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158556 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-os-release\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7"
Apr 16 18:02:55.159948 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158579 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-run\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz"
Apr 16 18:02:55.159948 ip-10-0-131-193
kubenswrapper[2564]: I0416 18:02:55.158625 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d775120d-67fa-413e-b39f-64f7ae31665d-tmp\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" Apr 16 18:02:55.159948 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158641 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-etc-sysctl-conf\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" Apr 16 18:02:55.159948 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158653 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4f6f788f-55c3-4a55-8674-eb821c1fc81e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lf54m\" (UID: \"4f6f788f-55c3-4a55-8674-eb821c1fc81e\") " pod="openshift-multus/multus-additional-cni-plugins-lf54m" Apr 16 18:02:55.159948 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158680 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-cni-binary-copy\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7" Apr 16 18:02:55.159948 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158709 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/06425cb3-b57a-4b2a-80fc-58a19bb8366e-iptables-alerter-script\") pod \"iptables-alerter-q97cg\" (UID: \"06425cb3-b57a-4b2a-80fc-58a19bb8366e\") " pod="openshift-network-operator/iptables-alerter-q97cg" Apr 16 
18:02:55.159948 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158748 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-run-systemd\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.159948 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158773 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-etc-sysconfig\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" Apr 16 18:02:55.159948 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158798 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4f6f788f-55c3-4a55-8674-eb821c1fc81e-cnibin\") pod \"multus-additional-cni-plugins-lf54m\" (UID: \"4f6f788f-55c3-4a55-8674-eb821c1fc81e\") " pod="openshift-multus/multus-additional-cni-plugins-lf54m" Apr 16 18:02:55.159948 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158826 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/99d0082a-e1ec-41dc-a5de-2c8445917ba6-device-dir\") pod \"aws-ebs-csi-driver-node-fd6s8\" (UID: \"99d0082a-e1ec-41dc-a5de-2c8445917ba6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fd6s8" Apr 16 18:02:55.159948 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158840 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-ovnkube-config\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.159948 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158854 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/06425cb3-b57a-4b2a-80fc-58a19bb8366e-host-slash\") pod \"iptables-alerter-q97cg\" (UID: \"06425cb3-b57a-4b2a-80fc-58a19bb8366e\") " pod="openshift-network-operator/iptables-alerter-q97cg" Apr 16 18:02:55.159948 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158894 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-host-kubelet\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.159948 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158900 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-log-socket\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.159948 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158907 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/06425cb3-b57a-4b2a-80fc-58a19bb8366e-host-slash\") pod \"iptables-alerter-q97cg\" (UID: \"06425cb3-b57a-4b2a-80fc-58a19bb8366e\") " pod="openshift-network-operator/iptables-alerter-q97cg" Apr 16 18:02:55.159948 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158939 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-log-socket\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.161895 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158940 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.161895 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158961 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-run\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" Apr 16 18:02:55.161895 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158974 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-var-lib-kubelet\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" Apr 16 18:02:55.161895 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158994 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4f6f788f-55c3-4a55-8674-eb821c1fc81e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lf54m\" (UID: \"4f6f788f-55c3-4a55-8674-eb821c1fc81e\") " pod="openshift-multus/multus-additional-cni-plugins-lf54m" Apr 16 18:02:55.161895 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.158498 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-run-ovn\") pod \"ovnkube-node-gxh6d\" 
(UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.161895 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.159002 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f6f788f-55c3-4a55-8674-eb821c1fc81e-system-cni-dir\") pod \"multus-additional-cni-plugins-lf54m\" (UID: \"4f6f788f-55c3-4a55-8674-eb821c1fc81e\") " pod="openshift-multus/multus-additional-cni-plugins-lf54m" Apr 16 18:02:55.161895 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.159049 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-systemd-units\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.161895 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.159058 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-host-run-netns\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7" Apr 16 18:02:55.161895 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.159086 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6d32ccfb-d1d4-4972-9eae-bd16e530ce56-agent-certs\") pod \"konnectivity-agent-xgk8q\" (UID: \"6d32ccfb-d1d4-4972-9eae-bd16e530ce56\") " pod="kube-system/konnectivity-agent-xgk8q" Apr 16 18:02:55.161895 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.159098 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-var-lib-kubelet\") pod 
\"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" Apr 16 18:02:55.161895 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.159115 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-ovnkube-script-lib\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.161895 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.159136 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.161895 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.159145 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4f6f788f-55c3-4a55-8674-eb821c1fc81e-cni-binary-copy\") pod \"multus-additional-cni-plugins-lf54m\" (UID: \"4f6f788f-55c3-4a55-8674-eb821c1fc81e\") " pod="openshift-multus/multus-additional-cni-plugins-lf54m" Apr 16 18:02:55.161895 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.159177 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-cnibin\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7" Apr 16 18:02:55.161895 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.159186 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-system-cni-dir\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7" Apr 16 18:02:55.161895 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.159204 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/99d0082a-e1ec-41dc-a5de-2c8445917ba6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fd6s8\" (UID: \"99d0082a-e1ec-41dc-a5de-2c8445917ba6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fd6s8" Apr 16 18:02:55.161895 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.159229 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/99d0082a-e1ec-41dc-a5de-2c8445917ba6-etc-selinux\") pod \"aws-ebs-csi-driver-node-fd6s8\" (UID: \"99d0082a-e1ec-41dc-a5de-2c8445917ba6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fd6s8" Apr 16 18:02:55.162382 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.159246 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-etc-sysconfig\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" Apr 16 18:02:55.162382 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.159261 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qr4q\" (UniqueName: \"kubernetes.io/projected/47436224-f5c2-417d-bfdc-61bfb9b8939a-kube-api-access-7qr4q\") pod \"network-metrics-daemon-2xjws\" (UID: \"47436224-f5c2-417d-bfdc-61bfb9b8939a\") " pod="openshift-multus/network-metrics-daemon-2xjws" Apr 16 18:02:55.162382 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.159289 2564 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-etc-sysctl-d\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" Apr 16 18:02:55.162382 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.159292 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-run-systemd\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.162382 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.159318 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-host-var-lib-cni-bin\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7" Apr 16 18:02:55.162382 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.159340 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-etc-openvswitch\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.162382 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.159345 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-host-var-lib-cni-multus\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7" Apr 16 18:02:55.162382 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.159375 2564 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/99d0082a-e1ec-41dc-a5de-2c8445917ba6-socket-dir\") pod \"aws-ebs-csi-driver-node-fd6s8\" (UID: \"99d0082a-e1ec-41dc-a5de-2c8445917ba6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fd6s8" Apr 16 18:02:55.162382 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.159421 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-var-lib-openvswitch\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.162382 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.159431 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-lib-modules\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" Apr 16 18:02:55.162382 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.159450 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-multus-cni-dir\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7" Apr 16 18:02:55.162382 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.159476 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-etc-kubernetes\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7" Apr 16 18:02:55.162382 ip-10-0-131-193 kubenswrapper[2564]: I0416 
18:02:55.159503 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk775\" (UniqueName: \"kubernetes.io/projected/fff8e593-e247-4d63-b01b-84beef483739-kube-api-access-pk775\") pod \"network-check-target-xddxf\" (UID: \"fff8e593-e247-4d63-b01b-84beef483739\") " pod="openshift-network-diagnostics/network-check-target-xddxf" Apr 16 18:02:55.162382 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.159515 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/06425cb3-b57a-4b2a-80fc-58a19bb8366e-iptables-alerter-script\") pod \"iptables-alerter-q97cg\" (UID: \"06425cb3-b57a-4b2a-80fc-58a19bb8366e\") " pod="openshift-network-operator/iptables-alerter-q97cg" Apr 16 18:02:55.162382 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.159545 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mzlb\" (UniqueName: \"kubernetes.io/projected/99d0082a-e1ec-41dc-a5de-2c8445917ba6-kube-api-access-4mzlb\") pod \"aws-ebs-csi-driver-node-fd6s8\" (UID: \"99d0082a-e1ec-41dc-a5de-2c8445917ba6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fd6s8" Apr 16 18:02:55.162382 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.159574 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-host-run-ovn-kubernetes\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.162382 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.159761 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4f6f788f-55c3-4a55-8674-eb821c1fc81e-cnibin\") pod \"multus-additional-cni-plugins-lf54m\" (UID: 
\"4f6f788f-55c3-4a55-8674-eb821c1fc81e\") " pod="openshift-multus/multus-additional-cni-plugins-lf54m" Apr 16 18:02:55.162858 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.160017 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4f6f788f-55c3-4a55-8674-eb821c1fc81e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lf54m\" (UID: \"4f6f788f-55c3-4a55-8674-eb821c1fc81e\") " pod="openshift-multus/multus-additional-cni-plugins-lf54m" Apr 16 18:02:55.162858 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.160023 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-cni-binary-copy\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7" Apr 16 18:02:55.162858 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.160089 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-var-lib-openvswitch\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.162858 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.160202 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-etc-sysctl-d\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" Apr 16 18:02:55.162858 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.160206 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-multus-cni-dir\") pod \"multus-hwxs7\" (UID: 
\"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7" Apr 16 18:02:55.162858 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.160246 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-host-var-lib-cni-bin\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7" Apr 16 18:02:55.162858 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.160261 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-etc-kubernetes\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7" Apr 16 18:02:55.162858 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.160288 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-host-var-lib-cni-multus\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7" Apr 16 18:02:55.162858 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.160661 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f6f788f-55c3-4a55-8674-eb821c1fc81e-system-cni-dir\") pod \"multus-additional-cni-plugins-lf54m\" (UID: \"4f6f788f-55c3-4a55-8674-eb821c1fc81e\") " pod="openshift-multus/multus-additional-cni-plugins-lf54m" Apr 16 18:02:55.162858 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.160738 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-cnibin\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " 
pod="openshift-multus/multus-hwxs7" Apr 16 18:02:55.162858 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.160890 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-ovnkube-script-lib\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.162858 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.160955 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-host-run-netns\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " pod="openshift-multus/multus-hwxs7" Apr 16 18:02:55.162858 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.161140 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-host-run-ovn-kubernetes\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.162858 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.161172 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4f6f788f-55c3-4a55-8674-eb821c1fc81e-cni-binary-copy\") pod \"multus-additional-cni-plugins-lf54m\" (UID: \"4f6f788f-55c3-4a55-8674-eb821c1fc81e\") " pod="openshift-multus/multus-additional-cni-plugins-lf54m" Apr 16 18:02:55.162858 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.161467 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d775120d-67fa-413e-b39f-64f7ae31665d-host\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " 
pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" Apr 16 18:02:55.162858 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.161673 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-ovn-node-metrics-cert\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:02:55.162858 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.161924 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d775120d-67fa-413e-b39f-64f7ae31665d-etc-tuned\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" Apr 16 18:02:55.162858 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.162149 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d775120d-67fa-413e-b39f-64f7ae31665d-tmp\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" Apr 16 18:02:55.172372 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.172313 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qr4q\" (UniqueName: \"kubernetes.io/projected/47436224-f5c2-417d-bfdc-61bfb9b8939a-kube-api-access-7qr4q\") pod \"network-metrics-daemon-2xjws\" (UID: \"47436224-f5c2-417d-bfdc-61bfb9b8939a\") " pod="openshift-multus/network-metrics-daemon-2xjws" Apr 16 18:02:55.172372 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.172326 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn77j\" (UniqueName: \"kubernetes.io/projected/2bcf7a84-59f0-42fe-970c-4db3c9cf9a52-kube-api-access-pn77j\") pod \"multus-hwxs7\" (UID: \"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52\") " 
pod="openshift-multus/multus-hwxs7"
Apr 16 18:02:55.172531 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.172494 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdm6z\" (UniqueName: \"kubernetes.io/projected/06425cb3-b57a-4b2a-80fc-58a19bb8366e-kube-api-access-wdm6z\") pod \"iptables-alerter-q97cg\" (UID: \"06425cb3-b57a-4b2a-80fc-58a19bb8366e\") " pod="openshift-network-operator/iptables-alerter-q97cg"
Apr 16 18:02:55.174389 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.174360 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sr9z\" (UniqueName: \"kubernetes.io/projected/4f6f788f-55c3-4a55-8674-eb821c1fc81e-kube-api-access-9sr9z\") pod \"multus-additional-cni-plugins-lf54m\" (UID: \"4f6f788f-55c3-4a55-8674-eb821c1fc81e\") " pod="openshift-multus/multus-additional-cni-plugins-lf54m"
Apr 16 18:02:55.174488 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.174416 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrcx2\" (UniqueName: \"kubernetes.io/projected/8aff675d-044c-4bf6-94c2-f3b57d9c5b7a-kube-api-access-hrcx2\") pod \"ovnkube-node-gxh6d\" (UID: \"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:02:55.174578 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.174561 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-knlb4\" (UniqueName: \"kubernetes.io/projected/d775120d-67fa-413e-b39f-64f7ae31665d-kube-api-access-knlb4\") pod \"tuned-jlhzz\" (UID: \"d775120d-67fa-413e-b39f-64f7ae31665d\") " pod="openshift-cluster-node-tuning-operator/tuned-jlhzz"
Apr 16 18:02:55.228945 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.228914 2564 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:02:55.262244 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.262179 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l296h\" (UniqueName: \"kubernetes.io/projected/ab0cc5c6-ec31-4ed7-a4fa-9968260fc075-kube-api-access-l296h\") pod \"node-ca-q8z42\" (UID: \"ab0cc5c6-ec31-4ed7-a4fa-9968260fc075\") " pod="openshift-image-registry/node-ca-q8z42"
Apr 16 18:02:55.262244 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.262212 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/99d0082a-e1ec-41dc-a5de-2c8445917ba6-registration-dir\") pod \"aws-ebs-csi-driver-node-fd6s8\" (UID: \"99d0082a-e1ec-41dc-a5de-2c8445917ba6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fd6s8"
Apr 16 18:02:55.262244 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.262245 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6d32ccfb-d1d4-4972-9eae-bd16e530ce56-konnectivity-ca\") pod \"konnectivity-agent-xgk8q\" (UID: \"6d32ccfb-d1d4-4972-9eae-bd16e530ce56\") " pod="kube-system/konnectivity-agent-xgk8q"
Apr 16 18:02:55.262480 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.262272 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab0cc5c6-ec31-4ed7-a4fa-9968260fc075-host\") pod \"node-ca-q8z42\" (UID: \"ab0cc5c6-ec31-4ed7-a4fa-9968260fc075\") " pod="openshift-image-registry/node-ca-q8z42"
Apr 16 18:02:55.262480 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.262300 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ab0cc5c6-ec31-4ed7-a4fa-9968260fc075-serviceca\") pod \"node-ca-q8z42\" (UID: \"ab0cc5c6-ec31-4ed7-a4fa-9968260fc075\") " pod="openshift-image-registry/node-ca-q8z42"
Apr 16 18:02:55.262480 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.262327 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/99d0082a-e1ec-41dc-a5de-2c8445917ba6-sys-fs\") pod \"aws-ebs-csi-driver-node-fd6s8\" (UID: \"99d0082a-e1ec-41dc-a5de-2c8445917ba6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fd6s8"
Apr 16 18:02:55.262480 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.262328 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/99d0082a-e1ec-41dc-a5de-2c8445917ba6-registration-dir\") pod \"aws-ebs-csi-driver-node-fd6s8\" (UID: \"99d0082a-e1ec-41dc-a5de-2c8445917ba6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fd6s8"
Apr 16 18:02:55.262480 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.262343 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab0cc5c6-ec31-4ed7-a4fa-9968260fc075-host\") pod \"node-ca-q8z42\" (UID: \"ab0cc5c6-ec31-4ed7-a4fa-9968260fc075\") " pod="openshift-image-registry/node-ca-q8z42"
Apr 16 18:02:55.262480 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.262401 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/99d0082a-e1ec-41dc-a5de-2c8445917ba6-sys-fs\") pod \"aws-ebs-csi-driver-node-fd6s8\" (UID: \"99d0082a-e1ec-41dc-a5de-2c8445917ba6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fd6s8"
Apr 16 18:02:55.262480 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.262440 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/99d0082a-e1ec-41dc-a5de-2c8445917ba6-device-dir\") pod \"aws-ebs-csi-driver-node-fd6s8\" (UID: \"99d0082a-e1ec-41dc-a5de-2c8445917ba6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fd6s8"
Apr 16 18:02:55.262480 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.262473 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6d32ccfb-d1d4-4972-9eae-bd16e530ce56-agent-certs\") pod \"konnectivity-agent-xgk8q\" (UID: \"6d32ccfb-d1d4-4972-9eae-bd16e530ce56\") " pod="kube-system/konnectivity-agent-xgk8q"
Apr 16 18:02:55.262480 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.262477 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/99d0082a-e1ec-41dc-a5de-2c8445917ba6-device-dir\") pod \"aws-ebs-csi-driver-node-fd6s8\" (UID: \"99d0082a-e1ec-41dc-a5de-2c8445917ba6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fd6s8"
Apr 16 18:02:55.262905 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.262500 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/99d0082a-e1ec-41dc-a5de-2c8445917ba6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fd6s8\" (UID: \"99d0082a-e1ec-41dc-a5de-2c8445917ba6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fd6s8"
Apr 16 18:02:55.262905 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.262525 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/99d0082a-e1ec-41dc-a5de-2c8445917ba6-etc-selinux\") pod \"aws-ebs-csi-driver-node-fd6s8\" (UID: \"99d0082a-e1ec-41dc-a5de-2c8445917ba6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fd6s8"
Apr 16 18:02:55.262905 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.262552 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/99d0082a-e1ec-41dc-a5de-2c8445917ba6-socket-dir\") pod \"aws-ebs-csi-driver-node-fd6s8\" (UID: \"99d0082a-e1ec-41dc-a5de-2c8445917ba6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fd6s8"
Apr 16 18:02:55.262905 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.262580 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pk775\" (UniqueName: \"kubernetes.io/projected/fff8e593-e247-4d63-b01b-84beef483739-kube-api-access-pk775\") pod \"network-check-target-xddxf\" (UID: \"fff8e593-e247-4d63-b01b-84beef483739\") " pod="openshift-network-diagnostics/network-check-target-xddxf"
Apr 16 18:02:55.262905 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.262600 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/99d0082a-e1ec-41dc-a5de-2c8445917ba6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fd6s8\" (UID: \"99d0082a-e1ec-41dc-a5de-2c8445917ba6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fd6s8"
Apr 16 18:02:55.262905 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.262622 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/99d0082a-e1ec-41dc-a5de-2c8445917ba6-etc-selinux\") pod \"aws-ebs-csi-driver-node-fd6s8\" (UID: \"99d0082a-e1ec-41dc-a5de-2c8445917ba6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fd6s8"
Apr 16 18:02:55.262905 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.262606 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mzlb\" (UniqueName: \"kubernetes.io/projected/99d0082a-e1ec-41dc-a5de-2c8445917ba6-kube-api-access-4mzlb\") pod \"aws-ebs-csi-driver-node-fd6s8\" (UID: \"99d0082a-e1ec-41dc-a5de-2c8445917ba6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fd6s8"
Apr 16 18:02:55.262905 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.262721 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/99d0082a-e1ec-41dc-a5de-2c8445917ba6-socket-dir\") pod \"aws-ebs-csi-driver-node-fd6s8\" (UID: \"99d0082a-e1ec-41dc-a5de-2c8445917ba6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fd6s8"
Apr 16 18:02:55.263227 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.262925 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ab0cc5c6-ec31-4ed7-a4fa-9968260fc075-serviceca\") pod \"node-ca-q8z42\" (UID: \"ab0cc5c6-ec31-4ed7-a4fa-9968260fc075\") " pod="openshift-image-registry/node-ca-q8z42"
Apr 16 18:02:55.263227 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.262940 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6d32ccfb-d1d4-4972-9eae-bd16e530ce56-konnectivity-ca\") pod \"konnectivity-agent-xgk8q\" (UID: \"6d32ccfb-d1d4-4972-9eae-bd16e530ce56\") " pod="kube-system/konnectivity-agent-xgk8q"
Apr 16 18:02:55.264935 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.264899 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6d32ccfb-d1d4-4972-9eae-bd16e530ce56-agent-certs\") pod \"konnectivity-agent-xgk8q\" (UID: \"6d32ccfb-d1d4-4972-9eae-bd16e530ce56\") " pod="kube-system/konnectivity-agent-xgk8q"
Apr 16 18:02:55.276417 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:55.276383 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c150e8aee5bd9091901cfeb87f90560.slice/crio-a37f2f6f7749151d98fbff0dca5a7186682846b80cf08280c538631cb94606f8 WatchSource:0}: Error finding container a37f2f6f7749151d98fbff0dca5a7186682846b80cf08280c538631cb94606f8: Status 404 returned error can't find the container with id a37f2f6f7749151d98fbff0dca5a7186682846b80cf08280c538631cb94606f8
Apr 16 18:02:55.276659 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:55.276644 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3369dae444aea437eceaaa1a41477d3d.slice/crio-3bfc4798b561ca810ee8f81ec9b9687e1fe6a86aa516b73a255bcd278ff75e8d WatchSource:0}: Error finding container 3bfc4798b561ca810ee8f81ec9b9687e1fe6a86aa516b73a255bcd278ff75e8d: Status 404 returned error can't find the container with id 3bfc4798b561ca810ee8f81ec9b9687e1fe6a86aa516b73a255bcd278ff75e8d
Apr 16 18:02:55.280310 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.280296 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:02:55.285150 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.285131 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l296h\" (UniqueName: \"kubernetes.io/projected/ab0cc5c6-ec31-4ed7-a4fa-9968260fc075-kube-api-access-l296h\") pod \"node-ca-q8z42\" (UID: \"ab0cc5c6-ec31-4ed7-a4fa-9968260fc075\") " pod="openshift-image-registry/node-ca-q8z42"
Apr 16 18:02:55.285323 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.285294 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mzlb\" (UniqueName: \"kubernetes.io/projected/99d0082a-e1ec-41dc-a5de-2c8445917ba6-kube-api-access-4mzlb\") pod \"aws-ebs-csi-driver-node-fd6s8\" (UID: \"99d0082a-e1ec-41dc-a5de-2c8445917ba6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fd6s8"
Apr 16 18:02:55.285396 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:55.285343 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:02:55.285396 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:55.285363 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:02:55.285396 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:55.285377 2564 projected.go:194] Error preparing data for projected volume kube-api-access-pk775 for pod openshift-network-diagnostics/network-check-target-xddxf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:55.285519 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:55.285451 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fff8e593-e247-4d63-b01b-84beef483739-kube-api-access-pk775 podName:fff8e593-e247-4d63-b01b-84beef483739 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:55.785431491 +0000 UTC m=+2.156105393 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-pk775" (UniqueName: "kubernetes.io/projected/fff8e593-e247-4d63-b01b-84beef483739-kube-api-access-pk775") pod "network-check-target-xddxf" (UID: "fff8e593-e247-4d63-b01b-84beef483739") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:55.367731 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.367703 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jlhzz"
Apr 16 18:02:55.374065 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:55.374040 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd775120d_67fa_413e_b39f_64f7ae31665d.slice/crio-eab04f34b4997055cab4ccc705eff0bd67eb6a712bac54987ec1edbc93e8e2f1 WatchSource:0}: Error finding container eab04f34b4997055cab4ccc705eff0bd67eb6a712bac54987ec1edbc93e8e2f1: Status 404 returned error can't find the container with id eab04f34b4997055cab4ccc705eff0bd67eb6a712bac54987ec1edbc93e8e2f1
Apr 16 18:02:55.377937 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.377919 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lf54m"
Apr 16 18:02:55.383712 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:55.383689 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f6f788f_55c3_4a55_8674_eb821c1fc81e.slice/crio-fe28983c0a9abb03a745e8254eb41df84f4f043c55aa21189d14fea9335d01bb WatchSource:0}: Error finding container fe28983c0a9abb03a745e8254eb41df84f4f043c55aa21189d14fea9335d01bb: Status 404 returned error can't find the container with id fe28983c0a9abb03a745e8254eb41df84f4f043c55aa21189d14fea9335d01bb
Apr 16 18:02:55.398623 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.398592 2564 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:02:55.399429 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.399412 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hwxs7"
Apr 16 18:02:55.405105 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:55.405077 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bcf7a84_59f0_42fe_970c_4db3c9cf9a52.slice/crio-c83e6c27748e0d4519d1b2ea6ed69858e4c13fbd0977b756224b99ccf2b49521 WatchSource:0}: Error finding container c83e6c27748e0d4519d1b2ea6ed69858e4c13fbd0977b756224b99ccf2b49521: Status 404 returned error can't find the container with id c83e6c27748e0d4519d1b2ea6ed69858e4c13fbd0977b756224b99ccf2b49521
Apr 16 18:02:55.418598 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.418576 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-q97cg"
Apr 16 18:02:55.424662 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:55.424637 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06425cb3_b57a_4b2a_80fc_58a19bb8366e.slice/crio-c3291f342c407e5dcaefb345cf833420b6cc95a50e43f390d38a6adec3257cba WatchSource:0}: Error finding container c3291f342c407e5dcaefb345cf833420b6cc95a50e43f390d38a6adec3257cba: Status 404 returned error can't find the container with id c3291f342c407e5dcaefb345cf833420b6cc95a50e43f390d38a6adec3257cba
Apr 16 18:02:55.437240 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.437221 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:02:55.443122 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:55.443099 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8aff675d_044c_4bf6_94c2_f3b57d9c5b7a.slice/crio-5dd6472cf0cb264e97cdaf505b7899a2df50e0b885a215873a0482a420dd96c0 WatchSource:0}: Error finding container 5dd6472cf0cb264e97cdaf505b7899a2df50e0b885a215873a0482a420dd96c0: Status 404 returned error can't find the container with id 5dd6472cf0cb264e97cdaf505b7899a2df50e0b885a215873a0482a420dd96c0
Apr 16 18:02:55.456782 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.456759 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xgk8q"
Apr 16 18:02:55.464410 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:55.464387 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d32ccfb_d1d4_4972_9eae_bd16e530ce56.slice/crio-1a15164bae829ff59f7a1eb8ca2a38ef17dc75c2631b05da4b802ffca8c48f9d WatchSource:0}: Error finding container 1a15164bae829ff59f7a1eb8ca2a38ef17dc75c2631b05da4b802ffca8c48f9d: Status 404 returned error can't find the container with id 1a15164bae829ff59f7a1eb8ca2a38ef17dc75c2631b05da4b802ffca8c48f9d
Apr 16 18:02:55.474375 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.474360 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-q8z42"
Apr 16 18:02:55.477909 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.477890 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fd6s8"
Apr 16 18:02:55.480540 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:55.480515 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab0cc5c6_ec31_4ed7_a4fa_9968260fc075.slice/crio-68a9337b04d7e8086b18ab7a07f7829a5f2ae7ac8c56c24bdda6e389ff73a417 WatchSource:0}: Error finding container 68a9337b04d7e8086b18ab7a07f7829a5f2ae7ac8c56c24bdda6e389ff73a417: Status 404 returned error can't find the container with id 68a9337b04d7e8086b18ab7a07f7829a5f2ae7ac8c56c24bdda6e389ff73a417
Apr 16 18:02:55.484254 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:02:55.484231 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99d0082a_e1ec_41dc_a5de_2c8445917ba6.slice/crio-1fec463ff958a8024d4e06e180d68b5f6263825aa33da69ae3fd81ddc900c7dc WatchSource:0}: Error finding container 1fec463ff958a8024d4e06e180d68b5f6263825aa33da69ae3fd81ddc900c7dc: Status 404 returned error can't find the container with id 1fec463ff958a8024d4e06e180d68b5f6263825aa33da69ae3fd81ddc900c7dc
Apr 16 18:02:55.665052 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.664958 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47436224-f5c2-417d-bfdc-61bfb9b8939a-metrics-certs\") pod \"network-metrics-daemon-2xjws\" (UID: \"47436224-f5c2-417d-bfdc-61bfb9b8939a\") " pod="openshift-multus/network-metrics-daemon-2xjws"
Apr 16 18:02:55.665250 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:55.665228 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:55.665337 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:55.665315 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47436224-f5c2-417d-bfdc-61bfb9b8939a-metrics-certs podName:47436224-f5c2-417d-bfdc-61bfb9b8939a nodeName:}" failed. No retries permitted until 2026-04-16 18:02:56.665294856 +0000 UTC m=+3.035968745 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/47436224-f5c2-417d-bfdc-61bfb9b8939a-metrics-certs") pod "network-metrics-daemon-2xjws" (UID: "47436224-f5c2-417d-bfdc-61bfb9b8939a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:55.866744 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:55.866704 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pk775\" (UniqueName: \"kubernetes.io/projected/fff8e593-e247-4d63-b01b-84beef483739-kube-api-access-pk775\") pod \"network-check-target-xddxf\" (UID: \"fff8e593-e247-4d63-b01b-84beef483739\") " pod="openshift-network-diagnostics/network-check-target-xddxf"
Apr 16 18:02:55.866931 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:55.866915 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:02:55.867032 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:55.866935 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:02:55.867032 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:55.866948 2564 projected.go:194] Error preparing data for projected volume kube-api-access-pk775 for pod openshift-network-diagnostics/network-check-target-xddxf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:55.867032 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:55.867009 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fff8e593-e247-4d63-b01b-84beef483739-kube-api-access-pk775 podName:fff8e593-e247-4d63-b01b-84beef483739 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:56.866990282 +0000 UTC m=+3.237664181 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-pk775" (UniqueName: "kubernetes.io/projected/fff8e593-e247-4d63-b01b-84beef483739-kube-api-access-pk775") pod "network-check-target-xddxf" (UID: "fff8e593-e247-4d63-b01b-84beef483739") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:56.110582 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:56.110490 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:57:55 +0000 UTC" deadline="2027-12-19 23:07:41.082211637 +0000 UTC"
Apr 16 18:02:56.110582 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:56.110535 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14693h4m44.971681048s"
Apr 16 18:02:56.190964 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:56.190462 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2xjws"
Apr 16 18:02:56.190964 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:56.190596 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2xjws" podUID="47436224-f5c2-417d-bfdc-61bfb9b8939a"
Apr 16 18:02:56.204292 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:56.204235 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-193.ec2.internal" event={"ID":"3369dae444aea437eceaaa1a41477d3d","Type":"ContainerStarted","Data":"3bfc4798b561ca810ee8f81ec9b9687e1fe6a86aa516b73a255bcd278ff75e8d"}
Apr 16 18:02:56.209933 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:56.209902 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xgk8q" event={"ID":"6d32ccfb-d1d4-4972-9eae-bd16e530ce56","Type":"ContainerStarted","Data":"1a15164bae829ff59f7a1eb8ca2a38ef17dc75c2631b05da4b802ffca8c48f9d"}
Apr 16 18:02:56.228450 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:56.228383 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" event={"ID":"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a","Type":"ContainerStarted","Data":"5dd6472cf0cb264e97cdaf505b7899a2df50e0b885a215873a0482a420dd96c0"}
Apr 16 18:02:56.238971 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:56.238939 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hwxs7" event={"ID":"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52","Type":"ContainerStarted","Data":"c83e6c27748e0d4519d1b2ea6ed69858e4c13fbd0977b756224b99ccf2b49521"}
Apr 16 18:02:56.258308 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:56.258275 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" event={"ID":"d775120d-67fa-413e-b39f-64f7ae31665d","Type":"ContainerStarted","Data":"eab04f34b4997055cab4ccc705eff0bd67eb6a712bac54987ec1edbc93e8e2f1"}
Apr 16 18:02:56.260747 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:56.260057 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-193.ec2.internal" event={"ID":"5c150e8aee5bd9091901cfeb87f90560","Type":"ContainerStarted","Data":"a37f2f6f7749151d98fbff0dca5a7186682846b80cf08280c538631cb94606f8"}
Apr 16 18:02:56.265003 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:56.261711 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fd6s8" event={"ID":"99d0082a-e1ec-41dc-a5de-2c8445917ba6","Type":"ContainerStarted","Data":"1fec463ff958a8024d4e06e180d68b5f6263825aa33da69ae3fd81ddc900c7dc"}
Apr 16 18:02:56.267753 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:56.267726 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-q8z42" event={"ID":"ab0cc5c6-ec31-4ed7-a4fa-9968260fc075","Type":"ContainerStarted","Data":"68a9337b04d7e8086b18ab7a07f7829a5f2ae7ac8c56c24bdda6e389ff73a417"}
Apr 16 18:02:56.273256 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:56.273229 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-q97cg" event={"ID":"06425cb3-b57a-4b2a-80fc-58a19bb8366e","Type":"ContainerStarted","Data":"c3291f342c407e5dcaefb345cf833420b6cc95a50e43f390d38a6adec3257cba"}
Apr 16 18:02:56.289252 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:56.288904 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lf54m" event={"ID":"4f6f788f-55c3-4a55-8674-eb821c1fc81e","Type":"ContainerStarted","Data":"fe28983c0a9abb03a745e8254eb41df84f4f043c55aa21189d14fea9335d01bb"}
Apr 16 18:02:56.325673 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:56.325644 2564 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:02:56.674340 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:56.673755 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47436224-f5c2-417d-bfdc-61bfb9b8939a-metrics-certs\") pod \"network-metrics-daemon-2xjws\" (UID: \"47436224-f5c2-417d-bfdc-61bfb9b8939a\") " pod="openshift-multus/network-metrics-daemon-2xjws"
Apr 16 18:02:56.674340 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:56.673912 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:56.674340 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:56.673974 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47436224-f5c2-417d-bfdc-61bfb9b8939a-metrics-certs podName:47436224-f5c2-417d-bfdc-61bfb9b8939a nodeName:}" failed. No retries permitted until 2026-04-16 18:02:58.673956766 +0000 UTC m=+5.044630666 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/47436224-f5c2-417d-bfdc-61bfb9b8939a-metrics-certs") pod "network-metrics-daemon-2xjws" (UID: "47436224-f5c2-417d-bfdc-61bfb9b8939a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:56.876456 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:56.875793 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pk775\" (UniqueName: \"kubernetes.io/projected/fff8e593-e247-4d63-b01b-84beef483739-kube-api-access-pk775\") pod \"network-check-target-xddxf\" (UID: \"fff8e593-e247-4d63-b01b-84beef483739\") " pod="openshift-network-diagnostics/network-check-target-xddxf"
Apr 16 18:02:56.876456 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:56.875995 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:02:56.876456 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:56.876014 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:02:56.876456 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:56.876028 2564 projected.go:194] Error preparing data for projected volume kube-api-access-pk775 for pod openshift-network-diagnostics/network-check-target-xddxf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:56.876456 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:56.876086 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fff8e593-e247-4d63-b01b-84beef483739-kube-api-access-pk775 podName:fff8e593-e247-4d63-b01b-84beef483739 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:58.876068426 +0000 UTC m=+5.246742315 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-pk775" (UniqueName: "kubernetes.io/projected/fff8e593-e247-4d63-b01b-84beef483739-kube-api-access-pk775") pod "network-check-target-xddxf" (UID: "fff8e593-e247-4d63-b01b-84beef483739") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:57.111516 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:57.111356 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:57:55 +0000 UTC" deadline="2027-11-04 09:56:54.582785509 +0000 UTC"
Apr 16 18:02:57.111516 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:57.111400 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13599h53m57.471389589s"
Apr 16 18:02:57.188846 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:57.188809 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xddxf"
Apr 16 18:02:57.189026 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:57.188937 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xddxf" podUID="fff8e593-e247-4d63-b01b-84beef483739"
Apr 16 18:02:58.188573 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:58.188029 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2xjws"
Apr 16 18:02:58.188573 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:58.188167 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2xjws" podUID="47436224-f5c2-417d-bfdc-61bfb9b8939a"
Apr 16 18:02:58.692294 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:58.692257 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47436224-f5c2-417d-bfdc-61bfb9b8939a-metrics-certs\") pod \"network-metrics-daemon-2xjws\" (UID: \"47436224-f5c2-417d-bfdc-61bfb9b8939a\") " pod="openshift-multus/network-metrics-daemon-2xjws"
Apr 16 18:02:58.692487 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:58.692428 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:58.692549 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:58.692490 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47436224-f5c2-417d-bfdc-61bfb9b8939a-metrics-certs podName:47436224-f5c2-417d-bfdc-61bfb9b8939a nodeName:}" failed. No retries permitted until 2026-04-16 18:03:02.692472253 +0000 UTC m=+9.063146156 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/47436224-f5c2-417d-bfdc-61bfb9b8939a-metrics-certs") pod "network-metrics-daemon-2xjws" (UID: "47436224-f5c2-417d-bfdc-61bfb9b8939a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:58.893889 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:58.893851 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pk775\" (UniqueName: \"kubernetes.io/projected/fff8e593-e247-4d63-b01b-84beef483739-kube-api-access-pk775\") pod \"network-check-target-xddxf\" (UID: \"fff8e593-e247-4d63-b01b-84beef483739\") " pod="openshift-network-diagnostics/network-check-target-xddxf"
Apr 16 18:02:58.894067 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:58.894037 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:02:58.894067 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:58.894056 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:02:58.894180 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:58.894069 2564 projected.go:194] Error preparing data for projected volume kube-api-access-pk775 for pod openshift-network-diagnostics/network-check-target-xddxf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:58.894180 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:58.894125 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fff8e593-e247-4d63-b01b-84beef483739-kube-api-access-pk775 podName:fff8e593-e247-4d63-b01b-84beef483739 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:02.894107148 +0000 UTC m=+9.264781046 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-pk775" (UniqueName: "kubernetes.io/projected/fff8e593-e247-4d63-b01b-84beef483739-kube-api-access-pk775") pod "network-check-target-xddxf" (UID: "fff8e593-e247-4d63-b01b-84beef483739") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:59.189152 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:02:59.188942 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xddxf"
Apr 16 18:02:59.189152 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:02:59.189079 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xddxf" podUID="fff8e593-e247-4d63-b01b-84beef483739"
Apr 16 18:03:00.189001 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:00.188967 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2xjws"
Apr 16 18:03:00.189167 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:00.189108 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-2xjws" podUID="47436224-f5c2-417d-bfdc-61bfb9b8939a" Apr 16 18:03:01.188241 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:01.188208 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xddxf" Apr 16 18:03:01.188415 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:01.188329 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xddxf" podUID="fff8e593-e247-4d63-b01b-84beef483739" Apr 16 18:03:02.189179 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:02.188829 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2xjws" Apr 16 18:03:02.189179 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:02.188984 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2xjws" podUID="47436224-f5c2-417d-bfdc-61bfb9b8939a" Apr 16 18:03:02.728508 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:02.728472 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47436224-f5c2-417d-bfdc-61bfb9b8939a-metrics-certs\") pod \"network-metrics-daemon-2xjws\" (UID: \"47436224-f5c2-417d-bfdc-61bfb9b8939a\") " pod="openshift-multus/network-metrics-daemon-2xjws" Apr 16 18:03:02.728690 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:02.728642 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:03:02.728762 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:02.728709 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47436224-f5c2-417d-bfdc-61bfb9b8939a-metrics-certs podName:47436224-f5c2-417d-bfdc-61bfb9b8939a nodeName:}" failed. No retries permitted until 2026-04-16 18:03:10.728692616 +0000 UTC m=+17.099366502 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/47436224-f5c2-417d-bfdc-61bfb9b8939a-metrics-certs") pod "network-metrics-daemon-2xjws" (UID: "47436224-f5c2-417d-bfdc-61bfb9b8939a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:03:02.930142 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:02.930110 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pk775\" (UniqueName: \"kubernetes.io/projected/fff8e593-e247-4d63-b01b-84beef483739-kube-api-access-pk775\") pod \"network-check-target-xddxf\" (UID: \"fff8e593-e247-4d63-b01b-84beef483739\") " pod="openshift-network-diagnostics/network-check-target-xddxf" Apr 16 18:03:02.930322 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:02.930267 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:03:02.930322 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:02.930289 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:03:02.930322 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:02.930302 2564 projected.go:194] Error preparing data for projected volume kube-api-access-pk775 for pod openshift-network-diagnostics/network-check-target-xddxf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:03:02.930490 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:02.930367 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fff8e593-e247-4d63-b01b-84beef483739-kube-api-access-pk775 podName:fff8e593-e247-4d63-b01b-84beef483739 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:03:10.930347259 +0000 UTC m=+17.301021169 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-pk775" (UniqueName: "kubernetes.io/projected/fff8e593-e247-4d63-b01b-84beef483739-kube-api-access-pk775") pod "network-check-target-xddxf" (UID: "fff8e593-e247-4d63-b01b-84beef483739") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:03:03.188741 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:03.188660 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xddxf" Apr 16 18:03:03.188919 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:03.188794 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xddxf" podUID="fff8e593-e247-4d63-b01b-84beef483739" Apr 16 18:03:04.188895 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:04.188862 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2xjws" Apr 16 18:03:04.189282 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:04.189007 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2xjws" podUID="47436224-f5c2-417d-bfdc-61bfb9b8939a" Apr 16 18:03:05.187908 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:05.187870 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xddxf" Apr 16 18:03:05.188097 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:05.188000 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xddxf" podUID="fff8e593-e247-4d63-b01b-84beef483739" Apr 16 18:03:06.188260 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:06.188223 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2xjws" Apr 16 18:03:06.188696 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:06.188364 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2xjws" podUID="47436224-f5c2-417d-bfdc-61bfb9b8939a" Apr 16 18:03:07.188240 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:07.188205 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xddxf" Apr 16 18:03:07.188408 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:07.188314 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xddxf" podUID="fff8e593-e247-4d63-b01b-84beef483739" Apr 16 18:03:08.188488 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:08.188454 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2xjws" Apr 16 18:03:08.188934 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:08.188592 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2xjws" podUID="47436224-f5c2-417d-bfdc-61bfb9b8939a" Apr 16 18:03:09.187868 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:09.187838 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xddxf" Apr 16 18:03:09.188034 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:09.187963 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xddxf" podUID="fff8e593-e247-4d63-b01b-84beef483739" Apr 16 18:03:10.188778 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:10.188739 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2xjws" Apr 16 18:03:10.189195 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:10.188864 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2xjws" podUID="47436224-f5c2-417d-bfdc-61bfb9b8939a" Apr 16 18:03:10.785920 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:10.785881 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47436224-f5c2-417d-bfdc-61bfb9b8939a-metrics-certs\") pod \"network-metrics-daemon-2xjws\" (UID: \"47436224-f5c2-417d-bfdc-61bfb9b8939a\") " pod="openshift-multus/network-metrics-daemon-2xjws" Apr 16 18:03:10.786076 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:10.786023 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:03:10.786121 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:10.786089 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47436224-f5c2-417d-bfdc-61bfb9b8939a-metrics-certs podName:47436224-f5c2-417d-bfdc-61bfb9b8939a nodeName:}" failed. No retries permitted until 2026-04-16 18:03:26.786071478 +0000 UTC m=+33.156745368 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/47436224-f5c2-417d-bfdc-61bfb9b8939a-metrics-certs") pod "network-metrics-daemon-2xjws" (UID: "47436224-f5c2-417d-bfdc-61bfb9b8939a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:03:10.987533 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:10.987492 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pk775\" (UniqueName: \"kubernetes.io/projected/fff8e593-e247-4d63-b01b-84beef483739-kube-api-access-pk775\") pod \"network-check-target-xddxf\" (UID: \"fff8e593-e247-4d63-b01b-84beef483739\") " pod="openshift-network-diagnostics/network-check-target-xddxf" Apr 16 18:03:10.987761 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:10.987678 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:03:10.987761 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:10.987706 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:03:10.987761 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:10.987722 2564 projected.go:194] Error preparing data for projected volume kube-api-access-pk775 for pod openshift-network-diagnostics/network-check-target-xddxf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:03:10.987920 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:10.987785 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fff8e593-e247-4d63-b01b-84beef483739-kube-api-access-pk775 podName:fff8e593-e247-4d63-b01b-84beef483739 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:03:26.9877661 +0000 UTC m=+33.358439986 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-pk775" (UniqueName: "kubernetes.io/projected/fff8e593-e247-4d63-b01b-84beef483739-kube-api-access-pk775") pod "network-check-target-xddxf" (UID: "fff8e593-e247-4d63-b01b-84beef483739") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:03:11.188134 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:11.188043 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xddxf" Apr 16 18:03:11.188292 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:11.188181 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xddxf" podUID="fff8e593-e247-4d63-b01b-84beef483739" Apr 16 18:03:12.188748 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:12.188708 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2xjws" Apr 16 18:03:12.189133 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:12.188876 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2xjws" podUID="47436224-f5c2-417d-bfdc-61bfb9b8939a" Apr 16 18:03:13.188296 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:13.188136 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xddxf" Apr 16 18:03:13.188385 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:13.188368 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xddxf" podUID="fff8e593-e247-4d63-b01b-84beef483739" Apr 16 18:03:13.324788 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:13.324765 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log" Apr 16 18:03:13.325420 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:13.325158 2564 generic.go:358] "Generic (PLEG): container finished" podID="8aff675d-044c-4bf6-94c2-f3b57d9c5b7a" containerID="c5251d497cf0d10959965b823e1cd8255d6bf917ec617ffa91044bde28fe4219" exitCode=1 Apr 16 18:03:13.325420 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:13.325231 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" event={"ID":"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a","Type":"ContainerStarted","Data":"e1a439ed03a91792de2f7bd2cf967e534da3e79032780e947b6f11879e86c876"} Apr 16 18:03:13.325420 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:13.325271 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" 
event={"ID":"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a","Type":"ContainerDied","Data":"c5251d497cf0d10959965b823e1cd8255d6bf917ec617ffa91044bde28fe4219"} Apr 16 18:03:13.325420 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:13.325290 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" event={"ID":"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a","Type":"ContainerStarted","Data":"b634aaadae14991613049826082d08e16386da052e223191f41f4c27bd309324"} Apr 16 18:03:13.326806 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:13.326783 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hwxs7" event={"ID":"2bcf7a84-59f0-42fe-970c-4db3c9cf9a52","Type":"ContainerStarted","Data":"fe85ead0d031e4319e782a5836c96189791d46ff937b8bc02363d98283a821c2"} Apr 16 18:03:13.328307 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:13.328265 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" event={"ID":"d775120d-67fa-413e-b39f-64f7ae31665d","Type":"ContainerStarted","Data":"56d2734896915094fc89f1ea378cf093f1ba07b454d8711a8f6a14be2bfbc27f"} Apr 16 18:03:13.329991 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:13.329954 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-193.ec2.internal" event={"ID":"5c150e8aee5bd9091901cfeb87f90560","Type":"ContainerStarted","Data":"e98952766fb1ebc5368a023b856e14e94abf3baa8c6ab1ecdee259870853ff9f"} Apr 16 18:03:13.350050 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:13.349988 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-hwxs7" podStartSLOduration=1.973089768 podStartE2EDuration="19.349971284s" podCreationTimestamp="2026-04-16 18:02:54 +0000 UTC" firstStartedPulling="2026-04-16 18:02:55.406692525 +0000 UTC m=+1.777366410" lastFinishedPulling="2026-04-16 18:03:12.783574025 +0000 UTC m=+19.154247926" 
observedRunningTime="2026-04-16 18:03:13.3497118 +0000 UTC m=+19.720385718" watchObservedRunningTime="2026-04-16 18:03:13.349971284 +0000 UTC m=+19.720645193" Apr 16 18:03:13.395006 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:13.394863 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-193.ec2.internal" podStartSLOduration=18.394847208 podStartE2EDuration="18.394847208s" podCreationTimestamp="2026-04-16 18:02:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:03:13.369209403 +0000 UTC m=+19.739883311" watchObservedRunningTime="2026-04-16 18:03:13.394847208 +0000 UTC m=+19.765521092" Apr 16 18:03:13.395222 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:13.395184 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-jlhzz" podStartSLOduration=2.024371228 podStartE2EDuration="19.395174872s" podCreationTimestamp="2026-04-16 18:02:54 +0000 UTC" firstStartedPulling="2026-04-16 18:02:55.375526591 +0000 UTC m=+1.746200476" lastFinishedPulling="2026-04-16 18:03:12.746330233 +0000 UTC m=+19.117004120" observedRunningTime="2026-04-16 18:03:13.39500166 +0000 UTC m=+19.765675568" watchObservedRunningTime="2026-04-16 18:03:13.395174872 +0000 UTC m=+19.765848779" Apr 16 18:03:14.189862 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:14.189594 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2xjws" Apr 16 18:03:14.190016 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:14.189918 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2xjws" podUID="47436224-f5c2-417d-bfdc-61bfb9b8939a" Apr 16 18:03:14.333077 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:14.333044 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fd6s8" event={"ID":"99d0082a-e1ec-41dc-a5de-2c8445917ba6","Type":"ContainerStarted","Data":"d4a01d7c7084986e85768f573c01ff60797ad0fc9fd3cee5bb052b58ff08cc72"} Apr 16 18:03:14.334466 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:14.334429 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-q8z42" event={"ID":"ab0cc5c6-ec31-4ed7-a4fa-9968260fc075","Type":"ContainerStarted","Data":"75127e6ac1c801fcc5bd0ba5f4a0b5736354a6b78de31d8d8a7b59e6b01ab843"} Apr 16 18:03:14.335922 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:14.335887 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-q97cg" event={"ID":"06425cb3-b57a-4b2a-80fc-58a19bb8366e","Type":"ContainerStarted","Data":"6cb766a0272302c98fc8b5bc3f79b688b770d7a4d503408b6e480afcfb44b4c7"} Apr 16 18:03:14.337464 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:14.337439 2564 generic.go:358] "Generic (PLEG): container finished" podID="4f6f788f-55c3-4a55-8674-eb821c1fc81e" containerID="8e782bfcb80945d41ed9df6ba1bf39457d41987a3ad80b1e58e99f797cc7fd29" exitCode=0 Apr 16 18:03:14.337549 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:14.337524 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lf54m" event={"ID":"4f6f788f-55c3-4a55-8674-eb821c1fc81e","Type":"ContainerDied","Data":"8e782bfcb80945d41ed9df6ba1bf39457d41987a3ad80b1e58e99f797cc7fd29"} Apr 16 18:03:14.338980 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:14.338960 2564 generic.go:358] "Generic (PLEG): container finished" podID="3369dae444aea437eceaaa1a41477d3d" 
containerID="7f8e56e56e85a1f79c2f55d82299361499fb8043a1ee9d863664f516c8581e6e" exitCode=0 Apr 16 18:03:14.339098 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:14.339072 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-193.ec2.internal" event={"ID":"3369dae444aea437eceaaa1a41477d3d","Type":"ContainerDied","Data":"7f8e56e56e85a1f79c2f55d82299361499fb8043a1ee9d863664f516c8581e6e"} Apr 16 18:03:14.340590 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:14.340556 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xgk8q" event={"ID":"6d32ccfb-d1d4-4972-9eae-bd16e530ce56","Type":"ContainerStarted","Data":"de1f3e6f60dff327ea5c578b47f768966cc6ce3e18e2013543b6e3d2324c6058"} Apr 16 18:03:14.343169 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:14.343150 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log" Apr 16 18:03:14.343591 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:14.343568 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" event={"ID":"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a","Type":"ContainerStarted","Data":"0044400f05b9ca27ac50513e0ce2464dcd74ca33fe13abcb243a63323562caea"} Apr 16 18:03:14.343671 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:14.343600 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" event={"ID":"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a","Type":"ContainerStarted","Data":"f35db0daa5138e59ffacbfc08f1b79106ad9dd4844fd6b56905d452cd786faf0"} Apr 16 18:03:14.343671 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:14.343631 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" 
event={"ID":"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a","Type":"ContainerStarted","Data":"5cf5b58fbf320309c8204b4db596bb39e4ce12aff9e0baafc1dad6c6d493fcb0"}
Apr 16 18:03:14.359098 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:14.359062 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-q8z42" podStartSLOduration=3.094932108 podStartE2EDuration="20.359051949s" podCreationTimestamp="2026-04-16 18:02:54 +0000 UTC" firstStartedPulling="2026-04-16 18:02:55.48194493 +0000 UTC m=+1.852618816" lastFinishedPulling="2026-04-16 18:03:12.746064758 +0000 UTC m=+19.116738657" observedRunningTime="2026-04-16 18:03:14.358507156 +0000 UTC m=+20.729181064" watchObservedRunningTime="2026-04-16 18:03:14.359051949 +0000 UTC m=+20.729725856"
Apr 16 18:03:14.433755 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:14.433706 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-q97cg" podStartSLOduration=3.076708401 podStartE2EDuration="20.433688795s" podCreationTimestamp="2026-04-16 18:02:54 +0000 UTC" firstStartedPulling="2026-04-16 18:02:55.426139007 +0000 UTC m=+1.796812892" lastFinishedPulling="2026-04-16 18:03:12.783119393 +0000 UTC m=+19.153793286" observedRunningTime="2026-04-16 18:03:14.412035171 +0000 UTC m=+20.782709080" watchObservedRunningTime="2026-04-16 18:03:14.433688795 +0000 UTC m=+20.804362706"
Apr 16 18:03:14.433937 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:14.433912 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-xgk8q" podStartSLOduration=3.153738528 podStartE2EDuration="20.433908275s" podCreationTimestamp="2026-04-16 18:02:54 +0000 UTC" firstStartedPulling="2026-04-16 18:02:55.465891177 +0000 UTC m=+1.836565076" lastFinishedPulling="2026-04-16 18:03:12.746060925 +0000 UTC m=+19.116734823" observedRunningTime="2026-04-16 18:03:14.43343262 +0000 UTC m=+20.804106527" watchObservedRunningTime="2026-04-16 18:03:14.433908275 +0000 UTC m=+20.804582181"
Apr 16 18:03:14.604283 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:14.604263 2564 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 18:03:15.123725 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:15.123596 2564 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:03:14.604278405Z","UUID":"6a3c59cf-df00-4ac3-a22e-97d581e38c7b","Handler":null,"Name":"","Endpoint":""}
Apr 16 18:03:15.126086 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:15.126063 2564 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 18:03:15.126208 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:15.126095 2564 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 18:03:15.188529 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:15.188492 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xddxf"
Apr 16 18:03:15.188715 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:15.188636 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xddxf" podUID="fff8e593-e247-4d63-b01b-84beef483739"
Apr 16 18:03:15.347827 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:15.347792 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-193.ec2.internal" event={"ID":"3369dae444aea437eceaaa1a41477d3d","Type":"ContainerStarted","Data":"aa4e824b6b60764debbb3778d6d5de2916c7a7b6c0ade2ca483c3f299a5e7f3f"}
Apr 16 18:03:15.349992 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:15.349943 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fd6s8" event={"ID":"99d0082a-e1ec-41dc-a5de-2c8445917ba6","Type":"ContainerStarted","Data":"3e041e1e5bebaee186f1a4aa8e5be370973567a1c6f2efcf9597eecc80891163"}
Apr 16 18:03:15.364948 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:15.364899 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-193.ec2.internal" podStartSLOduration=20.364886025 podStartE2EDuration="20.364886025s" podCreationTimestamp="2026-04-16 18:02:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:03:15.364559465 +0000 UTC m=+21.735233373" watchObservedRunningTime="2026-04-16 18:03:15.364886025 +0000 UTC m=+21.735559931"
Apr 16 18:03:16.188891 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:16.188643 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2xjws"
Apr 16 18:03:16.189168 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:16.189008 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2xjws" podUID="47436224-f5c2-417d-bfdc-61bfb9b8939a"
Apr 16 18:03:16.354355 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:16.354315 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fd6s8" event={"ID":"99d0082a-e1ec-41dc-a5de-2c8445917ba6","Type":"ContainerStarted","Data":"9fa5669b55fe00d4670e2da8ddcb52609df5063935fff814c213ccf053a25172"}
Apr 16 18:03:16.357469 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:16.357450 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log"
Apr 16 18:03:16.357890 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:16.357863 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" event={"ID":"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a","Type":"ContainerStarted","Data":"f521e5a10af332ed253271ca8c5d3802b71c25afd022925f4c754192832ef13a"}
Apr 16 18:03:16.375401 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:16.375359 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fd6s8" podStartSLOduration=2.438558014 podStartE2EDuration="22.375346601s" podCreationTimestamp="2026-04-16 18:02:54 +0000 UTC" firstStartedPulling="2026-04-16 18:02:55.485699267 +0000 UTC m=+1.856373153" lastFinishedPulling="2026-04-16 18:03:15.422487847 +0000 UTC m=+21.793161740" observedRunningTime="2026-04-16 18:03:16.375227173 +0000 UTC m=+22.745901081" watchObservedRunningTime="2026-04-16 18:03:16.375346601 +0000 UTC m=+22.746020509"
Apr 16 18:03:17.099704 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:17.099668 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-xgk8q"
Apr 16 18:03:17.100505 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:17.100487 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-xgk8q"
Apr 16 18:03:17.188727 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:17.188689 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xddxf"
Apr 16 18:03:17.188892 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:17.188826 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xddxf" podUID="fff8e593-e247-4d63-b01b-84beef483739"
Apr 16 18:03:17.360347 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:17.360266 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-xgk8q"
Apr 16 18:03:17.360740 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:17.360604 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-xgk8q"
Apr 16 18:03:18.188587 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:18.188555 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2xjws"
Apr 16 18:03:18.188774 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:18.188716 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2xjws" podUID="47436224-f5c2-417d-bfdc-61bfb9b8939a"
Apr 16 18:03:18.365919 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:18.365900 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log"
Apr 16 18:03:19.188685 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:19.188645 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xddxf"
Apr 16 18:03:19.188868 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:19.188760 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xddxf" podUID="fff8e593-e247-4d63-b01b-84beef483739"
Apr 16 18:03:19.368881 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:19.368851 2564 generic.go:358] "Generic (PLEG): container finished" podID="4f6f788f-55c3-4a55-8674-eb821c1fc81e" containerID="6b7feaf4ce80082bf59a8c2880c2d09f5a146765fcc8c016e7233af26144df8e" exitCode=0
Apr 16 18:03:19.369285 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:19.368921 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lf54m" event={"ID":"4f6f788f-55c3-4a55-8674-eb821c1fc81e","Type":"ContainerDied","Data":"6b7feaf4ce80082bf59a8c2880c2d09f5a146765fcc8c016e7233af26144df8e"}
Apr 16 18:03:19.371773 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:19.371753 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log"
Apr 16 18:03:19.372124 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:19.372099 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" event={"ID":"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a","Type":"ContainerStarted","Data":"2470fbc402992c3c1e1a1d8354ce75627705941eaaff07fbb0b1347640fff20f"}
Apr 16 18:03:19.372598 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:19.372586 2564 scope.go:117] "RemoveContainer" containerID="c5251d497cf0d10959965b823e1cd8255d6bf917ec617ffa91044bde28fe4219"
Apr 16 18:03:20.188312 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:20.188279 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2xjws"
Apr 16 18:03:20.188474 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:20.188431 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2xjws" podUID="47436224-f5c2-417d-bfdc-61bfb9b8939a"
Apr 16 18:03:20.269024 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:20.268998 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xddxf"]
Apr 16 18:03:20.269144 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:20.269130 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xddxf"
Apr 16 18:03:20.269264 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:20.269238 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xddxf" podUID="fff8e593-e247-4d63-b01b-84beef483739"
Apr 16 18:03:20.271653 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:20.271626 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2xjws"]
Apr 16 18:03:20.377394 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:20.377319 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log"
Apr 16 18:03:20.377798 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:20.377716 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" event={"ID":"8aff675d-044c-4bf6-94c2-f3b57d9c5b7a","Type":"ContainerStarted","Data":"28e3bb55e6af83f76aecdc2ed029c1c7d2aa3bc3addb75f6d7f6b2bdd5293ae2"}
Apr 16 18:03:20.378171 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:20.378134 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:03:20.378171 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:20.378173 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:03:20.378350 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:20.378187 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:03:20.379821 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:20.379798 2564 generic.go:358] "Generic (PLEG): container finished" podID="4f6f788f-55c3-4a55-8674-eb821c1fc81e" containerID="f33a3d33a5fba583a2ff0baba093b2a37cd073d8efd51656464ed206c12bbcfd" exitCode=0
Apr 16 18:03:20.379915 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:20.379845 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lf54m" event={"ID":"4f6f788f-55c3-4a55-8674-eb821c1fc81e","Type":"ContainerDied","Data":"f33a3d33a5fba583a2ff0baba093b2a37cd073d8efd51656464ed206c12bbcfd"}
Apr 16 18:03:20.379915 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:20.379871 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2xjws"
Apr 16 18:03:20.380018 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:20.379962 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2xjws" podUID="47436224-f5c2-417d-bfdc-61bfb9b8939a"
Apr 16 18:03:20.393464 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:20.393306 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:03:20.393559 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:20.393511 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d"
Apr 16 18:03:20.407335 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:20.407297 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" podStartSLOduration=8.767946026 podStartE2EDuration="26.40728497s" podCreationTimestamp="2026-04-16 18:02:54 +0000 UTC" firstStartedPulling="2026-04-16 18:02:55.44463766 +0000 UTC m=+1.815311545" lastFinishedPulling="2026-04-16 18:03:13.083976602 +0000 UTC m=+19.454650489" observedRunningTime="2026-04-16 18:03:20.40599973 +0000 UTC m=+26.776673637" watchObservedRunningTime="2026-04-16 18:03:20.40728497 +0000 UTC m=+26.777958877"
Apr 16 18:03:21.380237 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:21.380202 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-lk9pc"]
Apr 16 18:03:21.383110 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:21.383082 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lk9pc"
Apr 16 18:03:21.384020 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:21.383993 2564 generic.go:358] "Generic (PLEG): container finished" podID="4f6f788f-55c3-4a55-8674-eb821c1fc81e" containerID="ac9c672b7ca219ab0db2ee34b4a94c3553133c87319c51705b1f684b24a05ad3" exitCode=0
Apr 16 18:03:21.384130 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:21.384064 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lf54m" event={"ID":"4f6f788f-55c3-4a55-8674-eb821c1fc81e","Type":"ContainerDied","Data":"ac9c672b7ca219ab0db2ee34b4a94c3553133c87319c51705b1f684b24a05ad3"}
Apr 16 18:03:21.385917 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:21.385901 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-6g7d8\""
Apr 16 18:03:21.386423 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:21.386403 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 18:03:21.386486 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:21.386406 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 18:03:21.460437 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:21.460399 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ee7d5ac7-7dde-40a2-97ed-71acf303ea5d-hosts-file\") pod \"node-resolver-lk9pc\" (UID: \"ee7d5ac7-7dde-40a2-97ed-71acf303ea5d\") " pod="openshift-dns/node-resolver-lk9pc"
Apr 16 18:03:21.460437 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:21.460433 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktvlc\" (UniqueName: \"kubernetes.io/projected/ee7d5ac7-7dde-40a2-97ed-71acf303ea5d-kube-api-access-ktvlc\") pod \"node-resolver-lk9pc\" (UID: \"ee7d5ac7-7dde-40a2-97ed-71acf303ea5d\") " pod="openshift-dns/node-resolver-lk9pc"
Apr 16 18:03:21.460705 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:21.460573 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ee7d5ac7-7dde-40a2-97ed-71acf303ea5d-tmp-dir\") pod \"node-resolver-lk9pc\" (UID: \"ee7d5ac7-7dde-40a2-97ed-71acf303ea5d\") " pod="openshift-dns/node-resolver-lk9pc"
Apr 16 18:03:21.561936 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:21.561904 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ee7d5ac7-7dde-40a2-97ed-71acf303ea5d-hosts-file\") pod \"node-resolver-lk9pc\" (UID: \"ee7d5ac7-7dde-40a2-97ed-71acf303ea5d\") " pod="openshift-dns/node-resolver-lk9pc"
Apr 16 18:03:21.561936 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:21.561938 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktvlc\" (UniqueName: \"kubernetes.io/projected/ee7d5ac7-7dde-40a2-97ed-71acf303ea5d-kube-api-access-ktvlc\") pod \"node-resolver-lk9pc\" (UID: \"ee7d5ac7-7dde-40a2-97ed-71acf303ea5d\") " pod="openshift-dns/node-resolver-lk9pc"
Apr 16 18:03:21.562155 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:21.561960 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ee7d5ac7-7dde-40a2-97ed-71acf303ea5d-tmp-dir\") pod \"node-resolver-lk9pc\" (UID: \"ee7d5ac7-7dde-40a2-97ed-71acf303ea5d\") " pod="openshift-dns/node-resolver-lk9pc"
Apr 16 18:03:21.562155 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:21.562028 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ee7d5ac7-7dde-40a2-97ed-71acf303ea5d-hosts-file\") pod \"node-resolver-lk9pc\" (UID: \"ee7d5ac7-7dde-40a2-97ed-71acf303ea5d\") " pod="openshift-dns/node-resolver-lk9pc"
Apr 16 18:03:21.562242 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:21.562196 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ee7d5ac7-7dde-40a2-97ed-71acf303ea5d-tmp-dir\") pod \"node-resolver-lk9pc\" (UID: \"ee7d5ac7-7dde-40a2-97ed-71acf303ea5d\") " pod="openshift-dns/node-resolver-lk9pc"
Apr 16 18:03:21.571446 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:21.571424 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktvlc\" (UniqueName: \"kubernetes.io/projected/ee7d5ac7-7dde-40a2-97ed-71acf303ea5d-kube-api-access-ktvlc\") pod \"node-resolver-lk9pc\" (UID: \"ee7d5ac7-7dde-40a2-97ed-71acf303ea5d\") " pod="openshift-dns/node-resolver-lk9pc"
Apr 16 18:03:21.692498 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:21.692463 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lk9pc"
Apr 16 18:03:21.703928 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:03:21.703895 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee7d5ac7_7dde_40a2_97ed_71acf303ea5d.slice/crio-ed95baadf0e694f850ba52f14d564b20d936c673bce465a3c4b0f1ca36e63ff5 WatchSource:0}: Error finding container ed95baadf0e694f850ba52f14d564b20d936c673bce465a3c4b0f1ca36e63ff5: Status 404 returned error can't find the container with id ed95baadf0e694f850ba52f14d564b20d936c673bce465a3c4b0f1ca36e63ff5
Apr 16 18:03:22.192656 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:22.192628 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xddxf"
Apr 16 18:03:22.192812 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:22.192628 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2xjws"
Apr 16 18:03:22.192812 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:22.192751 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xddxf" podUID="fff8e593-e247-4d63-b01b-84beef483739"
Apr 16 18:03:22.192917 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:22.192847 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2xjws" podUID="47436224-f5c2-417d-bfdc-61bfb9b8939a"
Apr 16 18:03:22.388268 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:22.388231 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lk9pc" event={"ID":"ee7d5ac7-7dde-40a2-97ed-71acf303ea5d","Type":"ContainerStarted","Data":"2e75f271c6a8af13629c511768bb4933b30e61bdf97fe5c8d4c4de4b51167905"}
Apr 16 18:03:22.389194 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:22.388286 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lk9pc" event={"ID":"ee7d5ac7-7dde-40a2-97ed-71acf303ea5d","Type":"ContainerStarted","Data":"ed95baadf0e694f850ba52f14d564b20d936c673bce465a3c4b0f1ca36e63ff5"}
Apr 16 18:03:22.408183 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:22.408140 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-lk9pc" podStartSLOduration=1.408123522 podStartE2EDuration="1.408123522s" podCreationTimestamp="2026-04-16 18:03:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:03:22.407588793 +0000 UTC m=+28.778262700" watchObservedRunningTime="2026-04-16 18:03:22.408123522 +0000 UTC m=+28.778797430"
Apr 16 18:03:24.189868 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:24.189579 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xddxf"
Apr 16 18:03:24.190259 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:24.189677 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2xjws"
Apr 16 18:03:24.190259 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:24.189919 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xddxf" podUID="fff8e593-e247-4d63-b01b-84beef483739"
Apr 16 18:03:24.190259 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:24.190024 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2xjws" podUID="47436224-f5c2-417d-bfdc-61bfb9b8939a"
Apr 16 18:03:25.969603 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:25.969575 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-193.ec2.internal" event="NodeReady"
Apr 16 18:03:25.970029 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:25.969741 2564 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 18:03:26.041727 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.041688 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5d459db89c-vh5fb"]
Apr 16 18:03:26.080859 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.080821 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-s8vvh"]
Apr 16 18:03:26.081028 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.080990 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d459db89c-vh5fb"
Apr 16 18:03:26.088352 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.088325 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 18:03:26.088760 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.088744 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 18:03:26.088878 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.088863 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-dnqxl\""
Apr 16 18:03:26.090149 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.090134 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 18:03:26.110046 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.109310 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-s8vvh"]
Apr 16 18:03:26.110046 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.109346 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5d459db89c-vh5fb"]
Apr 16 18:03:26.110046 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.109477 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s8vvh"
Apr 16 18:03:26.110803 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.110704 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 18:03:26.116642 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.116601 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 18:03:26.178115 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.178075 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-2t59p"]
Apr 16 18:03:26.195755 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.195724 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5b47d4c2-caca-46dd-b761-e865026dbdb9-ca-trust-extracted\") pod \"image-registry-5d459db89c-vh5fb\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb"
Apr 16 18:03:26.195911 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.195768 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b47d4c2-caca-46dd-b761-e865026dbdb9-trusted-ca\") pod \"image-registry-5d459db89c-vh5fb\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb"
Apr 16 18:03:26.195911 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.195847 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5b47d4c2-caca-46dd-b761-e865026dbdb9-installation-pull-secrets\") pod \"image-registry-5d459db89c-vh5fb\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb"
Apr 16 18:03:26.195911 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.195901 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c81b1e66-a5cc-48eb-b28a-83101de54fcf-kubelet-config\") pod \"global-pull-secret-syncer-s8vvh\" (UID: \"c81b1e66-a5cc-48eb-b28a-83101de54fcf\") " pod="kube-system/global-pull-secret-syncer-s8vvh"
Apr 16 18:03:26.196058 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.195933 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-bound-sa-token\") pod \"image-registry-5d459db89c-vh5fb\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb"
Apr 16 18:03:26.196058 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.195992 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5b47d4c2-caca-46dd-b761-e865026dbdb9-image-registry-private-configuration\") pod \"image-registry-5d459db89c-vh5fb\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb"
Apr 16 18:03:26.196058 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.196034 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c81b1e66-a5cc-48eb-b28a-83101de54fcf-original-pull-secret\") pod \"global-pull-secret-syncer-s8vvh\" (UID: \"c81b1e66-a5cc-48eb-b28a-83101de54fcf\") " pod="kube-system/global-pull-secret-syncer-s8vvh"
Apr 16 18:03:26.196172 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.196095 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-tls\") pod \"image-registry-5d459db89c-vh5fb\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb"
Apr 16 18:03:26.196172 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.196112 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-certificates\") pod \"image-registry-5d459db89c-vh5fb\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb"
Apr 16 18:03:26.196172 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.196129 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b92kq\" (UniqueName: \"kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-kube-api-access-b92kq\") pod \"image-registry-5d459db89c-vh5fb\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb"
Apr 16 18:03:26.196276 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.196192 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c81b1e66-a5cc-48eb-b28a-83101de54fcf-dbus\") pod \"global-pull-secret-syncer-s8vvh\" (UID: \"c81b1e66-a5cc-48eb-b28a-83101de54fcf\") " pod="kube-system/global-pull-secret-syncer-s8vvh"
Apr 16 18:03:26.207728 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.207693 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2htvb"]
Apr 16 18:03:26.207868 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.207854 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2t59p"
Apr 16 18:03:26.211091 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.211066 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 18:03:26.211218 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.211101 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zzqks\""
Apr 16 18:03:26.211346 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.211327 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 18:03:26.237383 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.237308 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2htvb"
Apr 16 18:03:26.240272 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.240040 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 18:03:26.240272 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.240069 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fkw8n\""
Apr 16 18:03:26.240272 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.240077 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 18:03:26.240272 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.240048 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 18:03:26.240947 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.240928 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2htvb"]
Apr 16 18:03:26.240947 ip-10-0-131-193 kubenswrapper[2564]:
I0416 18:03:26.240949 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2t59p"] Apr 16 18:03:26.241115 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.241064 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xddxf" Apr 16 18:03:26.241115 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.241082 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2xjws" Apr 16 18:03:26.243682 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.243656 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:03:26.243682 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.243663 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:03:26.243823 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.243703 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:03:26.243823 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.243743 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-z2hb6\"" Apr 16 18:03:26.244051 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.244038 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9qtxn\"" Apr 16 18:03:26.297444 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.297411 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-tls\") pod \"image-registry-5d459db89c-vh5fb\" (UID: 
\"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" Apr 16 18:03:26.297640 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.297452 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-certificates\") pod \"image-registry-5d459db89c-vh5fb\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" Apr 16 18:03:26.297640 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.297476 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b92kq\" (UniqueName: \"kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-kube-api-access-b92kq\") pod \"image-registry-5d459db89c-vh5fb\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" Apr 16 18:03:26.297640 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.297504 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjtqc\" (UniqueName: \"kubernetes.io/projected/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-kube-api-access-bjtqc\") pod \"dns-default-2t59p\" (UID: \"8abd6ace-6d1c-4823-ba9a-f0f8e17c654c\") " pod="openshift-dns/dns-default-2t59p" Apr 16 18:03:26.297640 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.297537 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-tmp-dir\") pod \"dns-default-2t59p\" (UID: \"8abd6ace-6d1c-4823-ba9a-f0f8e17c654c\") " pod="openshift-dns/dns-default-2t59p" Apr 16 18:03:26.297640 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.297566 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/c81b1e66-a5cc-48eb-b28a-83101de54fcf-dbus\") pod \"global-pull-secret-syncer-s8vvh\" (UID: \"c81b1e66-a5cc-48eb-b28a-83101de54fcf\") " pod="kube-system/global-pull-secret-syncer-s8vvh" Apr 16 18:03:26.297640 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:26.297585 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:03:26.297640 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:26.297623 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d459db89c-vh5fb: secret "image-registry-tls" not found Apr 16 18:03:26.297640 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.297593 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-config-volume\") pod \"dns-default-2t59p\" (UID: \"8abd6ace-6d1c-4823-ba9a-f0f8e17c654c\") " pod="openshift-dns/dns-default-2t59p" Apr 16 18:03:26.298050 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:26.297690 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-tls podName:5b47d4c2-caca-46dd-b761-e865026dbdb9 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:26.797666022 +0000 UTC m=+33.168339918 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-tls") pod "image-registry-5d459db89c-vh5fb" (UID: "5b47d4c2-caca-46dd-b761-e865026dbdb9") : secret "image-registry-tls" not found Apr 16 18:03:26.298050 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.297749 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5b47d4c2-caca-46dd-b761-e865026dbdb9-ca-trust-extracted\") pod \"image-registry-5d459db89c-vh5fb\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" Apr 16 18:03:26.298050 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.297788 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b47d4c2-caca-46dd-b761-e865026dbdb9-trusted-ca\") pod \"image-registry-5d459db89c-vh5fb\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" Apr 16 18:03:26.298050 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.297821 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5b47d4c2-caca-46dd-b761-e865026dbdb9-installation-pull-secrets\") pod \"image-registry-5d459db89c-vh5fb\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" Apr 16 18:03:26.298050 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.297855 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c81b1e66-a5cc-48eb-b28a-83101de54fcf-kubelet-config\") pod \"global-pull-secret-syncer-s8vvh\" (UID: \"c81b1e66-a5cc-48eb-b28a-83101de54fcf\") " pod="kube-system/global-pull-secret-syncer-s8vvh" Apr 
16 18:03:26.298050 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.297885 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-bound-sa-token\") pod \"image-registry-5d459db89c-vh5fb\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" Apr 16 18:03:26.298050 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.297905 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c81b1e66-a5cc-48eb-b28a-83101de54fcf-dbus\") pod \"global-pull-secret-syncer-s8vvh\" (UID: \"c81b1e66-a5cc-48eb-b28a-83101de54fcf\") " pod="kube-system/global-pull-secret-syncer-s8vvh" Apr 16 18:03:26.298050 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.297916 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5b47d4c2-caca-46dd-b761-e865026dbdb9-image-registry-private-configuration\") pod \"image-registry-5d459db89c-vh5fb\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" Apr 16 18:03:26.298050 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.297962 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c81b1e66-a5cc-48eb-b28a-83101de54fcf-original-pull-secret\") pod \"global-pull-secret-syncer-s8vvh\" (UID: \"c81b1e66-a5cc-48eb-b28a-83101de54fcf\") " pod="kube-system/global-pull-secret-syncer-s8vvh" Apr 16 18:03:26.298050 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.298005 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-metrics-tls\") pod 
\"dns-default-2t59p\" (UID: \"8abd6ace-6d1c-4823-ba9a-f0f8e17c654c\") " pod="openshift-dns/dns-default-2t59p" Apr 16 18:03:26.298526 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.298154 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5b47d4c2-caca-46dd-b761-e865026dbdb9-ca-trust-extracted\") pod \"image-registry-5d459db89c-vh5fb\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" Apr 16 18:03:26.298526 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.298188 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-certificates\") pod \"image-registry-5d459db89c-vh5fb\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" Apr 16 18:03:26.298526 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.298264 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c81b1e66-a5cc-48eb-b28a-83101de54fcf-kubelet-config\") pod \"global-pull-secret-syncer-s8vvh\" (UID: \"c81b1e66-a5cc-48eb-b28a-83101de54fcf\") " pod="kube-system/global-pull-secret-syncer-s8vvh" Apr 16 18:03:26.298861 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.298820 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b47d4c2-caca-46dd-b761-e865026dbdb9-trusted-ca\") pod \"image-registry-5d459db89c-vh5fb\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" Apr 16 18:03:26.301837 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.301799 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/c81b1e66-a5cc-48eb-b28a-83101de54fcf-original-pull-secret\") pod \"global-pull-secret-syncer-s8vvh\" (UID: \"c81b1e66-a5cc-48eb-b28a-83101de54fcf\") " pod="kube-system/global-pull-secret-syncer-s8vvh" Apr 16 18:03:26.301932 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.301844 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5b47d4c2-caca-46dd-b761-e865026dbdb9-installation-pull-secrets\") pod \"image-registry-5d459db89c-vh5fb\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" Apr 16 18:03:26.301932 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.301861 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5b47d4c2-caca-46dd-b761-e865026dbdb9-image-registry-private-configuration\") pod \"image-registry-5d459db89c-vh5fb\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" Apr 16 18:03:26.308321 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.308294 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-bound-sa-token\") pod \"image-registry-5d459db89c-vh5fb\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" Apr 16 18:03:26.308415 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.308322 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b92kq\" (UniqueName: \"kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-kube-api-access-b92kq\") pod \"image-registry-5d459db89c-vh5fb\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" Apr 16 
18:03:26.398622 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.398570 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-metrics-tls\") pod \"dns-default-2t59p\" (UID: \"8abd6ace-6d1c-4823-ba9a-f0f8e17c654c\") " pod="openshift-dns/dns-default-2t59p" Apr 16 18:03:26.398795 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.398656 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjtqc\" (UniqueName: \"kubernetes.io/projected/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-kube-api-access-bjtqc\") pod \"dns-default-2t59p\" (UID: \"8abd6ace-6d1c-4823-ba9a-f0f8e17c654c\") " pod="openshift-dns/dns-default-2t59p" Apr 16 18:03:26.398795 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.398686 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m9kf\" (UniqueName: \"kubernetes.io/projected/5a9c1147-51d3-4765-a356-5a32f7bc0b67-kube-api-access-4m9kf\") pod \"ingress-canary-2htvb\" (UID: \"5a9c1147-51d3-4765-a356-5a32f7bc0b67\") " pod="openshift-ingress-canary/ingress-canary-2htvb" Apr 16 18:03:26.398795 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.398719 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-tmp-dir\") pod \"dns-default-2t59p\" (UID: \"8abd6ace-6d1c-4823-ba9a-f0f8e17c654c\") " pod="openshift-dns/dns-default-2t59p" Apr 16 18:03:26.398795 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.398742 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a9c1147-51d3-4765-a356-5a32f7bc0b67-cert\") pod \"ingress-canary-2htvb\" (UID: \"5a9c1147-51d3-4765-a356-5a32f7bc0b67\") " pod="openshift-ingress-canary/ingress-canary-2htvb" 
Apr 16 18:03:26.398795 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.398770 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-config-volume\") pod \"dns-default-2t59p\" (UID: \"8abd6ace-6d1c-4823-ba9a-f0f8e17c654c\") " pod="openshift-dns/dns-default-2t59p" Apr 16 18:03:26.399368 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:26.399161 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:03:26.399368 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:26.399240 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-metrics-tls podName:8abd6ace-6d1c-4823-ba9a-f0f8e17c654c nodeName:}" failed. No retries permitted until 2026-04-16 18:03:26.899220575 +0000 UTC m=+33.269894464 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-metrics-tls") pod "dns-default-2t59p" (UID: "8abd6ace-6d1c-4823-ba9a-f0f8e17c654c") : secret "dns-default-metrics-tls" not found Apr 16 18:03:26.399368 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.399325 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-config-volume\") pod \"dns-default-2t59p\" (UID: \"8abd6ace-6d1c-4823-ba9a-f0f8e17c654c\") " pod="openshift-dns/dns-default-2t59p" Apr 16 18:03:26.399597 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.399387 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-tmp-dir\") pod \"dns-default-2t59p\" (UID: \"8abd6ace-6d1c-4823-ba9a-f0f8e17c654c\") " pod="openshift-dns/dns-default-2t59p" Apr 16 
18:03:26.410217 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.410183 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjtqc\" (UniqueName: \"kubernetes.io/projected/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-kube-api-access-bjtqc\") pod \"dns-default-2t59p\" (UID: \"8abd6ace-6d1c-4823-ba9a-f0f8e17c654c\") " pod="openshift-dns/dns-default-2t59p" Apr 16 18:03:26.419954 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.419927 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s8vvh" Apr 16 18:03:26.500199 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.500116 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4m9kf\" (UniqueName: \"kubernetes.io/projected/5a9c1147-51d3-4765-a356-5a32f7bc0b67-kube-api-access-4m9kf\") pod \"ingress-canary-2htvb\" (UID: \"5a9c1147-51d3-4765-a356-5a32f7bc0b67\") " pod="openshift-ingress-canary/ingress-canary-2htvb" Apr 16 18:03:26.500199 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.500156 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a9c1147-51d3-4765-a356-5a32f7bc0b67-cert\") pod \"ingress-canary-2htvb\" (UID: \"5a9c1147-51d3-4765-a356-5a32f7bc0b67\") " pod="openshift-ingress-canary/ingress-canary-2htvb" Apr 16 18:03:26.500407 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:26.500248 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:03:26.500407 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:26.500298 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a9c1147-51d3-4765-a356-5a32f7bc0b67-cert podName:5a9c1147-51d3-4765-a356-5a32f7bc0b67 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:27.000284944 +0000 UTC m=+33.370958828 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a9c1147-51d3-4765-a356-5a32f7bc0b67-cert") pod "ingress-canary-2htvb" (UID: "5a9c1147-51d3-4765-a356-5a32f7bc0b67") : secret "canary-serving-cert" not found Apr 16 18:03:26.510046 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.510014 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m9kf\" (UniqueName: \"kubernetes.io/projected/5a9c1147-51d3-4765-a356-5a32f7bc0b67-kube-api-access-4m9kf\") pod \"ingress-canary-2htvb\" (UID: \"5a9c1147-51d3-4765-a356-5a32f7bc0b67\") " pod="openshift-ingress-canary/ingress-canary-2htvb" Apr 16 18:03:26.803147 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.803068 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47436224-f5c2-417d-bfdc-61bfb9b8939a-metrics-certs\") pod \"network-metrics-daemon-2xjws\" (UID: \"47436224-f5c2-417d-bfdc-61bfb9b8939a\") " pod="openshift-multus/network-metrics-daemon-2xjws" Apr 16 18:03:26.803147 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.803112 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-tls\") pod \"image-registry-5d459db89c-vh5fb\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" Apr 16 18:03:26.803353 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:26.803251 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:03:26.803353 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:26.803315 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:03:26.803353 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:26.803331 2564 
projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d459db89c-vh5fb: secret "image-registry-tls" not found Apr 16 18:03:26.803353 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:26.803334 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47436224-f5c2-417d-bfdc-61bfb9b8939a-metrics-certs podName:47436224-f5c2-417d-bfdc-61bfb9b8939a nodeName:}" failed. No retries permitted until 2026-04-16 18:03:58.803313567 +0000 UTC m=+65.173987466 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/47436224-f5c2-417d-bfdc-61bfb9b8939a-metrics-certs") pod "network-metrics-daemon-2xjws" (UID: "47436224-f5c2-417d-bfdc-61bfb9b8939a") : secret "metrics-daemon-secret" not found Apr 16 18:03:26.803510 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:26.803380 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-tls podName:5b47d4c2-caca-46dd-b761-e865026dbdb9 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:27.803363606 +0000 UTC m=+34.174037494 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-tls") pod "image-registry-5d459db89c-vh5fb" (UID: "5b47d4c2-caca-46dd-b761-e865026dbdb9") : secret "image-registry-tls" not found Apr 16 18:03:26.904177 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:26.904145 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-metrics-tls\") pod \"dns-default-2t59p\" (UID: \"8abd6ace-6d1c-4823-ba9a-f0f8e17c654c\") " pod="openshift-dns/dns-default-2t59p" Apr 16 18:03:26.904340 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:26.904253 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:03:26.904388 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:26.904337 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-metrics-tls podName:8abd6ace-6d1c-4823-ba9a-f0f8e17c654c nodeName:}" failed. No retries permitted until 2026-04-16 18:03:27.904308308 +0000 UTC m=+34.274982199 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-metrics-tls") pod "dns-default-2t59p" (UID: "8abd6ace-6d1c-4823-ba9a-f0f8e17c654c") : secret "dns-default-metrics-tls" not found Apr 16 18:03:27.005505 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:27.005403 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pk775\" (UniqueName: \"kubernetes.io/projected/fff8e593-e247-4d63-b01b-84beef483739-kube-api-access-pk775\") pod \"network-check-target-xddxf\" (UID: \"fff8e593-e247-4d63-b01b-84beef483739\") " pod="openshift-network-diagnostics/network-check-target-xddxf" Apr 16 18:03:27.005888 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:27.005529 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a9c1147-51d3-4765-a356-5a32f7bc0b67-cert\") pod \"ingress-canary-2htvb\" (UID: \"5a9c1147-51d3-4765-a356-5a32f7bc0b67\") " pod="openshift-ingress-canary/ingress-canary-2htvb" Apr 16 18:03:27.005888 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:27.005681 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:03:27.005888 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:27.005752 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a9c1147-51d3-4765-a356-5a32f7bc0b67-cert podName:5a9c1147-51d3-4765-a356-5a32f7bc0b67 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:28.005732313 +0000 UTC m=+34.376406202 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a9c1147-51d3-4765-a356-5a32f7bc0b67-cert") pod "ingress-canary-2htvb" (UID: "5a9c1147-51d3-4765-a356-5a32f7bc0b67") : secret "canary-serving-cert" not found Apr 16 18:03:27.008895 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:27.008871 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk775\" (UniqueName: \"kubernetes.io/projected/fff8e593-e247-4d63-b01b-84beef483739-kube-api-access-pk775\") pod \"network-check-target-xddxf\" (UID: \"fff8e593-e247-4d63-b01b-84beef483739\") " pod="openshift-network-diagnostics/network-check-target-xddxf" Apr 16 18:03:27.084227 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:27.084156 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-s8vvh"] Apr 16 18:03:27.088622 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:03:27.088582 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc81b1e66_a5cc_48eb_b28a_83101de54fcf.slice/crio-27a7e412ac47119ceb6f12c4c6975a51bae685a9fd02ab9e0237e8daa010b93a WatchSource:0}: Error finding container 27a7e412ac47119ceb6f12c4c6975a51bae685a9fd02ab9e0237e8daa010b93a: Status 404 returned error can't find the container with id 27a7e412ac47119ceb6f12c4c6975a51bae685a9fd02ab9e0237e8daa010b93a Apr 16 18:03:27.156202 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:27.156171 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xddxf"
Apr 16 18:03:27.403464 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:27.403413 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-s8vvh" event={"ID":"c81b1e66-a5cc-48eb-b28a-83101de54fcf","Type":"ContainerStarted","Data":"27a7e412ac47119ceb6f12c4c6975a51bae685a9fd02ab9e0237e8daa010b93a"}
Apr 16 18:03:27.404141 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:27.404117 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xddxf"]
Apr 16 18:03:27.406478 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:03:27.406450 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfff8e593_e247_4d63_b01b_84beef483739.slice/crio-0816d5d39e163e70fef9a45d4e7fb0c9f9498c3e5efd6e764c149d51bc242524 WatchSource:0}: Error finding container 0816d5d39e163e70fef9a45d4e7fb0c9f9498c3e5efd6e764c149d51bc242524: Status 404 returned error can't find the container with id 0816d5d39e163e70fef9a45d4e7fb0c9f9498c3e5efd6e764c149d51bc242524
Apr 16 18:03:27.812554 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:27.812350 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-tls\") pod \"image-registry-5d459db89c-vh5fb\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb"
Apr 16 18:03:27.812776 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:27.812525 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:03:27.812776 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:27.812632 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d459db89c-vh5fb: secret "image-registry-tls" not found
Apr 16 18:03:27.812776 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:27.812696 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-tls podName:5b47d4c2-caca-46dd-b761-e865026dbdb9 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:29.812680824 +0000 UTC m=+36.183354710 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-tls") pod "image-registry-5d459db89c-vh5fb" (UID: "5b47d4c2-caca-46dd-b761-e865026dbdb9") : secret "image-registry-tls" not found
Apr 16 18:03:27.913726 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:27.913628 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-metrics-tls\") pod \"dns-default-2t59p\" (UID: \"8abd6ace-6d1c-4823-ba9a-f0f8e17c654c\") " pod="openshift-dns/dns-default-2t59p"
Apr 16 18:03:27.913902 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:27.913766 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:03:27.913902 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:27.913835 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-metrics-tls podName:8abd6ace-6d1c-4823-ba9a-f0f8e17c654c nodeName:}" failed. No retries permitted until 2026-04-16 18:03:29.913815073 +0000 UTC m=+36.284488958 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-metrics-tls") pod "dns-default-2t59p" (UID: "8abd6ace-6d1c-4823-ba9a-f0f8e17c654c") : secret "dns-default-metrics-tls" not found
Apr 16 18:03:28.014950 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:28.014909 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a9c1147-51d3-4765-a356-5a32f7bc0b67-cert\") pod \"ingress-canary-2htvb\" (UID: \"5a9c1147-51d3-4765-a356-5a32f7bc0b67\") " pod="openshift-ingress-canary/ingress-canary-2htvb"
Apr 16 18:03:28.015391 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:28.015084 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:03:28.015391 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:28.015161 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a9c1147-51d3-4765-a356-5a32f7bc0b67-cert podName:5a9c1147-51d3-4765-a356-5a32f7bc0b67 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:30.015140673 +0000 UTC m=+36.385814564 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a9c1147-51d3-4765-a356-5a32f7bc0b67-cert") pod "ingress-canary-2htvb" (UID: "5a9c1147-51d3-4765-a356-5a32f7bc0b67") : secret "canary-serving-cert" not found
Apr 16 18:03:28.408655 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:28.408602 2564 generic.go:358] "Generic (PLEG): container finished" podID="4f6f788f-55c3-4a55-8674-eb821c1fc81e" containerID="c94dc49c8c3b7e7b29305f09a639f8bc51e49869c7f7cd1b1e67f2fcdd2ae64d" exitCode=0
Apr 16 18:03:28.408831 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:28.408660 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lf54m" event={"ID":"4f6f788f-55c3-4a55-8674-eb821c1fc81e","Type":"ContainerDied","Data":"c94dc49c8c3b7e7b29305f09a639f8bc51e49869c7f7cd1b1e67f2fcdd2ae64d"}
Apr 16 18:03:28.410361 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:28.410333 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xddxf" event={"ID":"fff8e593-e247-4d63-b01b-84beef483739","Type":"ContainerStarted","Data":"0816d5d39e163e70fef9a45d4e7fb0c9f9498c3e5efd6e764c149d51bc242524"}
Apr 16 18:03:28.789435 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:28.789363 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-pffzm"]
Apr 16 18:03:28.810301 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:28.810072 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-pffzm"]
Apr 16 18:03:28.810301 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:28.810204 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-pffzm"
Apr 16 18:03:28.813011 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:28.812598 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 16 18:03:28.813011 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:28.812734 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 16 18:03:28.813011 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:28.812742 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-tm7vs\""
Apr 16 18:03:28.923051 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:28.922872 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4aea7302-70a3-484c-bdf8-4a745df33639-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-pffzm\" (UID: \"4aea7302-70a3-484c-bdf8-4a745df33639\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-pffzm"
Apr 16 18:03:28.923051 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:28.922925 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4aea7302-70a3-484c-bdf8-4a745df33639-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-pffzm\" (UID: \"4aea7302-70a3-484c-bdf8-4a745df33639\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-pffzm"
Apr 16 18:03:29.023794 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:29.023759 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4aea7302-70a3-484c-bdf8-4a745df33639-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-pffzm\" (UID: \"4aea7302-70a3-484c-bdf8-4a745df33639\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-pffzm"
Apr 16 18:03:29.023794 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:29.023794 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4aea7302-70a3-484c-bdf8-4a745df33639-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-pffzm\" (UID: \"4aea7302-70a3-484c-bdf8-4a745df33639\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-pffzm"
Apr 16 18:03:29.024347 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:29.023936 2564 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 18:03:29.024347 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:29.024012 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aea7302-70a3-484c-bdf8-4a745df33639-networking-console-plugin-cert podName:4aea7302-70a3-484c-bdf8-4a745df33639 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:29.52399585 +0000 UTC m=+35.894669739 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4aea7302-70a3-484c-bdf8-4a745df33639-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-pffzm" (UID: "4aea7302-70a3-484c-bdf8-4a745df33639") : secret "networking-console-plugin-cert" not found
Apr 16 18:03:29.024521 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:29.024498 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4aea7302-70a3-484c-bdf8-4a745df33639-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-pffzm\" (UID: \"4aea7302-70a3-484c-bdf8-4a745df33639\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-pffzm"
Apr 16 18:03:29.415632 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:29.415588 2564 generic.go:358] "Generic (PLEG): container finished" podID="4f6f788f-55c3-4a55-8674-eb821c1fc81e" containerID="86308f70fbb503758f4fcb70b5123db1f29d95871aaa65d724fba5588c6a0233" exitCode=0
Apr 16 18:03:29.415789 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:29.415650 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lf54m" event={"ID":"4f6f788f-55c3-4a55-8674-eb821c1fc81e","Type":"ContainerDied","Data":"86308f70fbb503758f4fcb70b5123db1f29d95871aaa65d724fba5588c6a0233"}
Apr 16 18:03:29.528499 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:29.528457 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4aea7302-70a3-484c-bdf8-4a745df33639-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-pffzm\" (UID: \"4aea7302-70a3-484c-bdf8-4a745df33639\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-pffzm"
Apr 16 18:03:29.528687 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:29.528623 2564 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 18:03:29.528764 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:29.528694 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aea7302-70a3-484c-bdf8-4a745df33639-networking-console-plugin-cert podName:4aea7302-70a3-484c-bdf8-4a745df33639 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:30.528673477 +0000 UTC m=+36.899347361 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4aea7302-70a3-484c-bdf8-4a745df33639-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-pffzm" (UID: "4aea7302-70a3-484c-bdf8-4a745df33639") : secret "networking-console-plugin-cert" not found
Apr 16 18:03:29.831088 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:29.831050 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-tls\") pod \"image-registry-5d459db89c-vh5fb\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb"
Apr 16 18:03:29.831249 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:29.831202 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:03:29.831249 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:29.831224 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d459db89c-vh5fb: secret "image-registry-tls" not found
Apr 16 18:03:29.831364 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:29.831294 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-tls podName:5b47d4c2-caca-46dd-b761-e865026dbdb9 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:33.831271696 +0000 UTC m=+40.201945589 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-tls") pod "image-registry-5d459db89c-vh5fb" (UID: "5b47d4c2-caca-46dd-b761-e865026dbdb9") : secret "image-registry-tls" not found
Apr 16 18:03:29.931945 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:29.931906 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-metrics-tls\") pod \"dns-default-2t59p\" (UID: \"8abd6ace-6d1c-4823-ba9a-f0f8e17c654c\") " pod="openshift-dns/dns-default-2t59p"
Apr 16 18:03:29.932108 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:29.932074 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:03:29.932178 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:29.932166 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-metrics-tls podName:8abd6ace-6d1c-4823-ba9a-f0f8e17c654c nodeName:}" failed. No retries permitted until 2026-04-16 18:03:33.932146924 +0000 UTC m=+40.302820810 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-metrics-tls") pod "dns-default-2t59p" (UID: "8abd6ace-6d1c-4823-ba9a-f0f8e17c654c") : secret "dns-default-metrics-tls" not found
Apr 16 18:03:30.033312 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:30.033264 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a9c1147-51d3-4765-a356-5a32f7bc0b67-cert\") pod \"ingress-canary-2htvb\" (UID: \"5a9c1147-51d3-4765-a356-5a32f7bc0b67\") " pod="openshift-ingress-canary/ingress-canary-2htvb"
Apr 16 18:03:30.033857 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:30.033438 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:03:30.033857 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:30.033519 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a9c1147-51d3-4765-a356-5a32f7bc0b67-cert podName:5a9c1147-51d3-4765-a356-5a32f7bc0b67 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:34.033498638 +0000 UTC m=+40.404172596 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a9c1147-51d3-4765-a356-5a32f7bc0b67-cert") pod "ingress-canary-2htvb" (UID: "5a9c1147-51d3-4765-a356-5a32f7bc0b67") : secret "canary-serving-cert" not found
Apr 16 18:03:30.537130 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:30.537087 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4aea7302-70a3-484c-bdf8-4a745df33639-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-pffzm\" (UID: \"4aea7302-70a3-484c-bdf8-4a745df33639\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-pffzm"
Apr 16 18:03:30.537303 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:30.537261 2564 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 18:03:30.537349 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:30.537342 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aea7302-70a3-484c-bdf8-4a745df33639-networking-console-plugin-cert podName:4aea7302-70a3-484c-bdf8-4a745df33639 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:32.537320898 +0000 UTC m=+38.907994788 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4aea7302-70a3-484c-bdf8-4a745df33639-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-pffzm" (UID: "4aea7302-70a3-484c-bdf8-4a745df33639") : secret "networking-console-plugin-cert" not found
Apr 16 18:03:32.424438 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:32.424413 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lf54m" event={"ID":"4f6f788f-55c3-4a55-8674-eb821c1fc81e","Type":"ContainerStarted","Data":"2ffa919badb840761942ae4ca756ca8cd533e9f9fb8d22237c5679627035fbf8"}
Apr 16 18:03:32.425757 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:32.425735 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xddxf" event={"ID":"fff8e593-e247-4d63-b01b-84beef483739","Type":"ContainerStarted","Data":"08474d867fdafe002efedf4d2f76065b06db68d14ec04606167037277f2d9077"}
Apr 16 18:03:32.552814 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:32.552724 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4aea7302-70a3-484c-bdf8-4a745df33639-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-pffzm\" (UID: \"4aea7302-70a3-484c-bdf8-4a745df33639\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-pffzm"
Apr 16 18:03:32.552955 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:32.552877 2564 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 18:03:32.552955 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:32.552943 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aea7302-70a3-484c-bdf8-4a745df33639-networking-console-plugin-cert podName:4aea7302-70a3-484c-bdf8-4a745df33639 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:36.552927314 +0000 UTC m=+42.923601198 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4aea7302-70a3-484c-bdf8-4a745df33639-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-pffzm" (UID: "4aea7302-70a3-484c-bdf8-4a745df33639") : secret "networking-console-plugin-cert" not found
Apr 16 18:03:33.428971 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:33.428933 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-s8vvh" event={"ID":"c81b1e66-a5cc-48eb-b28a-83101de54fcf","Type":"ContainerStarted","Data":"6ade6e5a0e38d344ec70d64675b5d1c9b7be7806bb8efd3a4fca9fc205236f3c"}
Apr 16 18:03:33.462670 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:33.462600 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-xddxf" podStartSLOduration=34.598029827 podStartE2EDuration="39.462586547s" podCreationTimestamp="2026-04-16 18:02:54 +0000 UTC" firstStartedPulling="2026-04-16 18:03:27.40889218 +0000 UTC m=+33.779566082" lastFinishedPulling="2026-04-16 18:03:32.273448915 +0000 UTC m=+38.644122802" observedRunningTime="2026-04-16 18:03:33.46213004 +0000 UTC m=+39.832803949" watchObservedRunningTime="2026-04-16 18:03:33.462586547 +0000 UTC m=+39.833260454"
Apr 16 18:03:33.463113 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:33.463083 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-s8vvh" podStartSLOduration=2.268533088 podStartE2EDuration="7.463075245s" podCreationTimestamp="2026-04-16 18:03:26 +0000 UTC" firstStartedPulling="2026-04-16 18:03:27.090188539 +0000 UTC m=+33.460862423" lastFinishedPulling="2026-04-16 18:03:32.284730694 +0000 UTC m=+38.655404580" observedRunningTime="2026-04-16 18:03:33.445940223 +0000 UTC m=+39.816614131" watchObservedRunningTime="2026-04-16 18:03:33.463075245 +0000 UTC m=+39.833749152"
Apr 16 18:03:33.490902 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:33.490842 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-lf54m" podStartSLOduration=7.570295786 podStartE2EDuration="39.4908251s" podCreationTimestamp="2026-04-16 18:02:54 +0000 UTC" firstStartedPulling="2026-04-16 18:02:55.38505949 +0000 UTC m=+1.755733388" lastFinishedPulling="2026-04-16 18:03:27.305588813 +0000 UTC m=+33.676262702" observedRunningTime="2026-04-16 18:03:33.48911302 +0000 UTC m=+39.859786931" watchObservedRunningTime="2026-04-16 18:03:33.4908251 +0000 UTC m=+39.861499007"
Apr 16 18:03:33.863587 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:33.863546 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-tls\") pod \"image-registry-5d459db89c-vh5fb\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb"
Apr 16 18:03:33.863775 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:33.863703 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:03:33.863775 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:33.863721 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5d459db89c-vh5fb: secret "image-registry-tls" not found
Apr 16 18:03:33.863853 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:33.863781 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-tls podName:5b47d4c2-caca-46dd-b761-e865026dbdb9 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:41.863765853 +0000 UTC m=+48.234439738 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-tls") pod "image-registry-5d459db89c-vh5fb" (UID: "5b47d4c2-caca-46dd-b761-e865026dbdb9") : secret "image-registry-tls" not found
Apr 16 18:03:33.964687 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:33.964651 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-metrics-tls\") pod \"dns-default-2t59p\" (UID: \"8abd6ace-6d1c-4823-ba9a-f0f8e17c654c\") " pod="openshift-dns/dns-default-2t59p"
Apr 16 18:03:33.964820 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:33.964792 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:03:33.964858 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:33.964852 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-metrics-tls podName:8abd6ace-6d1c-4823-ba9a-f0f8e17c654c nodeName:}" failed. No retries permitted until 2026-04-16 18:03:41.964836142 +0000 UTC m=+48.335510027 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-metrics-tls") pod "dns-default-2t59p" (UID: "8abd6ace-6d1c-4823-ba9a-f0f8e17c654c") : secret "dns-default-metrics-tls" not found
Apr 16 18:03:34.065762 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:34.065725 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a9c1147-51d3-4765-a356-5a32f7bc0b67-cert\") pod \"ingress-canary-2htvb\" (UID: \"5a9c1147-51d3-4765-a356-5a32f7bc0b67\") " pod="openshift-ingress-canary/ingress-canary-2htvb"
Apr 16 18:03:34.065898 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:34.065859 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:03:34.065939 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:34.065918 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a9c1147-51d3-4765-a356-5a32f7bc0b67-cert podName:5a9c1147-51d3-4765-a356-5a32f7bc0b67 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:42.065902615 +0000 UTC m=+48.436576500 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a9c1147-51d3-4765-a356-5a32f7bc0b67-cert") pod "ingress-canary-2htvb" (UID: "5a9c1147-51d3-4765-a356-5a32f7bc0b67") : secret "canary-serving-cert" not found
Apr 16 18:03:36.581208 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:36.581172 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4aea7302-70a3-484c-bdf8-4a745df33639-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-pffzm\" (UID: \"4aea7302-70a3-484c-bdf8-4a745df33639\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-pffzm"
Apr 16 18:03:36.581575 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:36.581313 2564 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 18:03:36.581575 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:36.581378 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aea7302-70a3-484c-bdf8-4a745df33639-networking-console-plugin-cert podName:4aea7302-70a3-484c-bdf8-4a745df33639 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:44.581363163 +0000 UTC m=+50.952037053 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4aea7302-70a3-484c-bdf8-4a745df33639-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-pffzm" (UID: "4aea7302-70a3-484c-bdf8-4a745df33639") : secret "networking-console-plugin-cert" not found
Apr 16 18:03:39.088258 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:39.088218 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-t72vw"]
Apr 16 18:03:39.137995 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:39.137963 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-t72vw"]
Apr 16 18:03:39.138140 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:39.138082 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-t72vw"
Apr 16 18:03:39.142392 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:39.142343 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-kpbgw\""
Apr 16 18:03:39.142595 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:39.142582 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 16 18:03:39.143384 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:39.143365 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 16 18:03:39.301031 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:39.300991 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7r7c\" (UniqueName: \"kubernetes.io/projected/7b731140-2291-4c78-96fd-b58bafe82dc6-kube-api-access-j7r7c\") pod \"migrator-64d4d94569-t72vw\" (UID: \"7b731140-2291-4c78-96fd-b58bafe82dc6\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-t72vw"
Apr 16 18:03:39.401491 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:39.401401 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7r7c\" (UniqueName: \"kubernetes.io/projected/7b731140-2291-4c78-96fd-b58bafe82dc6-kube-api-access-j7r7c\") pod \"migrator-64d4d94569-t72vw\" (UID: \"7b731140-2291-4c78-96fd-b58bafe82dc6\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-t72vw"
Apr 16 18:03:39.411348 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:39.411312 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7r7c\" (UniqueName: \"kubernetes.io/projected/7b731140-2291-4c78-96fd-b58bafe82dc6-kube-api-access-j7r7c\") pod \"migrator-64d4d94569-t72vw\" (UID: \"7b731140-2291-4c78-96fd-b58bafe82dc6\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-t72vw"
Apr 16 18:03:39.446684 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:39.446654 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-t72vw"
Apr 16 18:03:39.559808 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:39.559778 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-t72vw"]
Apr 16 18:03:39.560640 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:39.560529 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lk9pc_ee7d5ac7-7dde-40a2-97ed-71acf303ea5d/dns-node-resolver/0.log"
Apr 16 18:03:39.572395 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:03:39.572361 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b731140_2291_4c78_96fd_b58bafe82dc6.slice/crio-d058f9608f98196d3ddf655b1bc3a0c5834a33b0d6466abbcf140703a4966486 WatchSource:0}: Error finding container d058f9608f98196d3ddf655b1bc3a0c5834a33b0d6466abbcf140703a4966486: Status 404 returned error can't find the container with id d058f9608f98196d3ddf655b1bc3a0c5834a33b0d6466abbcf140703a4966486
Apr 16 18:03:39.754129 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:39.754097 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-wgb6h"]
Apr 16 18:03:39.801137 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:39.801109 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-wgb6h"]
Apr 16 18:03:39.801296 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:39.801232 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-wgb6h"
Apr 16 18:03:39.804243 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:39.804221 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 16 18:03:39.804390 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:39.804282 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-nb6c2\""
Apr 16 18:03:39.804579 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:39.804561 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 16 18:03:39.805417 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:39.805399 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 16 18:03:39.805502 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:39.805461 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 16 18:03:39.904117 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:39.904079 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/660169ed-81f1-4fe6-bec7-728c78cf4f99-signing-key\") pod \"service-ca-bfc587fb7-wgb6h\" (UID: \"660169ed-81f1-4fe6-bec7-728c78cf4f99\") " pod="openshift-service-ca/service-ca-bfc587fb7-wgb6h"
Apr 16 18:03:39.904117 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:39.904118 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wtpn\" (UniqueName: \"kubernetes.io/projected/660169ed-81f1-4fe6-bec7-728c78cf4f99-kube-api-access-9wtpn\") pod \"service-ca-bfc587fb7-wgb6h\" (UID: \"660169ed-81f1-4fe6-bec7-728c78cf4f99\") " pod="openshift-service-ca/service-ca-bfc587fb7-wgb6h"
Apr 16 18:03:39.904333 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:39.904154 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/660169ed-81f1-4fe6-bec7-728c78cf4f99-signing-cabundle\") pod \"service-ca-bfc587fb7-wgb6h\" (UID: \"660169ed-81f1-4fe6-bec7-728c78cf4f99\") " pod="openshift-service-ca/service-ca-bfc587fb7-wgb6h"
Apr 16 18:03:40.005220 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:40.005137 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/660169ed-81f1-4fe6-bec7-728c78cf4f99-signing-cabundle\") pod \"service-ca-bfc587fb7-wgb6h\" (UID: \"660169ed-81f1-4fe6-bec7-728c78cf4f99\") " pod="openshift-service-ca/service-ca-bfc587fb7-wgb6h"
Apr 16 18:03:40.005366 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:40.005258 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/660169ed-81f1-4fe6-bec7-728c78cf4f99-signing-key\") pod \"service-ca-bfc587fb7-wgb6h\" (UID: \"660169ed-81f1-4fe6-bec7-728c78cf4f99\") " pod="openshift-service-ca/service-ca-bfc587fb7-wgb6h"
Apr 16 18:03:40.005366 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:40.005283 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wtpn\" (UniqueName: \"kubernetes.io/projected/660169ed-81f1-4fe6-bec7-728c78cf4f99-kube-api-access-9wtpn\") pod \"service-ca-bfc587fb7-wgb6h\" (UID: \"660169ed-81f1-4fe6-bec7-728c78cf4f99\") " pod="openshift-service-ca/service-ca-bfc587fb7-wgb6h"
Apr 16 18:03:40.005899 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:40.005880 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/660169ed-81f1-4fe6-bec7-728c78cf4f99-signing-cabundle\") pod \"service-ca-bfc587fb7-wgb6h\" (UID: \"660169ed-81f1-4fe6-bec7-728c78cf4f99\") " pod="openshift-service-ca/service-ca-bfc587fb7-wgb6h"
Apr 16 18:03:40.007648 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:40.007630 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/660169ed-81f1-4fe6-bec7-728c78cf4f99-signing-key\") pod \"service-ca-bfc587fb7-wgb6h\" (UID: \"660169ed-81f1-4fe6-bec7-728c78cf4f99\") " pod="openshift-service-ca/service-ca-bfc587fb7-wgb6h"
Apr 16 18:03:40.013392 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:40.013368 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wtpn\" (UniqueName: \"kubernetes.io/projected/660169ed-81f1-4fe6-bec7-728c78cf4f99-kube-api-access-9wtpn\") pod \"service-ca-bfc587fb7-wgb6h\" (UID: \"660169ed-81f1-4fe6-bec7-728c78cf4f99\") " pod="openshift-service-ca/service-ca-bfc587fb7-wgb6h"
Apr 16 18:03:40.110015 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:40.109983 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-wgb6h"
Apr 16 18:03:40.227810 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:40.227780 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-wgb6h"]
Apr 16 18:03:40.230830 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:03:40.230803 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod660169ed_81f1_4fe6_bec7_728c78cf4f99.slice/crio-affc9f805b48eb8996419a17293d3187a077b0685c5cf0e3bcd9c1425f35421b WatchSource:0}: Error finding container affc9f805b48eb8996419a17293d3187a077b0685c5cf0e3bcd9c1425f35421b: Status 404 returned error can't find the container with id affc9f805b48eb8996419a17293d3187a077b0685c5cf0e3bcd9c1425f35421b
Apr 16 18:03:40.442670 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:40.442630 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-t72vw" event={"ID":"7b731140-2291-4c78-96fd-b58bafe82dc6","Type":"ContainerStarted","Data":"d058f9608f98196d3ddf655b1bc3a0c5834a33b0d6466abbcf140703a4966486"}
Apr 16 18:03:40.443858 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:40.443822 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-wgb6h" event={"ID":"660169ed-81f1-4fe6-bec7-728c78cf4f99","Type":"ContainerStarted","Data":"affc9f805b48eb8996419a17293d3187a077b0685c5cf0e3bcd9c1425f35421b"}
Apr 16 18:03:40.758926 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:40.758853 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-q8z42_ab0cc5c6-ec31-4ed7-a4fa-9968260fc075/node-ca/0.log"
Apr 16 18:03:41.447071 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:41.447042 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-t72vw"
event={"ID":"7b731140-2291-4c78-96fd-b58bafe82dc6","Type":"ContainerStarted","Data":"406b6b4dae03ecec221fe424096c2465d8de3164d1b926dbfd4e1d0812e28c77"} Apr 16 18:03:41.447071 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:41.447079 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-t72vw" event={"ID":"7b731140-2291-4c78-96fd-b58bafe82dc6","Type":"ContainerStarted","Data":"c9453ddb0f82f52308f11d72c033afeee0c0844fc16c1fdd05468bc6b9799577"} Apr 16 18:03:41.464732 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:41.464687 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-t72vw" podStartSLOduration=0.819110902 podStartE2EDuration="2.464674532s" podCreationTimestamp="2026-04-16 18:03:39 +0000 UTC" firstStartedPulling="2026-04-16 18:03:39.574220869 +0000 UTC m=+45.944894753" lastFinishedPulling="2026-04-16 18:03:41.219784488 +0000 UTC m=+47.590458383" observedRunningTime="2026-04-16 18:03:41.463854717 +0000 UTC m=+47.834528624" watchObservedRunningTime="2026-04-16 18:03:41.464674532 +0000 UTC m=+47.835348439" Apr 16 18:03:41.920697 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:41.920653 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-tls\") pod \"image-registry-5d459db89c-vh5fb\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" Apr 16 18:03:41.920903 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:41.920831 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:03:41.920903 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:41.920854 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-5d459db89c-vh5fb: secret "image-registry-tls" not found Apr 16 18:03:41.921025 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:41.920926 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-tls podName:5b47d4c2-caca-46dd-b761-e865026dbdb9 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:57.920904112 +0000 UTC m=+64.291578020 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-tls") pod "image-registry-5d459db89c-vh5fb" (UID: "5b47d4c2-caca-46dd-b761-e865026dbdb9") : secret "image-registry-tls" not found Apr 16 18:03:42.021560 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:42.021519 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-metrics-tls\") pod \"dns-default-2t59p\" (UID: \"8abd6ace-6d1c-4823-ba9a-f0f8e17c654c\") " pod="openshift-dns/dns-default-2t59p" Apr 16 18:03:42.021724 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:42.021678 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:03:42.021766 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:42.021739 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-metrics-tls podName:8abd6ace-6d1c-4823-ba9a-f0f8e17c654c nodeName:}" failed. No retries permitted until 2026-04-16 18:03:58.021722686 +0000 UTC m=+64.392396591 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-metrics-tls") pod "dns-default-2t59p" (UID: "8abd6ace-6d1c-4823-ba9a-f0f8e17c654c") : secret "dns-default-metrics-tls" not found Apr 16 18:03:42.122646 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:42.122598 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a9c1147-51d3-4765-a356-5a32f7bc0b67-cert\") pod \"ingress-canary-2htvb\" (UID: \"5a9c1147-51d3-4765-a356-5a32f7bc0b67\") " pod="openshift-ingress-canary/ingress-canary-2htvb" Apr 16 18:03:42.122807 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:42.122746 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:03:42.122850 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:42.122811 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a9c1147-51d3-4765-a356-5a32f7bc0b67-cert podName:5a9c1147-51d3-4765-a356-5a32f7bc0b67 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:58.122794635 +0000 UTC m=+64.493468540 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a9c1147-51d3-4765-a356-5a32f7bc0b67-cert") pod "ingress-canary-2htvb" (UID: "5a9c1147-51d3-4765-a356-5a32f7bc0b67") : secret "canary-serving-cert" not found Apr 16 18:03:43.429194 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:43.429161 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-xddxf" Apr 16 18:03:43.452115 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:43.452082 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-wgb6h" event={"ID":"660169ed-81f1-4fe6-bec7-728c78cf4f99","Type":"ContainerStarted","Data":"4eac27b78767883242748782dacad9e58c400baae4b6ec3bc52327aa6ff751f2"} Apr 16 18:03:43.470381 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:43.470338 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-wgb6h" podStartSLOduration=2.350688965 podStartE2EDuration="4.470324429s" podCreationTimestamp="2026-04-16 18:03:39 +0000 UTC" firstStartedPulling="2026-04-16 18:03:40.232672881 +0000 UTC m=+46.603346769" lastFinishedPulling="2026-04-16 18:03:42.352308344 +0000 UTC m=+48.722982233" observedRunningTime="2026-04-16 18:03:43.469822179 +0000 UTC m=+49.840496087" watchObservedRunningTime="2026-04-16 18:03:43.470324429 +0000 UTC m=+49.840998381" Apr 16 18:03:44.642252 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:44.642208 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4aea7302-70a3-484c-bdf8-4a745df33639-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-pffzm\" (UID: \"4aea7302-70a3-484c-bdf8-4a745df33639\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-pffzm" Apr 16 18:03:44.642817 ip-10-0-131-193 kubenswrapper[2564]: 
E0416 18:03:44.642405 2564 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:03:44.642817 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:03:44.642483 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aea7302-70a3-484c-bdf8-4a745df33639-networking-console-plugin-cert podName:4aea7302-70a3-484c-bdf8-4a745df33639 nodeName:}" failed. No retries permitted until 2026-04-16 18:04:00.642462658 +0000 UTC m=+67.013136549 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4aea7302-70a3-484c-bdf8-4a745df33639-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-pffzm" (UID: "4aea7302-70a3-484c-bdf8-4a745df33639") : secret "networking-console-plugin-cert" not found Apr 16 18:03:52.398970 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:52.398942 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gxh6d" Apr 16 18:03:57.942052 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:57.942014 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-tls\") pod \"image-registry-5d459db89c-vh5fb\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" Apr 16 18:03:57.944372 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:57.944341 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-tls\") pod \"image-registry-5d459db89c-vh5fb\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" Apr 16 18:03:58.042989 ip-10-0-131-193 
kubenswrapper[2564]: I0416 18:03:58.042952 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-metrics-tls\") pod \"dns-default-2t59p\" (UID: \"8abd6ace-6d1c-4823-ba9a-f0f8e17c654c\") " pod="openshift-dns/dns-default-2t59p" Apr 16 18:03:58.055545 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:58.055517 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8abd6ace-6d1c-4823-ba9a-f0f8e17c654c-metrics-tls\") pod \"dns-default-2t59p\" (UID: \"8abd6ace-6d1c-4823-ba9a-f0f8e17c654c\") " pod="openshift-dns/dns-default-2t59p" Apr 16 18:03:58.143693 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:58.143660 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a9c1147-51d3-4765-a356-5a32f7bc0b67-cert\") pod \"ingress-canary-2htvb\" (UID: \"5a9c1147-51d3-4765-a356-5a32f7bc0b67\") " pod="openshift-ingress-canary/ingress-canary-2htvb" Apr 16 18:03:58.145933 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:58.145913 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a9c1147-51d3-4765-a356-5a32f7bc0b67-cert\") pod \"ingress-canary-2htvb\" (UID: \"5a9c1147-51d3-4765-a356-5a32f7bc0b67\") " pod="openshift-ingress-canary/ingress-canary-2htvb" Apr 16 18:03:58.195148 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:58.195075 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-dnqxl\"" Apr 16 18:03:58.203412 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:58.203394 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" Apr 16 18:03:58.320755 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:58.320730 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zzqks\"" Apr 16 18:03:58.324461 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:58.324438 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5d459db89c-vh5fb"] Apr 16 18:03:58.327773 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:03:58.327747 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b47d4c2_caca_46dd_b761_e865026dbdb9.slice/crio-f0d87a45cba9d0b73a37d4b575c800e359c98f4cf68b363743f5520733e8b4ee WatchSource:0}: Error finding container f0d87a45cba9d0b73a37d4b575c800e359c98f4cf68b363743f5520733e8b4ee: Status 404 returned error can't find the container with id f0d87a45cba9d0b73a37d4b575c800e359c98f4cf68b363743f5520733e8b4ee Apr 16 18:03:58.328264 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:58.328236 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2t59p" Apr 16 18:03:58.351668 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:58.351642 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fkw8n\"" Apr 16 18:03:58.359077 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:58.359052 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2htvb" Apr 16 18:03:58.462653 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:58.462587 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2t59p"] Apr 16 18:03:58.466785 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:03:58.466750 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8abd6ace_6d1c_4823_ba9a_f0f8e17c654c.slice/crio-591da64d5d378835e6b12d58d05a8aa23bb570b62665403aa5af7382f32a8d6f WatchSource:0}: Error finding container 591da64d5d378835e6b12d58d05a8aa23bb570b62665403aa5af7382f32a8d6f: Status 404 returned error can't find the container with id 591da64d5d378835e6b12d58d05a8aa23bb570b62665403aa5af7382f32a8d6f Apr 16 18:03:58.489486 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:58.489461 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2htvb"] Apr 16 18:03:58.489793 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:58.489698 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" event={"ID":"5b47d4c2-caca-46dd-b761-e865026dbdb9","Type":"ContainerStarted","Data":"87aeb85876ec9bfbc27f61981b6a36379cd44f07869d0f7f6a969bd4aa76d8d2"} Apr 16 18:03:58.489793 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:58.489731 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" event={"ID":"5b47d4c2-caca-46dd-b761-e865026dbdb9","Type":"ContainerStarted","Data":"f0d87a45cba9d0b73a37d4b575c800e359c98f4cf68b363743f5520733e8b4ee"} Apr 16 18:03:58.490672 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:58.490650 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2t59p" 
event={"ID":"8abd6ace-6d1c-4823-ba9a-f0f8e17c654c","Type":"ContainerStarted","Data":"591da64d5d378835e6b12d58d05a8aa23bb570b62665403aa5af7382f32a8d6f"} Apr 16 18:03:58.492330 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:03:58.492309 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a9c1147_51d3_4765_a356_5a32f7bc0b67.slice/crio-f296bb8341a39fe1777a24c00200151968b84ab20c43d3bd1826c453972865cc WatchSource:0}: Error finding container f296bb8341a39fe1777a24c00200151968b84ab20c43d3bd1826c453972865cc: Status 404 returned error can't find the container with id f296bb8341a39fe1777a24c00200151968b84ab20c43d3bd1826c453972865cc Apr 16 18:03:58.850215 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:58.850140 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47436224-f5c2-417d-bfdc-61bfb9b8939a-metrics-certs\") pod \"network-metrics-daemon-2xjws\" (UID: \"47436224-f5c2-417d-bfdc-61bfb9b8939a\") " pod="openshift-multus/network-metrics-daemon-2xjws" Apr 16 18:03:58.852582 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:58.852556 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47436224-f5c2-417d-bfdc-61bfb9b8939a-metrics-certs\") pod \"network-metrics-daemon-2xjws\" (UID: \"47436224-f5c2-417d-bfdc-61bfb9b8939a\") " pod="openshift-multus/network-metrics-daemon-2xjws" Apr 16 18:03:58.964711 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:58.964682 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9qtxn\"" Apr 16 18:03:58.972857 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:58.972828 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2xjws" Apr 16 18:03:59.118020 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:59.117941 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2xjws"] Apr 16 18:03:59.122764 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:03:59.122732 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47436224_f5c2_417d_bfdc_61bfb9b8939a.slice/crio-780264cfe8e8075e1bfa42e64e08e35706ce7a5164552bb9fef683a3d50d40a1 WatchSource:0}: Error finding container 780264cfe8e8075e1bfa42e64e08e35706ce7a5164552bb9fef683a3d50d40a1: Status 404 returned error can't find the container with id 780264cfe8e8075e1bfa42e64e08e35706ce7a5164552bb9fef683a3d50d40a1 Apr 16 18:03:59.496208 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:59.496168 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2htvb" event={"ID":"5a9c1147-51d3-4765-a356-5a32f7bc0b67","Type":"ContainerStarted","Data":"f296bb8341a39fe1777a24c00200151968b84ab20c43d3bd1826c453972865cc"} Apr 16 18:03:59.497273 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:59.497243 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2xjws" event={"ID":"47436224-f5c2-417d-bfdc-61bfb9b8939a","Type":"ContainerStarted","Data":"780264cfe8e8075e1bfa42e64e08e35706ce7a5164552bb9fef683a3d50d40a1"} Apr 16 18:03:59.497495 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:59.497408 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" Apr 16 18:03:59.523252 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:03:59.523198 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" podStartSLOduration=61.523179863 
podStartE2EDuration="1m1.523179863s" podCreationTimestamp="2026-04-16 18:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:03:59.52184396 +0000 UTC m=+65.892517868" watchObservedRunningTime="2026-04-16 18:03:59.523179863 +0000 UTC m=+65.893853771" Apr 16 18:04:00.667238 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:00.667190 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4aea7302-70a3-484c-bdf8-4a745df33639-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-pffzm\" (UID: \"4aea7302-70a3-484c-bdf8-4a745df33639\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-pffzm" Apr 16 18:04:00.669549 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:00.669523 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4aea7302-70a3-484c-bdf8-4a745df33639-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-pffzm\" (UID: \"4aea7302-70a3-484c-bdf8-4a745df33639\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-pffzm" Apr 16 18:04:00.925014 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:00.924936 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-tm7vs\"" Apr 16 18:04:00.932460 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:00.932425 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-pffzm" Apr 16 18:04:01.060693 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.060662 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-wlbw7"] Apr 16 18:04:01.096637 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.096593 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wlbw7"] Apr 16 18:04:01.096779 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.096674 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wlbw7" Apr 16 18:04:01.101881 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.100714 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 18:04:01.101881 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.100767 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 18:04:01.101881 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.101802 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-nw878\"" Apr 16 18:04:01.102095 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.102049 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 18:04:01.103717 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.102296 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 18:04:01.114691 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.114670 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5d459db89c-vh5fb"] Apr 16 
18:04:01.160456 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.160424 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h49fn"] Apr 16 18:04:01.169846 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.169816 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/83dbfa60-19ba-415c-9df9-d865b7b18be8-crio-socket\") pod \"insights-runtime-extractor-wlbw7\" (UID: \"83dbfa60-19ba-415c-9df9-d865b7b18be8\") " pod="openshift-insights/insights-runtime-extractor-wlbw7" Apr 16 18:04:01.169963 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.169859 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/83dbfa60-19ba-415c-9df9-d865b7b18be8-data-volume\") pod \"insights-runtime-extractor-wlbw7\" (UID: \"83dbfa60-19ba-415c-9df9-d865b7b18be8\") " pod="openshift-insights/insights-runtime-extractor-wlbw7" Apr 16 18:04:01.169963 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.169884 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/83dbfa60-19ba-415c-9df9-d865b7b18be8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wlbw7\" (UID: \"83dbfa60-19ba-415c-9df9-d865b7b18be8\") " pod="openshift-insights/insights-runtime-extractor-wlbw7" Apr 16 18:04:01.169963 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.169911 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/83dbfa60-19ba-415c-9df9-d865b7b18be8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wlbw7\" (UID: \"83dbfa60-19ba-415c-9df9-d865b7b18be8\") " 
pod="openshift-insights/insights-runtime-extractor-wlbw7" Apr 16 18:04:01.169963 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.169951 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl5ht\" (UniqueName: \"kubernetes.io/projected/83dbfa60-19ba-415c-9df9-d865b7b18be8-kube-api-access-jl5ht\") pod \"insights-runtime-extractor-wlbw7\" (UID: \"83dbfa60-19ba-415c-9df9-d865b7b18be8\") " pod="openshift-insights/insights-runtime-extractor-wlbw7" Apr 16 18:04:01.193263 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.193229 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-9vgql"] Apr 16 18:04:01.193263 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.193259 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h49fn" Apr 16 18:04:01.195827 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.195806 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 18:04:01.195955 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.195873 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-j52ld\"" Apr 16 18:04:01.210114 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.209844 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6984c869b6-m8kb6"] Apr 16 18:04:01.210114 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.209984 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-586b57c7b4-9vgql"
Apr 16 18:04:01.215086 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.214018 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-5znbm\""
Apr 16 18:04:01.215086 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.214233 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 18:04:01.215086 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.214485 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 18:04:01.250680 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.248223 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h49fn"]
Apr 16 18:04:01.250680 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.248255 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-9vgql"]
Apr 16 18:04:01.250680 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.248271 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6984c869b6-m8kb6"]
Apr 16 18:04:01.250680 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.248400 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6984c869b6-m8kb6"
Apr 16 18:04:01.271831 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.270642 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a80ad2ed-b486-42ea-b4ae-c06c47c167f5-trusted-ca\") pod \"image-registry-6984c869b6-m8kb6\" (UID: \"a80ad2ed-b486-42ea-b4ae-c06c47c167f5\") " pod="openshift-image-registry/image-registry-6984c869b6-m8kb6"
Apr 16 18:04:01.271831 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.270686 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a80ad2ed-b486-42ea-b4ae-c06c47c167f5-bound-sa-token\") pod \"image-registry-6984c869b6-m8kb6\" (UID: \"a80ad2ed-b486-42ea-b4ae-c06c47c167f5\") " pod="openshift-image-registry/image-registry-6984c869b6-m8kb6"
Apr 16 18:04:01.271831 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.270723 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/83dbfa60-19ba-415c-9df9-d865b7b18be8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wlbw7\" (UID: \"83dbfa60-19ba-415c-9df9-d865b7b18be8\") " pod="openshift-insights/insights-runtime-extractor-wlbw7"
Apr 16 18:04:01.271831 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.270751 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a80ad2ed-b486-42ea-b4ae-c06c47c167f5-installation-pull-secrets\") pod \"image-registry-6984c869b6-m8kb6\" (UID: \"a80ad2ed-b486-42ea-b4ae-c06c47c167f5\") " pod="openshift-image-registry/image-registry-6984c869b6-m8kb6"
Apr 16 18:04:01.271831 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.270778 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a80ad2ed-b486-42ea-b4ae-c06c47c167f5-image-registry-private-configuration\") pod \"image-registry-6984c869b6-m8kb6\" (UID: \"a80ad2ed-b486-42ea-b4ae-c06c47c167f5\") " pod="openshift-image-registry/image-registry-6984c869b6-m8kb6"
Apr 16 18:04:01.271831 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.270815 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/83dbfa60-19ba-415c-9df9-d865b7b18be8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wlbw7\" (UID: \"83dbfa60-19ba-415c-9df9-d865b7b18be8\") " pod="openshift-insights/insights-runtime-extractor-wlbw7"
Apr 16 18:04:01.271831 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.270841 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/83dbfa60-19ba-415c-9df9-d865b7b18be8-data-volume\") pod \"insights-runtime-extractor-wlbw7\" (UID: \"83dbfa60-19ba-415c-9df9-d865b7b18be8\") " pod="openshift-insights/insights-runtime-extractor-wlbw7"
Apr 16 18:04:01.271831 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.270873 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a80ad2ed-b486-42ea-b4ae-c06c47c167f5-ca-trust-extracted\") pod \"image-registry-6984c869b6-m8kb6\" (UID: \"a80ad2ed-b486-42ea-b4ae-c06c47c167f5\") " pod="openshift-image-registry/image-registry-6984c869b6-m8kb6"
Apr 16 18:04:01.271831 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.270903 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/41432f2c-9579-47bd-9382-9700b101f44d-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-h49fn\" (UID: \"41432f2c-9579-47bd-9382-9700b101f44d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h49fn"
Apr 16 18:04:01.271831 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.270935 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jl5ht\" (UniqueName: \"kubernetes.io/projected/83dbfa60-19ba-415c-9df9-d865b7b18be8-kube-api-access-jl5ht\") pod \"insights-runtime-extractor-wlbw7\" (UID: \"83dbfa60-19ba-415c-9df9-d865b7b18be8\") " pod="openshift-insights/insights-runtime-extractor-wlbw7"
Apr 16 18:04:01.271831 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.270958 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25tw8\" (UniqueName: \"kubernetes.io/projected/e4fac77e-76ed-47ba-a915-a3fad29536cf-kube-api-access-25tw8\") pod \"downloads-586b57c7b4-9vgql\" (UID: \"e4fac77e-76ed-47ba-a915-a3fad29536cf\") " pod="openshift-console/downloads-586b57c7b4-9vgql"
Apr 16 18:04:01.271831 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.270989 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a80ad2ed-b486-42ea-b4ae-c06c47c167f5-registry-certificates\") pod \"image-registry-6984c869b6-m8kb6\" (UID: \"a80ad2ed-b486-42ea-b4ae-c06c47c167f5\") " pod="openshift-image-registry/image-registry-6984c869b6-m8kb6"
Apr 16 18:04:01.271831 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.271023 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a80ad2ed-b486-42ea-b4ae-c06c47c167f5-registry-tls\") pod \"image-registry-6984c869b6-m8kb6\" (UID: \"a80ad2ed-b486-42ea-b4ae-c06c47c167f5\") " pod="openshift-image-registry/image-registry-6984c869b6-m8kb6"
Apr 16 18:04:01.271831 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.271049 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p5bt\" (UniqueName: \"kubernetes.io/projected/a80ad2ed-b486-42ea-b4ae-c06c47c167f5-kube-api-access-4p5bt\") pod \"image-registry-6984c869b6-m8kb6\" (UID: \"a80ad2ed-b486-42ea-b4ae-c06c47c167f5\") " pod="openshift-image-registry/image-registry-6984c869b6-m8kb6"
Apr 16 18:04:01.271831 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.271094 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/83dbfa60-19ba-415c-9df9-d865b7b18be8-crio-socket\") pod \"insights-runtime-extractor-wlbw7\" (UID: \"83dbfa60-19ba-415c-9df9-d865b7b18be8\") " pod="openshift-insights/insights-runtime-extractor-wlbw7"
Apr 16 18:04:01.272818 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.271386 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/83dbfa60-19ba-415c-9df9-d865b7b18be8-crio-socket\") pod \"insights-runtime-extractor-wlbw7\" (UID: \"83dbfa60-19ba-415c-9df9-d865b7b18be8\") " pod="openshift-insights/insights-runtime-extractor-wlbw7"
Apr 16 18:04:01.273021 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.272997 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/83dbfa60-19ba-415c-9df9-d865b7b18be8-data-volume\") pod \"insights-runtime-extractor-wlbw7\" (UID: \"83dbfa60-19ba-415c-9df9-d865b7b18be8\") " pod="openshift-insights/insights-runtime-extractor-wlbw7"
Apr 16 18:04:01.273194 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.273151 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/83dbfa60-19ba-415c-9df9-d865b7b18be8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wlbw7\" (UID: \"83dbfa60-19ba-415c-9df9-d865b7b18be8\") " pod="openshift-insights/insights-runtime-extractor-wlbw7"
Apr 16 18:04:01.275857 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.275831 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/83dbfa60-19ba-415c-9df9-d865b7b18be8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wlbw7\" (UID: \"83dbfa60-19ba-415c-9df9-d865b7b18be8\") " pod="openshift-insights/insights-runtime-extractor-wlbw7"
Apr 16 18:04:01.297128 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.297108 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl5ht\" (UniqueName: \"kubernetes.io/projected/83dbfa60-19ba-415c-9df9-d865b7b18be8-kube-api-access-jl5ht\") pod \"insights-runtime-extractor-wlbw7\" (UID: \"83dbfa60-19ba-415c-9df9-d865b7b18be8\") " pod="openshift-insights/insights-runtime-extractor-wlbw7"
Apr 16 18:04:01.308372 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.308350 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-pffzm"]
Apr 16 18:04:01.310915 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:04:01.310891 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4aea7302_70a3_484c_bdf8_4a745df33639.slice/crio-6c564c99b1fbe9340da8d9c6029525d0161f2c263e3045e8a39b4187d514fbc1 WatchSource:0}: Error finding container 6c564c99b1fbe9340da8d9c6029525d0161f2c263e3045e8a39b4187d514fbc1: Status 404 returned error can't find the container with id 6c564c99b1fbe9340da8d9c6029525d0161f2c263e3045e8a39b4187d514fbc1
Apr 16 18:04:01.371676 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.371647 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a80ad2ed-b486-42ea-b4ae-c06c47c167f5-trusted-ca\") pod \"image-registry-6984c869b6-m8kb6\" (UID: \"a80ad2ed-b486-42ea-b4ae-c06c47c167f5\") " pod="openshift-image-registry/image-registry-6984c869b6-m8kb6"
Apr 16 18:04:01.371676 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.371684 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a80ad2ed-b486-42ea-b4ae-c06c47c167f5-bound-sa-token\") pod \"image-registry-6984c869b6-m8kb6\" (UID: \"a80ad2ed-b486-42ea-b4ae-c06c47c167f5\") " pod="openshift-image-registry/image-registry-6984c869b6-m8kb6"
Apr 16 18:04:01.371907 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.371705 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a80ad2ed-b486-42ea-b4ae-c06c47c167f5-installation-pull-secrets\") pod \"image-registry-6984c869b6-m8kb6\" (UID: \"a80ad2ed-b486-42ea-b4ae-c06c47c167f5\") " pod="openshift-image-registry/image-registry-6984c869b6-m8kb6"
Apr 16 18:04:01.371907 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.371724 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a80ad2ed-b486-42ea-b4ae-c06c47c167f5-image-registry-private-configuration\") pod \"image-registry-6984c869b6-m8kb6\" (UID: \"a80ad2ed-b486-42ea-b4ae-c06c47c167f5\") " pod="openshift-image-registry/image-registry-6984c869b6-m8kb6"
Apr 16 18:04:01.371907 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.371746 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a80ad2ed-b486-42ea-b4ae-c06c47c167f5-ca-trust-extracted\") pod \"image-registry-6984c869b6-m8kb6\" (UID: \"a80ad2ed-b486-42ea-b4ae-c06c47c167f5\") " pod="openshift-image-registry/image-registry-6984c869b6-m8kb6"
Apr 16 18:04:01.371907 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.371764 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/41432f2c-9579-47bd-9382-9700b101f44d-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-h49fn\" (UID: \"41432f2c-9579-47bd-9382-9700b101f44d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h49fn"
Apr 16 18:04:01.372104 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.371919 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-25tw8\" (UniqueName: \"kubernetes.io/projected/e4fac77e-76ed-47ba-a915-a3fad29536cf-kube-api-access-25tw8\") pod \"downloads-586b57c7b4-9vgql\" (UID: \"e4fac77e-76ed-47ba-a915-a3fad29536cf\") " pod="openshift-console/downloads-586b57c7b4-9vgql"
Apr 16 18:04:01.372104 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.371947 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a80ad2ed-b486-42ea-b4ae-c06c47c167f5-registry-certificates\") pod \"image-registry-6984c869b6-m8kb6\" (UID: \"a80ad2ed-b486-42ea-b4ae-c06c47c167f5\") " pod="openshift-image-registry/image-registry-6984c869b6-m8kb6"
Apr 16 18:04:01.372104 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.371973 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a80ad2ed-b486-42ea-b4ae-c06c47c167f5-registry-tls\") pod \"image-registry-6984c869b6-m8kb6\" (UID: \"a80ad2ed-b486-42ea-b4ae-c06c47c167f5\") " pod="openshift-image-registry/image-registry-6984c869b6-m8kb6"
Apr 16 18:04:01.372104 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.371997 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4p5bt\" (UniqueName: \"kubernetes.io/projected/a80ad2ed-b486-42ea-b4ae-c06c47c167f5-kube-api-access-4p5bt\") pod \"image-registry-6984c869b6-m8kb6\" (UID: \"a80ad2ed-b486-42ea-b4ae-c06c47c167f5\") " pod="openshift-image-registry/image-registry-6984c869b6-m8kb6"
Apr 16 18:04:01.372911 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.372659 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a80ad2ed-b486-42ea-b4ae-c06c47c167f5-ca-trust-extracted\") pod \"image-registry-6984c869b6-m8kb6\" (UID: \"a80ad2ed-b486-42ea-b4ae-c06c47c167f5\") " pod="openshift-image-registry/image-registry-6984c869b6-m8kb6"
Apr 16 18:04:01.372911 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.372795 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a80ad2ed-b486-42ea-b4ae-c06c47c167f5-trusted-ca\") pod \"image-registry-6984c869b6-m8kb6\" (UID: \"a80ad2ed-b486-42ea-b4ae-c06c47c167f5\") " pod="openshift-image-registry/image-registry-6984c869b6-m8kb6"
Apr 16 18:04:01.373225 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.373004 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a80ad2ed-b486-42ea-b4ae-c06c47c167f5-registry-certificates\") pod \"image-registry-6984c869b6-m8kb6\" (UID: \"a80ad2ed-b486-42ea-b4ae-c06c47c167f5\") " pod="openshift-image-registry/image-registry-6984c869b6-m8kb6"
Apr 16 18:04:01.374875 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.374844 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a80ad2ed-b486-42ea-b4ae-c06c47c167f5-registry-tls\") pod \"image-registry-6984c869b6-m8kb6\" (UID: \"a80ad2ed-b486-42ea-b4ae-c06c47c167f5\") " pod="openshift-image-registry/image-registry-6984c869b6-m8kb6"
Apr 16 18:04:01.374986 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.374953 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/41432f2c-9579-47bd-9382-9700b101f44d-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-h49fn\" (UID: \"41432f2c-9579-47bd-9382-9700b101f44d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h49fn"
Apr 16 18:04:01.375370 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.375350 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a80ad2ed-b486-42ea-b4ae-c06c47c167f5-image-registry-private-configuration\") pod \"image-registry-6984c869b6-m8kb6\" (UID: \"a80ad2ed-b486-42ea-b4ae-c06c47c167f5\") " pod="openshift-image-registry/image-registry-6984c869b6-m8kb6"
Apr 16 18:04:01.375462 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.375442 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a80ad2ed-b486-42ea-b4ae-c06c47c167f5-installation-pull-secrets\") pod \"image-registry-6984c869b6-m8kb6\" (UID: \"a80ad2ed-b486-42ea-b4ae-c06c47c167f5\") " pod="openshift-image-registry/image-registry-6984c869b6-m8kb6"
Apr 16 18:04:01.380622 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.380582 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p5bt\" (UniqueName: \"kubernetes.io/projected/a80ad2ed-b486-42ea-b4ae-c06c47c167f5-kube-api-access-4p5bt\") pod \"image-registry-6984c869b6-m8kb6\" (UID: \"a80ad2ed-b486-42ea-b4ae-c06c47c167f5\") " pod="openshift-image-registry/image-registry-6984c869b6-m8kb6"
Apr 16 18:04:01.380780 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.380764 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-25tw8\" (UniqueName: \"kubernetes.io/projected/e4fac77e-76ed-47ba-a915-a3fad29536cf-kube-api-access-25tw8\") pod \"downloads-586b57c7b4-9vgql\" (UID: \"e4fac77e-76ed-47ba-a915-a3fad29536cf\") " pod="openshift-console/downloads-586b57c7b4-9vgql"
Apr 16 18:04:01.382111 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.382091 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a80ad2ed-b486-42ea-b4ae-c06c47c167f5-bound-sa-token\") pod \"image-registry-6984c869b6-m8kb6\" (UID: \"a80ad2ed-b486-42ea-b4ae-c06c47c167f5\") " pod="openshift-image-registry/image-registry-6984c869b6-m8kb6"
Apr 16 18:04:01.409527 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.409490 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wlbw7"
Apr 16 18:04:01.503830 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.503759 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2t59p" event={"ID":"8abd6ace-6d1c-4823-ba9a-f0f8e17c654c","Type":"ContainerStarted","Data":"0d9e9453052d312105d944513ccdf316d8356ea6adcf942fee12a363abd0acec"}
Apr 16 18:04:01.504217 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.504199 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h49fn"
Apr 16 18:04:01.505041 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.505018 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2htvb" event={"ID":"5a9c1147-51d3-4765-a356-5a32f7bc0b67","Type":"ContainerStarted","Data":"ab3c4b5a18641e95e9e276967f27dade72347a6cb45aaba7abaa3e4d1e103fe8"}
Apr 16 18:04:01.506200 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.506178 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-pffzm" event={"ID":"4aea7302-70a3-484c-bdf8-4a745df33639","Type":"ContainerStarted","Data":"6c564c99b1fbe9340da8d9c6029525d0161f2c263e3045e8a39b4187d514fbc1"}
Apr 16 18:04:01.524625 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.524570 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2htvb" podStartSLOduration=32.848234352 podStartE2EDuration="35.524558627s" podCreationTimestamp="2026-04-16 18:03:26 +0000 UTC" firstStartedPulling="2026-04-16 18:03:58.493979252 +0000 UTC m=+64.864653138" lastFinishedPulling="2026-04-16 18:04:01.170303522 +0000 UTC m=+67.540977413" observedRunningTime="2026-04-16 18:04:01.523472487 +0000 UTC m=+67.894146395" watchObservedRunningTime="2026-04-16 18:04:01.524558627 +0000 UTC m=+67.895232533"
Apr 16 18:04:01.528652 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.528634 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-9vgql"
Apr 16 18:04:01.576760 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:01.576735 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6984c869b6-m8kb6"
Apr 16 18:04:02.018001 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.017975 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wlbw7"]
Apr 16 18:04:02.020698 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:04:02.020672 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83dbfa60_19ba_415c_9df9_d865b7b18be8.slice/crio-76b97204c30daaa4f53d6e921e5e38a283a81adcd963d236dd938ab0f102ac5b WatchSource:0}: Error finding container 76b97204c30daaa4f53d6e921e5e38a283a81adcd963d236dd938ab0f102ac5b: Status 404 returned error can't find the container with id 76b97204c30daaa4f53d6e921e5e38a283a81adcd963d236dd938ab0f102ac5b
Apr 16 18:04:02.029181 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.029160 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6984c869b6-m8kb6"]
Apr 16 18:04:02.031415 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:04:02.031360 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda80ad2ed_b486_42ea_b4ae_c06c47c167f5.slice/crio-ef5e38c955cdfa6242e71efec212a747dbb785bcb4af9b1f5d1ddd461b73f0d7 WatchSource:0}: Error finding container ef5e38c955cdfa6242e71efec212a747dbb785bcb4af9b1f5d1ddd461b73f0d7: Status 404 returned error can't find the container with id ef5e38c955cdfa6242e71efec212a747dbb785bcb4af9b1f5d1ddd461b73f0d7
Apr 16 18:04:02.032102 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.032079 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-9vgql"]
Apr 16 18:04:02.037139 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:04:02.037114 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4fac77e_76ed_47ba_a915_a3fad29536cf.slice/crio-e0e2c83a29ed962a7dfa9bc38caa44ec741a0a51e748a76dbd1b5e416b6718bf WatchSource:0}: Error finding container e0e2c83a29ed962a7dfa9bc38caa44ec741a0a51e748a76dbd1b5e416b6718bf: Status 404 returned error can't find the container with id e0e2c83a29ed962a7dfa9bc38caa44ec741a0a51e748a76dbd1b5e416b6718bf
Apr 16 18:04:02.046960 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.046939 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h49fn"]
Apr 16 18:04:02.049574 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:04:02.049552 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41432f2c_9579_47bd_9382_9700b101f44d.slice/crio-316253a28469bf8995dd9b4d019519aa81a6538b61598ce3ab65c097c19a6755 WatchSource:0}: Error finding container 316253a28469bf8995dd9b4d019519aa81a6538b61598ce3ab65c097c19a6755: Status 404 returned error can't find the container with id 316253a28469bf8995dd9b4d019519aa81a6538b61598ce3ab65c097c19a6755
Apr 16 18:04:02.485676 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.485645 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5b44bdbc6c-bgbcg"]
Apr 16 18:04:02.500983 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.500953 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b44bdbc6c-bgbcg"
Apr 16 18:04:02.505076 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.504889 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 18:04:02.505076 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.504945 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 18:04:02.505882 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.505597 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 18:04:02.505882 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.505651 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b44bdbc6c-bgbcg"]
Apr 16 18:04:02.505882 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.505815 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 18:04:02.505882 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.505831 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 18:04:02.506188 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.505967 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-bxlzl\""
Apr 16 18:04:02.511743 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.511701 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2t59p" event={"ID":"8abd6ace-6d1c-4823-ba9a-f0f8e17c654c","Type":"ContainerStarted","Data":"1880053da0b52f8d48deb6c32d1bfa37678ffd22b302011e4ea3c82edb1800e1"}
Apr 16 18:04:02.511871 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.511855 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-2t59p"
Apr 16 18:04:02.513393 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.513367 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6984c869b6-m8kb6" event={"ID":"a80ad2ed-b486-42ea-b4ae-c06c47c167f5","Type":"ContainerStarted","Data":"d7509d01d004faeee597d8a83c923adb611cdeec87e9dafa7aaa23a001259d56"}
Apr 16 18:04:02.513508 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.513399 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6984c869b6-m8kb6" event={"ID":"a80ad2ed-b486-42ea-b4ae-c06c47c167f5","Type":"ContainerStarted","Data":"ef5e38c955cdfa6242e71efec212a747dbb785bcb4af9b1f5d1ddd461b73f0d7"}
Apr 16 18:04:02.513508 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.513494 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6984c869b6-m8kb6"
Apr 16 18:04:02.514664 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.514636 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-9vgql" event={"ID":"e4fac77e-76ed-47ba-a915-a3fad29536cf","Type":"ContainerStarted","Data":"e0e2c83a29ed962a7dfa9bc38caa44ec741a0a51e748a76dbd1b5e416b6718bf"}
Apr 16 18:04:02.516403 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.516380 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2xjws" event={"ID":"47436224-f5c2-417d-bfdc-61bfb9b8939a","Type":"ContainerStarted","Data":"47404a8b2c9f163403633e7e9c0b0198e644bdfd01cc18720f66160fe7b79de8"}
Apr 16 18:04:02.516501 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.516409 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2xjws" event={"ID":"47436224-f5c2-417d-bfdc-61bfb9b8939a","Type":"ContainerStarted","Data":"cf614fd5eec9d54010a0d5d0918bf52a347d9706b7a718508a92d08396217877"}
Apr 16 18:04:02.517493 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.517474 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h49fn" event={"ID":"41432f2c-9579-47bd-9382-9700b101f44d","Type":"ContainerStarted","Data":"316253a28469bf8995dd9b4d019519aa81a6538b61598ce3ab65c097c19a6755"}
Apr 16 18:04:02.519186 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.519102 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wlbw7" event={"ID":"83dbfa60-19ba-415c-9df9-d865b7b18be8","Type":"ContainerStarted","Data":"a929565778049d650967e5563c847eda17ef6e0e8944f3cdfa7a2ee3989b702e"}
Apr 16 18:04:02.519186 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.519167 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wlbw7" event={"ID":"83dbfa60-19ba-415c-9df9-d865b7b18be8","Type":"ContainerStarted","Data":"76b97204c30daaa4f53d6e921e5e38a283a81adcd963d236dd938ab0f102ac5b"}
Apr 16 18:04:02.536475 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.536428 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2xjws" podStartSLOduration=66.064156088 podStartE2EDuration="1m8.536413042s" podCreationTimestamp="2026-04-16 18:02:54 +0000 UTC" firstStartedPulling="2026-04-16 18:03:59.125144464 +0000 UTC m=+65.495818364" lastFinishedPulling="2026-04-16 18:04:01.597401427 +0000 UTC m=+67.968075318" observedRunningTime="2026-04-16 18:04:02.535565106 +0000 UTC m=+68.906239013" watchObservedRunningTime="2026-04-16 18:04:02.536413042 +0000 UTC m=+68.907086953"
Apr 16 18:04:02.558182 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.557975 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-2t59p" podStartSLOduration=33.859926806 podStartE2EDuration="36.557957673s" podCreationTimestamp="2026-04-16 18:03:26 +0000 UTC" firstStartedPulling="2026-04-16 18:03:58.468641829 +0000 UTC m=+64.839315714" lastFinishedPulling="2026-04-16 18:04:01.166672681 +0000 UTC m=+67.537346581" observedRunningTime="2026-04-16 18:04:02.556841764 +0000 UTC m=+68.927515672" watchObservedRunningTime="2026-04-16 18:04:02.557957673 +0000 UTC m=+68.928631582"
Apr 16 18:04:02.582481 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.582365 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6984c869b6-m8kb6" podStartSLOduration=1.582345943 podStartE2EDuration="1.582345943s" podCreationTimestamp="2026-04-16 18:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:04:02.58124307 +0000 UTC m=+68.951916989" watchObservedRunningTime="2026-04-16 18:04:02.582345943 +0000 UTC m=+68.953019848"
Apr 16 18:04:02.583086 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.583054 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwsjb\" (UniqueName: \"kubernetes.io/projected/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-kube-api-access-jwsjb\") pod \"console-5b44bdbc6c-bgbcg\" (UID: \"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\") " pod="openshift-console/console-5b44bdbc6c-bgbcg"
Apr 16 18:04:02.583214 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.583098 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-console-config\") pod \"console-5b44bdbc6c-bgbcg\" (UID: \"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\") " pod="openshift-console/console-5b44bdbc6c-bgbcg"
Apr 16 18:04:02.583301 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.583281 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-console-serving-cert\") pod \"console-5b44bdbc6c-bgbcg\" (UID: \"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\") " pod="openshift-console/console-5b44bdbc6c-bgbcg"
Apr 16 18:04:02.583524 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.583505 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-console-oauth-config\") pod \"console-5b44bdbc6c-bgbcg\" (UID: \"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\") " pod="openshift-console/console-5b44bdbc6c-bgbcg"
Apr 16 18:04:02.583964 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.583945 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-service-ca\") pod \"console-5b44bdbc6c-bgbcg\" (UID: \"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\") " pod="openshift-console/console-5b44bdbc6c-bgbcg"
Apr 16 18:04:02.584626 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.584592 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-oauth-serving-cert\") pod \"console-5b44bdbc6c-bgbcg\" (UID: \"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\") " pod="openshift-console/console-5b44bdbc6c-bgbcg"
Apr 16 18:04:02.686291 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.686245 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwsjb\" (UniqueName: \"kubernetes.io/projected/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-kube-api-access-jwsjb\") pod \"console-5b44bdbc6c-bgbcg\" (UID: \"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\") " pod="openshift-console/console-5b44bdbc6c-bgbcg"
Apr 16 18:04:02.686459 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.686297 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-console-config\") pod \"console-5b44bdbc6c-bgbcg\" (UID: \"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\") " pod="openshift-console/console-5b44bdbc6c-bgbcg"
Apr 16 18:04:02.686459 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.686336 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-console-serving-cert\") pod \"console-5b44bdbc6c-bgbcg\" (UID: \"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\") " pod="openshift-console/console-5b44bdbc6c-bgbcg"
Apr 16 18:04:02.686459 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.686373 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-console-oauth-config\") pod \"console-5b44bdbc6c-bgbcg\" (UID: \"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\") " pod="openshift-console/console-5b44bdbc6c-bgbcg"
Apr 16 18:04:02.686459 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.686400 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-service-ca\") pod \"console-5b44bdbc6c-bgbcg\" (UID: \"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\") " pod="openshift-console/console-5b44bdbc6c-bgbcg"
Apr 16 18:04:02.686459 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.686437 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-oauth-serving-cert\") pod \"console-5b44bdbc6c-bgbcg\" (UID: \"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\") " pod="openshift-console/console-5b44bdbc6c-bgbcg"
Apr 16 18:04:02.688145 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.687379 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-oauth-serving-cert\") pod \"console-5b44bdbc6c-bgbcg\" (UID: \"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\") " pod="openshift-console/console-5b44bdbc6c-bgbcg"
Apr 16 18:04:02.688145 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.687778 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-console-config\") pod \"console-5b44bdbc6c-bgbcg\" (UID: \"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\") " pod="openshift-console/console-5b44bdbc6c-bgbcg"
Apr 16 18:04:02.688545 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.688482 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-service-ca\") pod \"console-5b44bdbc6c-bgbcg\" (UID: \"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\") " pod="openshift-console/console-5b44bdbc6c-bgbcg"
Apr 16 18:04:02.694273 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.694213 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-console-oauth-config\") pod \"console-5b44bdbc6c-bgbcg\" (UID: \"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\") " pod="openshift-console/console-5b44bdbc6c-bgbcg"
Apr 16 18:04:02.694372 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.694329 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-console-serving-cert\") pod \"console-5b44bdbc6c-bgbcg\" (UID:
\"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\") " pod="openshift-console/console-5b44bdbc6c-bgbcg" Apr 16 18:04:02.697169 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.696891 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwsjb\" (UniqueName: \"kubernetes.io/projected/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-kube-api-access-jwsjb\") pod \"console-5b44bdbc6c-bgbcg\" (UID: \"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\") " pod="openshift-console/console-5b44bdbc6c-bgbcg" Apr 16 18:04:02.814284 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:02.814205 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b44bdbc6c-bgbcg" Apr 16 18:04:03.062518 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:03.062482 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b44bdbc6c-bgbcg"] Apr 16 18:04:03.067434 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:04:03.067405 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc72c4323_89ec_4eb5_8ec0_5ec439d13c7e.slice/crio-33c7dab5aa1fa749e137c7e6518af420fdb4d8387cce862082e8e8fb41e2aced WatchSource:0}: Error finding container 33c7dab5aa1fa749e137c7e6518af420fdb4d8387cce862082e8e8fb41e2aced: Status 404 returned error can't find the container with id 33c7dab5aa1fa749e137c7e6518af420fdb4d8387cce862082e8e8fb41e2aced Apr 16 18:04:03.432346 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:03.432262 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xddxf" Apr 16 18:04:03.523006 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:03.522967 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b44bdbc6c-bgbcg" 
event={"ID":"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e","Type":"ContainerStarted","Data":"33c7dab5aa1fa749e137c7e6518af420fdb4d8387cce862082e8e8fb41e2aced"} Apr 16 18:04:04.528413 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:04.528209 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h49fn" event={"ID":"41432f2c-9579-47bd-9382-9700b101f44d","Type":"ContainerStarted","Data":"54fc4ca974e16c1d7d5ab6f016d2e5e41bae2347abd509b616c66cf9b6331f1a"} Apr 16 18:04:04.528885 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:04.528760 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h49fn" Apr 16 18:04:04.532224 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:04.532192 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wlbw7" event={"ID":"83dbfa60-19ba-415c-9df9-d865b7b18be8","Type":"ContainerStarted","Data":"02ff36cdacc0663bcd0b8a7ee6fcdf3c6529722205953aff7d74752c45c03b7b"} Apr 16 18:04:04.533858 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:04.533816 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-pffzm" event={"ID":"4aea7302-70a3-484c-bdf8-4a745df33639","Type":"ContainerStarted","Data":"81f070db80471b988e2659ea6b1b74d1d93e51277699f3f86e96ca248fa9cdb6"} Apr 16 18:04:04.534479 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:04.534457 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h49fn" Apr 16 18:04:04.545288 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:04.545237 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h49fn" podStartSLOduration=1.532210109 
podStartE2EDuration="3.545219298s" podCreationTimestamp="2026-04-16 18:04:01 +0000 UTC" firstStartedPulling="2026-04-16 18:04:02.051515538 +0000 UTC m=+68.422189424" lastFinishedPulling="2026-04-16 18:04:04.064524725 +0000 UTC m=+70.435198613" observedRunningTime="2026-04-16 18:04:04.544959338 +0000 UTC m=+70.915633246" watchObservedRunningTime="2026-04-16 18:04:04.545219298 +0000 UTC m=+70.915893205" Apr 16 18:04:04.561860 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:04.561801 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-pffzm" podStartSLOduration=33.817206583 podStartE2EDuration="36.561783745s" podCreationTimestamp="2026-04-16 18:03:28 +0000 UTC" firstStartedPulling="2026-04-16 18:04:01.312743492 +0000 UTC m=+67.683417376" lastFinishedPulling="2026-04-16 18:04:04.057320633 +0000 UTC m=+70.427994538" observedRunningTime="2026-04-16 18:04:04.560640862 +0000 UTC m=+70.931314770" watchObservedRunningTime="2026-04-16 18:04:04.561783745 +0000 UTC m=+70.932457654" Apr 16 18:04:05.538076 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:05.536851 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-lbk7p"] Apr 16 18:04:05.542069 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:05.541692 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-lbk7p" Apr 16 18:04:05.545876 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:05.545853 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 18:04:05.545974 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:05.545892 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-gr7gh\"" Apr 16 18:04:05.545974 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:05.545932 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 18:04:05.545974 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:05.545956 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 18:04:05.545974 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:05.545893 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 18:04:05.546217 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:05.545855 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 18:04:05.548386 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:05.548350 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-lbk7p"] Apr 16 18:04:05.611582 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:05.611547 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d929efe8-7163-4d2b-96b5-d313e3e8d987-metrics-client-ca\") pod \"prometheus-operator-78f957474d-lbk7p\" (UID: \"d929efe8-7163-4d2b-96b5-d313e3e8d987\") " 
pod="openshift-monitoring/prometheus-operator-78f957474d-lbk7p" Apr 16 18:04:05.611781 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:05.611632 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmmtq\" (UniqueName: \"kubernetes.io/projected/d929efe8-7163-4d2b-96b5-d313e3e8d987-kube-api-access-dmmtq\") pod \"prometheus-operator-78f957474d-lbk7p\" (UID: \"d929efe8-7163-4d2b-96b5-d313e3e8d987\") " pod="openshift-monitoring/prometheus-operator-78f957474d-lbk7p" Apr 16 18:04:05.611781 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:05.611667 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d929efe8-7163-4d2b-96b5-d313e3e8d987-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-lbk7p\" (UID: \"d929efe8-7163-4d2b-96b5-d313e3e8d987\") " pod="openshift-monitoring/prometheus-operator-78f957474d-lbk7p" Apr 16 18:04:05.611781 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:05.611724 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d929efe8-7163-4d2b-96b5-d313e3e8d987-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-lbk7p\" (UID: \"d929efe8-7163-4d2b-96b5-d313e3e8d987\") " pod="openshift-monitoring/prometheus-operator-78f957474d-lbk7p" Apr 16 18:04:05.712695 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:05.712657 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmmtq\" (UniqueName: \"kubernetes.io/projected/d929efe8-7163-4d2b-96b5-d313e3e8d987-kube-api-access-dmmtq\") pod \"prometheus-operator-78f957474d-lbk7p\" (UID: \"d929efe8-7163-4d2b-96b5-d313e3e8d987\") " pod="openshift-monitoring/prometheus-operator-78f957474d-lbk7p" Apr 16 18:04:05.712695 ip-10-0-131-193 
kubenswrapper[2564]: I0416 18:04:05.712698 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d929efe8-7163-4d2b-96b5-d313e3e8d987-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-lbk7p\" (UID: \"d929efe8-7163-4d2b-96b5-d313e3e8d987\") " pod="openshift-monitoring/prometheus-operator-78f957474d-lbk7p" Apr 16 18:04:05.712937 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:05.712764 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d929efe8-7163-4d2b-96b5-d313e3e8d987-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-lbk7p\" (UID: \"d929efe8-7163-4d2b-96b5-d313e3e8d987\") " pod="openshift-monitoring/prometheus-operator-78f957474d-lbk7p" Apr 16 18:04:05.712937 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:05.712830 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d929efe8-7163-4d2b-96b5-d313e3e8d987-metrics-client-ca\") pod \"prometheus-operator-78f957474d-lbk7p\" (UID: \"d929efe8-7163-4d2b-96b5-d313e3e8d987\") " pod="openshift-monitoring/prometheus-operator-78f957474d-lbk7p" Apr 16 18:04:05.713713 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:05.713658 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d929efe8-7163-4d2b-96b5-d313e3e8d987-metrics-client-ca\") pod \"prometheus-operator-78f957474d-lbk7p\" (UID: \"d929efe8-7163-4d2b-96b5-d313e3e8d987\") " pod="openshift-monitoring/prometheus-operator-78f957474d-lbk7p" Apr 16 18:04:05.715953 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:05.715793 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/d929efe8-7163-4d2b-96b5-d313e3e8d987-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-lbk7p\" (UID: \"d929efe8-7163-4d2b-96b5-d313e3e8d987\") " pod="openshift-monitoring/prometheus-operator-78f957474d-lbk7p" Apr 16 18:04:05.715953 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:05.715903 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d929efe8-7163-4d2b-96b5-d313e3e8d987-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-lbk7p\" (UID: \"d929efe8-7163-4d2b-96b5-d313e3e8d987\") " pod="openshift-monitoring/prometheus-operator-78f957474d-lbk7p" Apr 16 18:04:05.721459 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:05.721426 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmmtq\" (UniqueName: \"kubernetes.io/projected/d929efe8-7163-4d2b-96b5-d313e3e8d987-kube-api-access-dmmtq\") pod \"prometheus-operator-78f957474d-lbk7p\" (UID: \"d929efe8-7163-4d2b-96b5-d313e3e8d987\") " pod="openshift-monitoring/prometheus-operator-78f957474d-lbk7p" Apr 16 18:04:05.856658 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:05.856551 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-lbk7p" Apr 16 18:04:07.269279 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:07.269248 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-lbk7p"] Apr 16 18:04:07.273702 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:04:07.273649 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd929efe8_7163_4d2b_96b5_d313e3e8d987.slice/crio-dacdecc8db855aed8412f16c7688820d973a34061440a41828dbe8e6e1ab8575 WatchSource:0}: Error finding container dacdecc8db855aed8412f16c7688820d973a34061440a41828dbe8e6e1ab8575: Status 404 returned error can't find the container with id dacdecc8db855aed8412f16c7688820d973a34061440a41828dbe8e6e1ab8575 Apr 16 18:04:07.548400 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:07.548359 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wlbw7" event={"ID":"83dbfa60-19ba-415c-9df9-d865b7b18be8","Type":"ContainerStarted","Data":"bfe895ae65c51b9fcd47db6677d9d86e430dede2f895ee62cbe6ae0f1932a5fe"} Apr 16 18:04:07.549631 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:07.549583 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-lbk7p" event={"ID":"d929efe8-7163-4d2b-96b5-d313e3e8d987","Type":"ContainerStarted","Data":"dacdecc8db855aed8412f16c7688820d973a34061440a41828dbe8e6e1ab8575"} Apr 16 18:04:07.551043 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:07.551019 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b44bdbc6c-bgbcg" event={"ID":"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e","Type":"ContainerStarted","Data":"a9bcfd1ca7bf0f5e4e374bef1440f0cec10164763689b9afed3fdf79000a39fb"} Apr 16 18:04:07.571395 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:07.571341 2564 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-wlbw7" podStartSLOduration=1.591815239 podStartE2EDuration="6.571323372s" podCreationTimestamp="2026-04-16 18:04:01 +0000 UTC" firstStartedPulling="2026-04-16 18:04:02.145091949 +0000 UTC m=+68.515765833" lastFinishedPulling="2026-04-16 18:04:07.124600065 +0000 UTC m=+73.495273966" observedRunningTime="2026-04-16 18:04:07.569836967 +0000 UTC m=+73.940510889" watchObservedRunningTime="2026-04-16 18:04:07.571323372 +0000 UTC m=+73.941997280" Apr 16 18:04:07.590517 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:07.590462 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5b44bdbc6c-bgbcg" podStartSLOduration=1.527621318 podStartE2EDuration="5.590445696s" podCreationTimestamp="2026-04-16 18:04:02 +0000 UTC" firstStartedPulling="2026-04-16 18:04:03.069896513 +0000 UTC m=+69.440570399" lastFinishedPulling="2026-04-16 18:04:07.132720877 +0000 UTC m=+73.503394777" observedRunningTime="2026-04-16 18:04:07.589660363 +0000 UTC m=+73.960334271" watchObservedRunningTime="2026-04-16 18:04:07.590445696 +0000 UTC m=+73.961119602" Apr 16 18:04:09.558779 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:09.558737 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-lbk7p" event={"ID":"d929efe8-7163-4d2b-96b5-d313e3e8d987","Type":"ContainerStarted","Data":"285078498ef34e7b7fabfcec4c8dcbc86266381ab0ffc26f7628c6780b411ee9"} Apr 16 18:04:09.559221 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:09.558787 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-lbk7p" event={"ID":"d929efe8-7163-4d2b-96b5-d313e3e8d987","Type":"ContainerStarted","Data":"816595f103d356cfaf4b452b2ef6e876769c6fd918ceb013474107da0adae8a7"} Apr 16 18:04:09.577521 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:09.577469 2564 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-lbk7p" podStartSLOduration=3.156780606 podStartE2EDuration="4.577453513s" podCreationTimestamp="2026-04-16 18:04:05 +0000 UTC" firstStartedPulling="2026-04-16 18:04:07.27550203 +0000 UTC m=+73.646175915" lastFinishedPulling="2026-04-16 18:04:08.696174926 +0000 UTC m=+75.066848822" observedRunningTime="2026-04-16 18:04:09.575586427 +0000 UTC m=+75.946260333" watchObservedRunningTime="2026-04-16 18:04:09.577453513 +0000 UTC m=+75.948127419" Apr 16 18:04:12.069214 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.069079 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-z5wsv"] Apr 16 18:04:12.082067 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.082035 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:12.084795 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.084770 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 18:04:12.084927 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.084781 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 18:04:12.085471 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.085415 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-dln79\"" Apr 16 18:04:12.085577 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.085480 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 18:04:12.173841 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.173800 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/76cd3b9c-5d86-460e-a147-c0872c568583-node-exporter-accelerators-collector-config\") pod \"node-exporter-z5wsv\" (UID: \"76cd3b9c-5d86-460e-a147-c0872c568583\") " pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:12.174011 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.173847 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srv4m\" (UniqueName: \"kubernetes.io/projected/76cd3b9c-5d86-460e-a147-c0872c568583-kube-api-access-srv4m\") pod \"node-exporter-z5wsv\" (UID: \"76cd3b9c-5d86-460e-a147-c0872c568583\") " pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:12.174011 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.173884 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/76cd3b9c-5d86-460e-a147-c0872c568583-node-exporter-tls\") pod \"node-exporter-z5wsv\" (UID: \"76cd3b9c-5d86-460e-a147-c0872c568583\") " pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:12.174011 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.173976 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/76cd3b9c-5d86-460e-a147-c0872c568583-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-z5wsv\" (UID: \"76cd3b9c-5d86-460e-a147-c0872c568583\") " pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:12.174165 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.174026 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/76cd3b9c-5d86-460e-a147-c0872c568583-sys\") pod \"node-exporter-z5wsv\" (UID: 
\"76cd3b9c-5d86-460e-a147-c0872c568583\") " pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:12.174165 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.174099 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/76cd3b9c-5d86-460e-a147-c0872c568583-node-exporter-textfile\") pod \"node-exporter-z5wsv\" (UID: \"76cd3b9c-5d86-460e-a147-c0872c568583\") " pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:12.174165 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.174157 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/76cd3b9c-5d86-460e-a147-c0872c568583-node-exporter-wtmp\") pod \"node-exporter-z5wsv\" (UID: \"76cd3b9c-5d86-460e-a147-c0872c568583\") " pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:12.174266 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.174224 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/76cd3b9c-5d86-460e-a147-c0872c568583-metrics-client-ca\") pod \"node-exporter-z5wsv\" (UID: \"76cd3b9c-5d86-460e-a147-c0872c568583\") " pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:12.174319 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.174268 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/76cd3b9c-5d86-460e-a147-c0872c568583-root\") pod \"node-exporter-z5wsv\" (UID: \"76cd3b9c-5d86-460e-a147-c0872c568583\") " pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:12.275196 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.275159 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/76cd3b9c-5d86-460e-a147-c0872c568583-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-z5wsv\" (UID: \"76cd3b9c-5d86-460e-a147-c0872c568583\") " pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:12.275345 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.275216 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/76cd3b9c-5d86-460e-a147-c0872c568583-sys\") pod \"node-exporter-z5wsv\" (UID: \"76cd3b9c-5d86-460e-a147-c0872c568583\") " pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:12.275345 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.275280 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/76cd3b9c-5d86-460e-a147-c0872c568583-sys\") pod \"node-exporter-z5wsv\" (UID: \"76cd3b9c-5d86-460e-a147-c0872c568583\") " pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:12.275345 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.275316 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/76cd3b9c-5d86-460e-a147-c0872c568583-node-exporter-textfile\") pod \"node-exporter-z5wsv\" (UID: \"76cd3b9c-5d86-460e-a147-c0872c568583\") " pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:12.275345 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.275338 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/76cd3b9c-5d86-460e-a147-c0872c568583-node-exporter-wtmp\") pod \"node-exporter-z5wsv\" (UID: \"76cd3b9c-5d86-460e-a147-c0872c568583\") " pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:12.275549 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.275367 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/76cd3b9c-5d86-460e-a147-c0872c568583-metrics-client-ca\") pod \"node-exporter-z5wsv\" (UID: \"76cd3b9c-5d86-460e-a147-c0872c568583\") " pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:12.275549 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.275392 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/76cd3b9c-5d86-460e-a147-c0872c568583-root\") pod \"node-exporter-z5wsv\" (UID: \"76cd3b9c-5d86-460e-a147-c0872c568583\") " pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:12.275549 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.275414 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/76cd3b9c-5d86-460e-a147-c0872c568583-node-exporter-accelerators-collector-config\") pod \"node-exporter-z5wsv\" (UID: \"76cd3b9c-5d86-460e-a147-c0872c568583\") " pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:12.275549 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.275430 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-srv4m\" (UniqueName: \"kubernetes.io/projected/76cd3b9c-5d86-460e-a147-c0872c568583-kube-api-access-srv4m\") pod \"node-exporter-z5wsv\" (UID: \"76cd3b9c-5d86-460e-a147-c0872c568583\") " pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:12.275549 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.275451 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/76cd3b9c-5d86-460e-a147-c0872c568583-node-exporter-tls\") pod \"node-exporter-z5wsv\" (UID: \"76cd3b9c-5d86-460e-a147-c0872c568583\") " pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:12.275549 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:04:12.275539 2564 secret.go:189] Couldn't get secret 
openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 18:04:12.275843 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:04:12.275581 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76cd3b9c-5d86-460e-a147-c0872c568583-node-exporter-tls podName:76cd3b9c-5d86-460e-a147-c0872c568583 nodeName:}" failed. No retries permitted until 2026-04-16 18:04:12.775566881 +0000 UTC m=+79.146240767 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/76cd3b9c-5d86-460e-a147-c0872c568583-node-exporter-tls") pod "node-exporter-z5wsv" (UID: "76cd3b9c-5d86-460e-a147-c0872c568583") : secret "node-exporter-tls" not found Apr 16 18:04:12.275843 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.275663 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/76cd3b9c-5d86-460e-a147-c0872c568583-root\") pod \"node-exporter-z5wsv\" (UID: \"76cd3b9c-5d86-460e-a147-c0872c568583\") " pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:12.276376 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.275962 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/76cd3b9c-5d86-460e-a147-c0872c568583-node-exporter-textfile\") pod \"node-exporter-z5wsv\" (UID: \"76cd3b9c-5d86-460e-a147-c0872c568583\") " pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:12.276376 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.276081 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/76cd3b9c-5d86-460e-a147-c0872c568583-node-exporter-wtmp\") pod \"node-exporter-z5wsv\" (UID: \"76cd3b9c-5d86-460e-a147-c0872c568583\") " pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:12.276376 ip-10-0-131-193 kubenswrapper[2564]: I0416 
18:04:12.276370 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/76cd3b9c-5d86-460e-a147-c0872c568583-node-exporter-accelerators-collector-config\") pod \"node-exporter-z5wsv\" (UID: \"76cd3b9c-5d86-460e-a147-c0872c568583\") " pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:12.276813 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.276791 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/76cd3b9c-5d86-460e-a147-c0872c568583-metrics-client-ca\") pod \"node-exporter-z5wsv\" (UID: \"76cd3b9c-5d86-460e-a147-c0872c568583\") " pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:12.278085 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.278040 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/76cd3b9c-5d86-460e-a147-c0872c568583-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-z5wsv\" (UID: \"76cd3b9c-5d86-460e-a147-c0872c568583\") " pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:12.284711 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.284669 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-srv4m\" (UniqueName: \"kubernetes.io/projected/76cd3b9c-5d86-460e-a147-c0872c568583-kube-api-access-srv4m\") pod \"node-exporter-z5wsv\" (UID: \"76cd3b9c-5d86-460e-a147-c0872c568583\") " pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:12.525572 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.525543 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-2t59p" Apr 16 18:04:12.780543 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.780465 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/76cd3b9c-5d86-460e-a147-c0872c568583-node-exporter-tls\") pod \"node-exporter-z5wsv\" (UID: \"76cd3b9c-5d86-460e-a147-c0872c568583\") " pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:12.782937 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.782904 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/76cd3b9c-5d86-460e-a147-c0872c568583-node-exporter-tls\") pod \"node-exporter-z5wsv\" (UID: \"76cd3b9c-5d86-460e-a147-c0872c568583\") " pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:12.814803 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.814768 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5b44bdbc6c-bgbcg" Apr 16 18:04:12.814967 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.814871 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5b44bdbc6c-bgbcg" Apr 16 18:04:12.820024 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.820005 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5b44bdbc6c-bgbcg" Apr 16 18:04:12.993715 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:12.993670 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-z5wsv" Apr 16 18:04:13.575070 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:13.575037 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5b44bdbc6c-bgbcg" Apr 16 18:04:18.785278 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:04:18.785233 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76cd3b9c_5d86_460e_a147_c0872c568583.slice/crio-0a6385cdc3f64c069ffe5a69b3ba38770c1641968c920072ffa879b4f329d02c WatchSource:0}: Error finding container 0a6385cdc3f64c069ffe5a69b3ba38770c1641968c920072ffa879b4f329d02c: Status 404 returned error can't find the container with id 0a6385cdc3f64c069ffe5a69b3ba38770c1641968c920072ffa879b4f329d02c Apr 16 18:04:19.590853 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:19.590804 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-9vgql" event={"ID":"e4fac77e-76ed-47ba-a915-a3fad29536cf","Type":"ContainerStarted","Data":"472f3f8603fb1f03c9c092d459f951a2aaedd303fe376bce84e40906dfa007d1"} Apr 16 18:04:19.591309 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:19.591280 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-9vgql" Apr 16 18:04:19.592243 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:19.592218 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-z5wsv" event={"ID":"76cd3b9c-5d86-460e-a147-c0872c568583","Type":"ContainerStarted","Data":"0a6385cdc3f64c069ffe5a69b3ba38770c1641968c920072ffa879b4f329d02c"} Apr 16 18:04:19.602147 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:19.602123 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-9vgql" Apr 16 18:04:19.610127 ip-10-0-131-193 kubenswrapper[2564]: I0416 
18:04:19.610082 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-9vgql" podStartSLOduration=1.78243394 podStartE2EDuration="18.610068732s" podCreationTimestamp="2026-04-16 18:04:01 +0000 UTC" firstStartedPulling="2026-04-16 18:04:02.03903672 +0000 UTC m=+68.409710608" lastFinishedPulling="2026-04-16 18:04:18.866671501 +0000 UTC m=+85.237345400" observedRunningTime="2026-04-16 18:04:19.608774347 +0000 UTC m=+85.979448253" watchObservedRunningTime="2026-04-16 18:04:19.610068732 +0000 UTC m=+85.980742638" Apr 16 18:04:20.596888 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:20.596848 2564 generic.go:358] "Generic (PLEG): container finished" podID="76cd3b9c-5d86-460e-a147-c0872c568583" containerID="d4b71829c79f8b70f54918dd90364e07270408d958fb37c38b7cb5bf16fbc519" exitCode=0 Apr 16 18:04:20.597286 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:20.596944 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-z5wsv" event={"ID":"76cd3b9c-5d86-460e-a147-c0872c568583","Type":"ContainerDied","Data":"d4b71829c79f8b70f54918dd90364e07270408d958fb37c38b7cb5bf16fbc519"} Apr 16 18:04:21.512533 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:21.512500 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" Apr 16 18:04:21.602258 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:21.602214 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-z5wsv" event={"ID":"76cd3b9c-5d86-460e-a147-c0872c568583","Type":"ContainerStarted","Data":"6be60780ad19b0148a3be3d3f874b90a71f40a168b7f77d5cec86b283d440afe"} Apr 16 18:04:21.602258 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:21.602260 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-z5wsv" 
event={"ID":"76cd3b9c-5d86-460e-a147-c0872c568583","Type":"ContainerStarted","Data":"4955c4ba0b5107d4f7f347b9c50540862b1984aebe1e8373a5aeb3d1ca9119a5"} Apr 16 18:04:21.628156 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:21.628109 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-z5wsv" podStartSLOduration=8.771783046 podStartE2EDuration="9.628089554s" podCreationTimestamp="2026-04-16 18:04:12 +0000 UTC" firstStartedPulling="2026-04-16 18:04:18.786811799 +0000 UTC m=+85.157485685" lastFinishedPulling="2026-04-16 18:04:19.643118293 +0000 UTC m=+86.013792193" observedRunningTime="2026-04-16 18:04:21.627359762 +0000 UTC m=+87.998033670" watchObservedRunningTime="2026-04-16 18:04:21.628089554 +0000 UTC m=+87.998763464" Apr 16 18:04:23.527363 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:23.527329 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6984c869b6-m8kb6" Apr 16 18:04:26.524877 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:26.524828 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" podUID="5b47d4c2-caca-46dd-b761-e865026dbdb9" containerName="registry" containerID="cri-o://87aeb85876ec9bfbc27f61981b6a36379cd44f07869d0f7f6a969bd4aa76d8d2" gracePeriod=30 Apr 16 18:04:26.796169 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:26.796126 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" Apr 16 18:04:26.908981 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:26.908945 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-tls\") pod \"5b47d4c2-caca-46dd-b761-e865026dbdb9\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " Apr 16 18:04:26.909153 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:26.909004 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5b47d4c2-caca-46dd-b761-e865026dbdb9-ca-trust-extracted\") pod \"5b47d4c2-caca-46dd-b761-e865026dbdb9\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " Apr 16 18:04:26.909153 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:26.909046 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-bound-sa-token\") pod \"5b47d4c2-caca-46dd-b761-e865026dbdb9\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " Apr 16 18:04:26.909153 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:26.909104 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5b47d4c2-caca-46dd-b761-e865026dbdb9-installation-pull-secrets\") pod \"5b47d4c2-caca-46dd-b761-e865026dbdb9\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " Apr 16 18:04:26.909153 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:26.909140 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5b47d4c2-caca-46dd-b761-e865026dbdb9-image-registry-private-configuration\") pod \"5b47d4c2-caca-46dd-b761-e865026dbdb9\" (UID: 
\"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " Apr 16 18:04:26.909354 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:26.909189 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b47d4c2-caca-46dd-b761-e865026dbdb9-trusted-ca\") pod \"5b47d4c2-caca-46dd-b761-e865026dbdb9\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " Apr 16 18:04:26.909354 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:26.909213 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-certificates\") pod \"5b47d4c2-caca-46dd-b761-e865026dbdb9\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " Apr 16 18:04:26.909354 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:26.909241 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b92kq\" (UniqueName: \"kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-kube-api-access-b92kq\") pod \"5b47d4c2-caca-46dd-b761-e865026dbdb9\" (UID: \"5b47d4c2-caca-46dd-b761-e865026dbdb9\") " Apr 16 18:04:26.909718 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:26.909674 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b47d4c2-caca-46dd-b761-e865026dbdb9-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5b47d4c2-caca-46dd-b761-e865026dbdb9" (UID: "5b47d4c2-caca-46dd-b761-e865026dbdb9"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:04:26.909857 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:26.909826 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5b47d4c2-caca-46dd-b761-e865026dbdb9" (UID: "5b47d4c2-caca-46dd-b761-e865026dbdb9"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:04:26.911681 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:26.911633 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b47d4c2-caca-46dd-b761-e865026dbdb9-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5b47d4c2-caca-46dd-b761-e865026dbdb9" (UID: "5b47d4c2-caca-46dd-b761-e865026dbdb9"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:04:26.911789 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:26.911681 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5b47d4c2-caca-46dd-b761-e865026dbdb9" (UID: "5b47d4c2-caca-46dd-b761-e865026dbdb9"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:04:26.911789 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:26.911728 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5b47d4c2-caca-46dd-b761-e865026dbdb9" (UID: "5b47d4c2-caca-46dd-b761-e865026dbdb9"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:04:26.911993 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:26.911969 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-kube-api-access-b92kq" (OuterVolumeSpecName: "kube-api-access-b92kq") pod "5b47d4c2-caca-46dd-b761-e865026dbdb9" (UID: "5b47d4c2-caca-46dd-b761-e865026dbdb9"). InnerVolumeSpecName "kube-api-access-b92kq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:04:26.912150 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:26.912125 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b47d4c2-caca-46dd-b761-e865026dbdb9-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "5b47d4c2-caca-46dd-b761-e865026dbdb9" (UID: "5b47d4c2-caca-46dd-b761-e865026dbdb9"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:04:26.920379 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:26.920351 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b47d4c2-caca-46dd-b761-e865026dbdb9-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5b47d4c2-caca-46dd-b761-e865026dbdb9" (UID: "5b47d4c2-caca-46dd-b761-e865026dbdb9"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:04:27.010623 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:27.010578 2564 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-bound-sa-token\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\"" Apr 16 18:04:27.010623 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:27.010605 2564 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5b47d4c2-caca-46dd-b761-e865026dbdb9-installation-pull-secrets\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\"" Apr 16 18:04:27.010623 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:27.010628 2564 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5b47d4c2-caca-46dd-b761-e865026dbdb9-image-registry-private-configuration\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\"" Apr 16 18:04:27.010885 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:27.010638 2564 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b47d4c2-caca-46dd-b761-e865026dbdb9-trusted-ca\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\"" Apr 16 18:04:27.010885 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:27.010648 2564 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-certificates\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\"" Apr 16 18:04:27.010885 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:27.010657 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b92kq\" (UniqueName: \"kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-kube-api-access-b92kq\") on node \"ip-10-0-131-193.ec2.internal\" 
DevicePath \"\"" Apr 16 18:04:27.010885 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:27.010666 2564 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5b47d4c2-caca-46dd-b761-e865026dbdb9-registry-tls\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\"" Apr 16 18:04:27.010885 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:27.010678 2564 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5b47d4c2-caca-46dd-b761-e865026dbdb9-ca-trust-extracted\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\"" Apr 16 18:04:27.624573 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:27.624531 2564 generic.go:358] "Generic (PLEG): container finished" podID="5b47d4c2-caca-46dd-b761-e865026dbdb9" containerID="87aeb85876ec9bfbc27f61981b6a36379cd44f07869d0f7f6a969bd4aa76d8d2" exitCode=0 Apr 16 18:04:27.625034 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:27.624604 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" event={"ID":"5b47d4c2-caca-46dd-b761-e865026dbdb9","Type":"ContainerDied","Data":"87aeb85876ec9bfbc27f61981b6a36379cd44f07869d0f7f6a969bd4aa76d8d2"} Apr 16 18:04:27.625034 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:27.624653 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" event={"ID":"5b47d4c2-caca-46dd-b761-e865026dbdb9","Type":"ContainerDied","Data":"f0d87a45cba9d0b73a37d4b575c800e359c98f4cf68b363743f5520733e8b4ee"} Apr 16 18:04:27.625034 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:27.624684 2564 scope.go:117] "RemoveContainer" containerID="87aeb85876ec9bfbc27f61981b6a36379cd44f07869d0f7f6a969bd4aa76d8d2" Apr 16 18:04:27.625034 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:27.624875 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5d459db89c-vh5fb" Apr 16 18:04:27.638248 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:27.638227 2564 scope.go:117] "RemoveContainer" containerID="87aeb85876ec9bfbc27f61981b6a36379cd44f07869d0f7f6a969bd4aa76d8d2" Apr 16 18:04:27.638556 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:04:27.638525 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87aeb85876ec9bfbc27f61981b6a36379cd44f07869d0f7f6a969bd4aa76d8d2\": container with ID starting with 87aeb85876ec9bfbc27f61981b6a36379cd44f07869d0f7f6a969bd4aa76d8d2 not found: ID does not exist" containerID="87aeb85876ec9bfbc27f61981b6a36379cd44f07869d0f7f6a969bd4aa76d8d2" Apr 16 18:04:27.638680 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:27.638564 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87aeb85876ec9bfbc27f61981b6a36379cd44f07869d0f7f6a969bd4aa76d8d2"} err="failed to get container status \"87aeb85876ec9bfbc27f61981b6a36379cd44f07869d0f7f6a969bd4aa76d8d2\": rpc error: code = NotFound desc = could not find container \"87aeb85876ec9bfbc27f61981b6a36379cd44f07869d0f7f6a969bd4aa76d8d2\": container with ID starting with 87aeb85876ec9bfbc27f61981b6a36379cd44f07869d0f7f6a969bd4aa76d8d2 not found: ID does not exist" Apr 16 18:04:27.651497 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:27.651470 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5d459db89c-vh5fb"] Apr 16 18:04:27.658884 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:27.658862 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5d459db89c-vh5fb"] Apr 16 18:04:28.035833 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:28.035793 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5b44bdbc6c-bgbcg"] Apr 16 18:04:28.193771 
ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:28.193730 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b47d4c2-caca-46dd-b761-e865026dbdb9" path="/var/lib/kubelet/pods/5b47d4c2-caca-46dd-b761-e865026dbdb9/volumes" Apr 16 18:04:53.059901 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.059838 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5b44bdbc6c-bgbcg" podUID="c72c4323-89ec-4eb5-8ec0-5ec439d13c7e" containerName="console" containerID="cri-o://a9bcfd1ca7bf0f5e4e374bef1440f0cec10164763689b9afed3fdf79000a39fb" gracePeriod=15 Apr 16 18:04:53.329863 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.329841 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b44bdbc6c-bgbcg_c72c4323-89ec-4eb5-8ec0-5ec439d13c7e/console/0.log" Apr 16 18:04:53.329980 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.329900 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5b44bdbc6c-bgbcg" Apr 16 18:04:53.399955 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.399922 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-console-config\") pod \"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\" (UID: \"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\") " Apr 16 18:04:53.399955 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.399960 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwsjb\" (UniqueName: \"kubernetes.io/projected/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-kube-api-access-jwsjb\") pod \"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\" (UID: \"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\") " Apr 16 18:04:53.400181 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.399980 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-console-oauth-config\") pod \"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\" (UID: \"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\") " Apr 16 18:04:53.400181 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.400034 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-console-serving-cert\") pod \"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\" (UID: \"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\") " Apr 16 18:04:53.400181 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.400073 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-service-ca\") pod \"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\" (UID: \"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\") " Apr 16 18:04:53.400181 
ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.400097 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-oauth-serving-cert\") pod \"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\" (UID: \"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e\") " Apr 16 18:04:53.400544 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.400505 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-service-ca" (OuterVolumeSpecName: "service-ca") pod "c72c4323-89ec-4eb5-8ec0-5ec439d13c7e" (UID: "c72c4323-89ec-4eb5-8ec0-5ec439d13c7e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:04:53.400544 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.400514 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-console-config" (OuterVolumeSpecName: "console-config") pod "c72c4323-89ec-4eb5-8ec0-5ec439d13c7e" (UID: "c72c4323-89ec-4eb5-8ec0-5ec439d13c7e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:04:53.400544 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.400523 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c72c4323-89ec-4eb5-8ec0-5ec439d13c7e" (UID: "c72c4323-89ec-4eb5-8ec0-5ec439d13c7e"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:04:53.402194 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.402170 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c72c4323-89ec-4eb5-8ec0-5ec439d13c7e" (UID: "c72c4323-89ec-4eb5-8ec0-5ec439d13c7e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:04:53.402277 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.402195 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c72c4323-89ec-4eb5-8ec0-5ec439d13c7e" (UID: "c72c4323-89ec-4eb5-8ec0-5ec439d13c7e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:04:53.402315 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.402276 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-kube-api-access-jwsjb" (OuterVolumeSpecName: "kube-api-access-jwsjb") pod "c72c4323-89ec-4eb5-8ec0-5ec439d13c7e" (UID: "c72c4323-89ec-4eb5-8ec0-5ec439d13c7e"). InnerVolumeSpecName "kube-api-access-jwsjb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:04:53.501404 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.501370 2564 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-console-serving-cert\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\"" Apr 16 18:04:53.501404 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.501398 2564 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-service-ca\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\"" Apr 16 18:04:53.501588 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.501417 2564 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-oauth-serving-cert\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\"" Apr 16 18:04:53.501588 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.501427 2564 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-console-config\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\"" Apr 16 18:04:53.501588 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.501436 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jwsjb\" (UniqueName: \"kubernetes.io/projected/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-kube-api-access-jwsjb\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\"" Apr 16 18:04:53.501588 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.501446 2564 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e-console-oauth-config\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\"" Apr 16 18:04:53.694255 ip-10-0-131-193 
kubenswrapper[2564]: I0416 18:04:53.694226 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b44bdbc6c-bgbcg_c72c4323-89ec-4eb5-8ec0-5ec439d13c7e/console/0.log" Apr 16 18:04:53.694423 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.694271 2564 generic.go:358] "Generic (PLEG): container finished" podID="c72c4323-89ec-4eb5-8ec0-5ec439d13c7e" containerID="a9bcfd1ca7bf0f5e4e374bef1440f0cec10164763689b9afed3fdf79000a39fb" exitCode=2 Apr 16 18:04:53.694423 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.694302 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b44bdbc6c-bgbcg" event={"ID":"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e","Type":"ContainerDied","Data":"a9bcfd1ca7bf0f5e4e374bef1440f0cec10164763689b9afed3fdf79000a39fb"} Apr 16 18:04:53.694423 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.694342 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b44bdbc6c-bgbcg" event={"ID":"c72c4323-89ec-4eb5-8ec0-5ec439d13c7e","Type":"ContainerDied","Data":"33c7dab5aa1fa749e137c7e6518af420fdb4d8387cce862082e8e8fb41e2aced"} Apr 16 18:04:53.694423 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.694343 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5b44bdbc6c-bgbcg"
Apr 16 18:04:53.694423 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.694357 2564 scope.go:117] "RemoveContainer" containerID="a9bcfd1ca7bf0f5e4e374bef1440f0cec10164763689b9afed3fdf79000a39fb"
Apr 16 18:04:53.702519 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.702511 2564 scope.go:117] "RemoveContainer" containerID="a9bcfd1ca7bf0f5e4e374bef1440f0cec10164763689b9afed3fdf79000a39fb"
Apr 16 18:04:53.702796 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:04:53.702776 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9bcfd1ca7bf0f5e4e374bef1440f0cec10164763689b9afed3fdf79000a39fb\": container with ID starting with a9bcfd1ca7bf0f5e4e374bef1440f0cec10164763689b9afed3fdf79000a39fb not found: ID does not exist" containerID="a9bcfd1ca7bf0f5e4e374bef1440f0cec10164763689b9afed3fdf79000a39fb"
Apr 16 18:04:53.702872 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.702804 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9bcfd1ca7bf0f5e4e374bef1440f0cec10164763689b9afed3fdf79000a39fb"} err="failed to get container status \"a9bcfd1ca7bf0f5e4e374bef1440f0cec10164763689b9afed3fdf79000a39fb\": rpc error: code = NotFound desc = could not find container \"a9bcfd1ca7bf0f5e4e374bef1440f0cec10164763689b9afed3fdf79000a39fb\": container with ID starting with a9bcfd1ca7bf0f5e4e374bef1440f0cec10164763689b9afed3fdf79000a39fb not found: ID does not exist"
Apr 16 18:04:53.723005 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.722973 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5b44bdbc6c-bgbcg"]
Apr 16 18:04:53.730918 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:53.730897 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5b44bdbc6c-bgbcg"]
Apr 16 18:04:54.192956 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:54.192920 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c72c4323-89ec-4eb5-8ec0-5ec439d13c7e" path="/var/lib/kubelet/pods/c72c4323-89ec-4eb5-8ec0-5ec439d13c7e/volumes"
Apr 16 18:04:58.610001 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:58.609975 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-z5wsv_76cd3b9c-5d86-460e-a147-c0872c568583/init-textfile/0.log"
Apr 16 18:04:58.811797 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:58.811752 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-z5wsv_76cd3b9c-5d86-460e-a147-c0872c568583/node-exporter/0.log"
Apr 16 18:04:59.012149 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:04:59.012120 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-z5wsv_76cd3b9c-5d86-460e-a147-c0872c568583/kube-rbac-proxy/0.log"
Apr 16 18:05:01.214412 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:01.214380 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-lbk7p_d929efe8-7163-4d2b-96b5-d313e3e8d987/prometheus-operator/0.log"
Apr 16 18:05:01.411050 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:01.411008 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-lbk7p_d929efe8-7163-4d2b-96b5-d313e3e8d987/kube-rbac-proxy/0.log"
Apr 16 18:05:01.611004 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:01.610921 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-h49fn_41432f2c-9579-47bd-9382-9700b101f44d/prometheus-operator-admission-webhook/0.log"
Apr 16 18:05:03.010328 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:03.010297 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-pffzm_4aea7302-70a3-484c-bdf8-4a745df33639/networking-console-plugin/0.log"
Apr 16 18:05:03.811917 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:03.811884 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-9vgql_e4fac77e-76ed-47ba-a915-a3fad29536cf/download-server/0.log"
Apr 16 18:05:04.210454 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:04.210431 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2htvb_5a9c1147-51d3-4765-a356-5a32f7bc0b67/serve-healthcheck-canary/0.log"
Apr 16 18:05:09.861234 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:09.861199 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-76b678f8c-w4zwk"]
Apr 16 18:05:09.861741 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:09.861452 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b47d4c2-caca-46dd-b761-e865026dbdb9" containerName="registry"
Apr 16 18:05:09.861741 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:09.861463 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b47d4c2-caca-46dd-b761-e865026dbdb9" containerName="registry"
Apr 16 18:05:09.861741 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:09.861478 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c72c4323-89ec-4eb5-8ec0-5ec439d13c7e" containerName="console"
Apr 16 18:05:09.861741 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:09.861484 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="c72c4323-89ec-4eb5-8ec0-5ec439d13c7e" containerName="console"
Apr 16 18:05:09.861741 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:09.861521 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b47d4c2-caca-46dd-b761-e865026dbdb9" containerName="registry"
Apr 16 18:05:09.861741 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:09.861530 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="c72c4323-89ec-4eb5-8ec0-5ec439d13c7e" containerName="console"
Apr 16 18:05:09.864477 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:09.864459 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76b678f8c-w4zwk"
Apr 16 18:05:09.868064 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:09.868042 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 18:05:09.868174 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:09.868048 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 18:05:09.868174 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:09.868120 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-bxlzl\""
Apr 16 18:05:09.869824 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:09.869806 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 18:05:09.869961 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:09.869917 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 18:05:09.870030 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:09.869977 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 18:05:09.874115 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:09.874094 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 18:05:09.878217 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:09.878198 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76b678f8c-w4zwk"]
Apr 16 18:05:09.927016 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:09.926986 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4be91063-338a-4590-91c6-e7f1f24f8f5c-service-ca\") pod \"console-76b678f8c-w4zwk\" (UID: \"4be91063-338a-4590-91c6-e7f1f24f8f5c\") " pod="openshift-console/console-76b678f8c-w4zwk"
Apr 16 18:05:09.927016 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:09.927019 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4be91063-338a-4590-91c6-e7f1f24f8f5c-console-oauth-config\") pod \"console-76b678f8c-w4zwk\" (UID: \"4be91063-338a-4590-91c6-e7f1f24f8f5c\") " pod="openshift-console/console-76b678f8c-w4zwk"
Apr 16 18:05:09.927202 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:09.927044 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4be91063-338a-4590-91c6-e7f1f24f8f5c-console-serving-cert\") pod \"console-76b678f8c-w4zwk\" (UID: \"4be91063-338a-4590-91c6-e7f1f24f8f5c\") " pod="openshift-console/console-76b678f8c-w4zwk"
Apr 16 18:05:09.927202 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:09.927097 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4be91063-338a-4590-91c6-e7f1f24f8f5c-console-config\") pod \"console-76b678f8c-w4zwk\" (UID: \"4be91063-338a-4590-91c6-e7f1f24f8f5c\") " pod="openshift-console/console-76b678f8c-w4zwk"
Apr 16 18:05:09.927202 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:09.927131 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4be91063-338a-4590-91c6-e7f1f24f8f5c-oauth-serving-cert\") pod \"console-76b678f8c-w4zwk\" (UID: \"4be91063-338a-4590-91c6-e7f1f24f8f5c\") " pod="openshift-console/console-76b678f8c-w4zwk"
Apr 16 18:05:09.927295 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:09.927193 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qxjq\" (UniqueName: \"kubernetes.io/projected/4be91063-338a-4590-91c6-e7f1f24f8f5c-kube-api-access-7qxjq\") pod \"console-76b678f8c-w4zwk\" (UID: \"4be91063-338a-4590-91c6-e7f1f24f8f5c\") " pod="openshift-console/console-76b678f8c-w4zwk"
Apr 16 18:05:09.927295 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:09.927224 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4be91063-338a-4590-91c6-e7f1f24f8f5c-trusted-ca-bundle\") pod \"console-76b678f8c-w4zwk\" (UID: \"4be91063-338a-4590-91c6-e7f1f24f8f5c\") " pod="openshift-console/console-76b678f8c-w4zwk"
Apr 16 18:05:10.028449 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:10.028405 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qxjq\" (UniqueName: \"kubernetes.io/projected/4be91063-338a-4590-91c6-e7f1f24f8f5c-kube-api-access-7qxjq\") pod \"console-76b678f8c-w4zwk\" (UID: \"4be91063-338a-4590-91c6-e7f1f24f8f5c\") " pod="openshift-console/console-76b678f8c-w4zwk"
Apr 16 18:05:10.028449 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:10.028453 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4be91063-338a-4590-91c6-e7f1f24f8f5c-trusted-ca-bundle\") pod \"console-76b678f8c-w4zwk\" (UID: \"4be91063-338a-4590-91c6-e7f1f24f8f5c\") " pod="openshift-console/console-76b678f8c-w4zwk"
Apr 16 18:05:10.028687 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:10.028493 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4be91063-338a-4590-91c6-e7f1f24f8f5c-service-ca\") pod \"console-76b678f8c-w4zwk\" (UID: \"4be91063-338a-4590-91c6-e7f1f24f8f5c\") " pod="openshift-console/console-76b678f8c-w4zwk"
Apr 16 18:05:10.028687 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:10.028516 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4be91063-338a-4590-91c6-e7f1f24f8f5c-console-oauth-config\") pod \"console-76b678f8c-w4zwk\" (UID: \"4be91063-338a-4590-91c6-e7f1f24f8f5c\") " pod="openshift-console/console-76b678f8c-w4zwk"
Apr 16 18:05:10.028687 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:10.028546 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4be91063-338a-4590-91c6-e7f1f24f8f5c-console-serving-cert\") pod \"console-76b678f8c-w4zwk\" (UID: \"4be91063-338a-4590-91c6-e7f1f24f8f5c\") " pod="openshift-console/console-76b678f8c-w4zwk"
Apr 16 18:05:10.028687 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:10.028580 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4be91063-338a-4590-91c6-e7f1f24f8f5c-console-config\") pod \"console-76b678f8c-w4zwk\" (UID: \"4be91063-338a-4590-91c6-e7f1f24f8f5c\") " pod="openshift-console/console-76b678f8c-w4zwk"
Apr 16 18:05:10.028879 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:10.028729 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4be91063-338a-4590-91c6-e7f1f24f8f5c-oauth-serving-cert\") pod \"console-76b678f8c-w4zwk\" (UID: \"4be91063-338a-4590-91c6-e7f1f24f8f5c\") " pod="openshift-console/console-76b678f8c-w4zwk"
Apr 16 18:05:10.029280 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:10.029252 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4be91063-338a-4590-91c6-e7f1f24f8f5c-console-config\") pod \"console-76b678f8c-w4zwk\" (UID: \"4be91063-338a-4590-91c6-e7f1f24f8f5c\") " pod="openshift-console/console-76b678f8c-w4zwk"
Apr 16 18:05:10.029397 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:10.029313 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4be91063-338a-4590-91c6-e7f1f24f8f5c-service-ca\") pod \"console-76b678f8c-w4zwk\" (UID: \"4be91063-338a-4590-91c6-e7f1f24f8f5c\") " pod="openshift-console/console-76b678f8c-w4zwk"
Apr 16 18:05:10.029440 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:10.029406 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4be91063-338a-4590-91c6-e7f1f24f8f5c-oauth-serving-cert\") pod \"console-76b678f8c-w4zwk\" (UID: \"4be91063-338a-4590-91c6-e7f1f24f8f5c\") " pod="openshift-console/console-76b678f8c-w4zwk"
Apr 16 18:05:10.029582 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:10.029563 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4be91063-338a-4590-91c6-e7f1f24f8f5c-trusted-ca-bundle\") pod \"console-76b678f8c-w4zwk\" (UID: \"4be91063-338a-4590-91c6-e7f1f24f8f5c\") " pod="openshift-console/console-76b678f8c-w4zwk"
Apr 16 18:05:10.031919 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:10.031881 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4be91063-338a-4590-91c6-e7f1f24f8f5c-console-oauth-config\") pod \"console-76b678f8c-w4zwk\" (UID: \"4be91063-338a-4590-91c6-e7f1f24f8f5c\") " pod="openshift-console/console-76b678f8c-w4zwk"
Apr 16 18:05:10.032033 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:10.031929 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4be91063-338a-4590-91c6-e7f1f24f8f5c-console-serving-cert\") pod \"console-76b678f8c-w4zwk\" (UID: \"4be91063-338a-4590-91c6-e7f1f24f8f5c\") " pod="openshift-console/console-76b678f8c-w4zwk"
Apr 16 18:05:10.039747 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:10.039726 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qxjq\" (UniqueName: \"kubernetes.io/projected/4be91063-338a-4590-91c6-e7f1f24f8f5c-kube-api-access-7qxjq\") pod \"console-76b678f8c-w4zwk\" (UID: \"4be91063-338a-4590-91c6-e7f1f24f8f5c\") " pod="openshift-console/console-76b678f8c-w4zwk"
Apr 16 18:05:10.174009 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:10.173910 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76b678f8c-w4zwk"
Apr 16 18:05:10.319547 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:10.319514 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76b678f8c-w4zwk"]
Apr 16 18:05:10.327192 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:05:10.327159 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4be91063_338a_4590_91c6_e7f1f24f8f5c.slice/crio-83064644ef322abb92ebde905e6088727d61740e6dfd3d2a8da28f23b529c52e WatchSource:0}: Error finding container 83064644ef322abb92ebde905e6088727d61740e6dfd3d2a8da28f23b529c52e: Status 404 returned error can't find the container with id 83064644ef322abb92ebde905e6088727d61740e6dfd3d2a8da28f23b529c52e
Apr 16 18:05:10.746890 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:10.746851 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76b678f8c-w4zwk" event={"ID":"4be91063-338a-4590-91c6-e7f1f24f8f5c","Type":"ContainerStarted","Data":"540116d6d83eb17826a18559b94c6f6fd8a9cd0f0ef5d78496f8bf133a8532ff"}
Apr 16 18:05:10.746890 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:10.746892 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76b678f8c-w4zwk" event={"ID":"4be91063-338a-4590-91c6-e7f1f24f8f5c","Type":"ContainerStarted","Data":"83064644ef322abb92ebde905e6088727d61740e6dfd3d2a8da28f23b529c52e"}
Apr 16 18:05:10.779068 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:10.779017 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-76b678f8c-w4zwk" podStartSLOduration=1.779001204 podStartE2EDuration="1.779001204s" podCreationTimestamp="2026-04-16 18:05:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:05:10.778698417 +0000 UTC m=+137.149372325" watchObservedRunningTime="2026-04-16 18:05:10.779001204 +0000 UTC m=+137.149675110"
Apr 16 18:05:20.174832 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:20.174770 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-76b678f8c-w4zwk"
Apr 16 18:05:20.174832 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:20.174836 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-76b678f8c-w4zwk"
Apr 16 18:05:20.179501 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:20.179475 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-76b678f8c-w4zwk"
Apr 16 18:05:20.775830 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:05:20.775802 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-76b678f8c-w4zwk"
Apr 16 18:06:53.375747 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:06:53.375713 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-c6bf65995-f66zj"]
Apr 16 18:06:53.377622 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:06:53.377594 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c6bf65995-f66zj"
Apr 16 18:06:53.395138 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:06:53.395108 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c6bf65995-f66zj"]
Apr 16 18:06:53.484430 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:06:53.484400 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8163b196-f081-4019-94b4-c6f33d33e641-console-oauth-config\") pod \"console-c6bf65995-f66zj\" (UID: \"8163b196-f081-4019-94b4-c6f33d33e641\") " pod="openshift-console/console-c6bf65995-f66zj"
Apr 16 18:06:53.484430 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:06:53.484436 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8163b196-f081-4019-94b4-c6f33d33e641-oauth-serving-cert\") pod \"console-c6bf65995-f66zj\" (UID: \"8163b196-f081-4019-94b4-c6f33d33e641\") " pod="openshift-console/console-c6bf65995-f66zj"
Apr 16 18:06:53.484680 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:06:53.484459 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8163b196-f081-4019-94b4-c6f33d33e641-console-serving-cert\") pod \"console-c6bf65995-f66zj\" (UID: \"8163b196-f081-4019-94b4-c6f33d33e641\") " pod="openshift-console/console-c6bf65995-f66zj"
Apr 16 18:06:53.484680 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:06:53.484532 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8163b196-f081-4019-94b4-c6f33d33e641-console-config\") pod \"console-c6bf65995-f66zj\" (UID: \"8163b196-f081-4019-94b4-c6f33d33e641\") " pod="openshift-console/console-c6bf65995-f66zj"
Apr 16 18:06:53.484680 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:06:53.484589 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v7l6\" (UniqueName: \"kubernetes.io/projected/8163b196-f081-4019-94b4-c6f33d33e641-kube-api-access-7v7l6\") pod \"console-c6bf65995-f66zj\" (UID: \"8163b196-f081-4019-94b4-c6f33d33e641\") " pod="openshift-console/console-c6bf65995-f66zj"
Apr 16 18:06:53.484680 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:06:53.484620 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8163b196-f081-4019-94b4-c6f33d33e641-trusted-ca-bundle\") pod \"console-c6bf65995-f66zj\" (UID: \"8163b196-f081-4019-94b4-c6f33d33e641\") " pod="openshift-console/console-c6bf65995-f66zj"
Apr 16 18:06:53.484680 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:06:53.484663 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8163b196-f081-4019-94b4-c6f33d33e641-service-ca\") pod \"console-c6bf65995-f66zj\" (UID: \"8163b196-f081-4019-94b4-c6f33d33e641\") " pod="openshift-console/console-c6bf65995-f66zj"
Apr 16 18:06:53.585453 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:06:53.585420 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8163b196-f081-4019-94b4-c6f33d33e641-console-config\") pod \"console-c6bf65995-f66zj\" (UID: \"8163b196-f081-4019-94b4-c6f33d33e641\") " pod="openshift-console/console-c6bf65995-f66zj"
Apr 16 18:06:53.585683 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:06:53.585469 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7v7l6\" (UniqueName: \"kubernetes.io/projected/8163b196-f081-4019-94b4-c6f33d33e641-kube-api-access-7v7l6\") pod \"console-c6bf65995-f66zj\" (UID: \"8163b196-f081-4019-94b4-c6f33d33e641\") " pod="openshift-console/console-c6bf65995-f66zj"
Apr 16 18:06:53.585683 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:06:53.585494 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8163b196-f081-4019-94b4-c6f33d33e641-trusted-ca-bundle\") pod \"console-c6bf65995-f66zj\" (UID: \"8163b196-f081-4019-94b4-c6f33d33e641\") " pod="openshift-console/console-c6bf65995-f66zj"
Apr 16 18:06:53.585683 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:06:53.585534 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8163b196-f081-4019-94b4-c6f33d33e641-service-ca\") pod \"console-c6bf65995-f66zj\" (UID: \"8163b196-f081-4019-94b4-c6f33d33e641\") " pod="openshift-console/console-c6bf65995-f66zj"
Apr 16 18:06:53.585683 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:06:53.585567 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8163b196-f081-4019-94b4-c6f33d33e641-console-oauth-config\") pod \"console-c6bf65995-f66zj\" (UID: \"8163b196-f081-4019-94b4-c6f33d33e641\") " pod="openshift-console/console-c6bf65995-f66zj"
Apr 16 18:06:53.585902 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:06:53.585700 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8163b196-f081-4019-94b4-c6f33d33e641-oauth-serving-cert\") pod \"console-c6bf65995-f66zj\" (UID: \"8163b196-f081-4019-94b4-c6f33d33e641\") " pod="openshift-console/console-c6bf65995-f66zj"
Apr 16 18:06:53.585902 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:06:53.585751 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8163b196-f081-4019-94b4-c6f33d33e641-console-serving-cert\") pod \"console-c6bf65995-f66zj\" (UID: \"8163b196-f081-4019-94b4-c6f33d33e641\") " pod="openshift-console/console-c6bf65995-f66zj"
Apr 16 18:06:53.586296 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:06:53.586272 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8163b196-f081-4019-94b4-c6f33d33e641-service-ca\") pod \"console-c6bf65995-f66zj\" (UID: \"8163b196-f081-4019-94b4-c6f33d33e641\") " pod="openshift-console/console-c6bf65995-f66zj"
Apr 16 18:06:53.586382 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:06:53.586277 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8163b196-f081-4019-94b4-c6f33d33e641-console-config\") pod \"console-c6bf65995-f66zj\" (UID: \"8163b196-f081-4019-94b4-c6f33d33e641\") " pod="openshift-console/console-c6bf65995-f66zj"
Apr 16 18:06:53.586434 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:06:53.586377 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8163b196-f081-4019-94b4-c6f33d33e641-oauth-serving-cert\") pod \"console-c6bf65995-f66zj\" (UID: \"8163b196-f081-4019-94b4-c6f33d33e641\") " pod="openshift-console/console-c6bf65995-f66zj"
Apr 16 18:06:53.586434 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:06:53.586410 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8163b196-f081-4019-94b4-c6f33d33e641-trusted-ca-bundle\") pod \"console-c6bf65995-f66zj\" (UID: \"8163b196-f081-4019-94b4-c6f33d33e641\") " pod="openshift-console/console-c6bf65995-f66zj"
Apr 16 18:06:53.588105 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:06:53.588086 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8163b196-f081-4019-94b4-c6f33d33e641-console-serving-cert\") pod \"console-c6bf65995-f66zj\" (UID: \"8163b196-f081-4019-94b4-c6f33d33e641\") " pod="openshift-console/console-c6bf65995-f66zj"
Apr 16 18:06:53.588254 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:06:53.588232 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8163b196-f081-4019-94b4-c6f33d33e641-console-oauth-config\") pod \"console-c6bf65995-f66zj\" (UID: \"8163b196-f081-4019-94b4-c6f33d33e641\") " pod="openshift-console/console-c6bf65995-f66zj"
Apr 16 18:06:53.593658 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:06:53.593638 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v7l6\" (UniqueName: \"kubernetes.io/projected/8163b196-f081-4019-94b4-c6f33d33e641-kube-api-access-7v7l6\") pod \"console-c6bf65995-f66zj\" (UID: \"8163b196-f081-4019-94b4-c6f33d33e641\") " pod="openshift-console/console-c6bf65995-f66zj"
Apr 16 18:06:53.686536 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:06:53.686447 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c6bf65995-f66zj"
Apr 16 18:06:53.804166 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:06:53.804134 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c6bf65995-f66zj"]
Apr 16 18:06:53.807096 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:06:53.807069 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8163b196_f081_4019_94b4_c6f33d33e641.slice/crio-cb25139db46f33270659982a572292b3eeeac8400c083190574f7306ca162dfa WatchSource:0}: Error finding container cb25139db46f33270659982a572292b3eeeac8400c083190574f7306ca162dfa: Status 404 returned error can't find the container with id cb25139db46f33270659982a572292b3eeeac8400c083190574f7306ca162dfa
Apr 16 18:06:54.021907 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:06:54.021872 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c6bf65995-f66zj" event={"ID":"8163b196-f081-4019-94b4-c6f33d33e641","Type":"ContainerStarted","Data":"d0ee0b2026b7ea051114bf5497f680d4a825502d9d3a9a1768e4f35632499bc9"}
Apr 16 18:06:54.021907 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:06:54.021909 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c6bf65995-f66zj" event={"ID":"8163b196-f081-4019-94b4-c6f33d33e641","Type":"ContainerStarted","Data":"cb25139db46f33270659982a572292b3eeeac8400c083190574f7306ca162dfa"}
Apr 16 18:06:54.042291 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:06:54.042249 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c6bf65995-f66zj" podStartSLOduration=1.042235242 podStartE2EDuration="1.042235242s" podCreationTimestamp="2026-04-16 18:06:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:06:54.04083988 +0000 UTC m=+240.411513787" watchObservedRunningTime="2026-04-16 18:06:54.042235242 +0000 UTC m=+240.412909150"
Apr 16 18:07:03.686770 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:03.686732 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-c6bf65995-f66zj"
Apr 16 18:07:03.686770 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:03.686781 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c6bf65995-f66zj"
Apr 16 18:07:03.691707 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:03.691685 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c6bf65995-f66zj"
Apr 16 18:07:04.051126 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:04.051099 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c6bf65995-f66zj"
Apr 16 18:07:04.112861 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:04.112830 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76b678f8c-w4zwk"]
Apr 16 18:07:29.131293 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:29.131249 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-76b678f8c-w4zwk" podUID="4be91063-338a-4590-91c6-e7f1f24f8f5c" containerName="console" containerID="cri-o://540116d6d83eb17826a18559b94c6f6fd8a9cd0f0ef5d78496f8bf133a8532ff" gracePeriod=15
Apr 16 18:07:29.364475 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:29.364449 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76b678f8c-w4zwk_4be91063-338a-4590-91c6-e7f1f24f8f5c/console/0.log"
Apr 16 18:07:29.364603 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:29.364525 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76b678f8c-w4zwk"
Apr 16 18:07:29.463250 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:29.463222 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4be91063-338a-4590-91c6-e7f1f24f8f5c-oauth-serving-cert\") pod \"4be91063-338a-4590-91c6-e7f1f24f8f5c\" (UID: \"4be91063-338a-4590-91c6-e7f1f24f8f5c\") "
Apr 16 18:07:29.463388 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:29.463260 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4be91063-338a-4590-91c6-e7f1f24f8f5c-service-ca\") pod \"4be91063-338a-4590-91c6-e7f1f24f8f5c\" (UID: \"4be91063-338a-4590-91c6-e7f1f24f8f5c\") "
Apr 16 18:07:29.463388 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:29.463289 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4be91063-338a-4590-91c6-e7f1f24f8f5c-console-oauth-config\") pod \"4be91063-338a-4590-91c6-e7f1f24f8f5c\" (UID: \"4be91063-338a-4590-91c6-e7f1f24f8f5c\") "
Apr 16 18:07:29.463388 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:29.463319 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qxjq\" (UniqueName: \"kubernetes.io/projected/4be91063-338a-4590-91c6-e7f1f24f8f5c-kube-api-access-7qxjq\") pod \"4be91063-338a-4590-91c6-e7f1f24f8f5c\" (UID: \"4be91063-338a-4590-91c6-e7f1f24f8f5c\") "
Apr 16 18:07:29.463515 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:29.463465 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4be91063-338a-4590-91c6-e7f1f24f8f5c-trusted-ca-bundle\") pod \"4be91063-338a-4590-91c6-e7f1f24f8f5c\" (UID: \"4be91063-338a-4590-91c6-e7f1f24f8f5c\") "
Apr 16 18:07:29.463570 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:29.463525 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4be91063-338a-4590-91c6-e7f1f24f8f5c-console-config\") pod \"4be91063-338a-4590-91c6-e7f1f24f8f5c\" (UID: \"4be91063-338a-4590-91c6-e7f1f24f8f5c\") "
Apr 16 18:07:29.463645 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:29.463584 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4be91063-338a-4590-91c6-e7f1f24f8f5c-console-serving-cert\") pod \"4be91063-338a-4590-91c6-e7f1f24f8f5c\" (UID: \"4be91063-338a-4590-91c6-e7f1f24f8f5c\") "
Apr 16 18:07:29.463706 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:29.463658 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4be91063-338a-4590-91c6-e7f1f24f8f5c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4be91063-338a-4590-91c6-e7f1f24f8f5c" (UID: "4be91063-338a-4590-91c6-e7f1f24f8f5c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:07:29.463966 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:29.463750 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4be91063-338a-4590-91c6-e7f1f24f8f5c-service-ca" (OuterVolumeSpecName: "service-ca") pod "4be91063-338a-4590-91c6-e7f1f24f8f5c" (UID: "4be91063-338a-4590-91c6-e7f1f24f8f5c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:07:29.463966 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:29.463925 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4be91063-338a-4590-91c6-e7f1f24f8f5c-console-config" (OuterVolumeSpecName: "console-config") pod "4be91063-338a-4590-91c6-e7f1f24f8f5c" (UID: "4be91063-338a-4590-91c6-e7f1f24f8f5c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:07:29.463966 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:29.463922 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4be91063-338a-4590-91c6-e7f1f24f8f5c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4be91063-338a-4590-91c6-e7f1f24f8f5c" (UID: "4be91063-338a-4590-91c6-e7f1f24f8f5c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:07:29.463966 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:29.463952 2564 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4be91063-338a-4590-91c6-e7f1f24f8f5c-oauth-serving-cert\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\""
Apr 16 18:07:29.463966 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:29.463972 2564 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4be91063-338a-4590-91c6-e7f1f24f8f5c-service-ca\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\""
Apr 16 18:07:29.465646 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:29.465599 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4be91063-338a-4590-91c6-e7f1f24f8f5c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4be91063-338a-4590-91c6-e7f1f24f8f5c" (UID: "4be91063-338a-4590-91c6-e7f1f24f8f5c").
InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:07:29.465989 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:29.465967 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4be91063-338a-4590-91c6-e7f1f24f8f5c-kube-api-access-7qxjq" (OuterVolumeSpecName: "kube-api-access-7qxjq") pod "4be91063-338a-4590-91c6-e7f1f24f8f5c" (UID: "4be91063-338a-4590-91c6-e7f1f24f8f5c"). InnerVolumeSpecName "kube-api-access-7qxjq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:07:29.465989 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:29.465974 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4be91063-338a-4590-91c6-e7f1f24f8f5c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4be91063-338a-4590-91c6-e7f1f24f8f5c" (UID: "4be91063-338a-4590-91c6-e7f1f24f8f5c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:07:29.564319 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:29.564269 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7qxjq\" (UniqueName: \"kubernetes.io/projected/4be91063-338a-4590-91c6-e7f1f24f8f5c-kube-api-access-7qxjq\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\"" Apr 16 18:07:29.564319 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:29.564313 2564 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4be91063-338a-4590-91c6-e7f1f24f8f5c-trusted-ca-bundle\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\"" Apr 16 18:07:29.564319 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:29.564328 2564 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4be91063-338a-4590-91c6-e7f1f24f8f5c-console-config\") on node \"ip-10-0-131-193.ec2.internal\" 
DevicePath \"\"" Apr 16 18:07:29.564537 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:29.564341 2564 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4be91063-338a-4590-91c6-e7f1f24f8f5c-console-serving-cert\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\"" Apr 16 18:07:29.564537 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:29.564356 2564 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4be91063-338a-4590-91c6-e7f1f24f8f5c-console-oauth-config\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\"" Apr 16 18:07:30.120241 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:30.120157 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76b678f8c-w4zwk_4be91063-338a-4590-91c6-e7f1f24f8f5c/console/0.log" Apr 16 18:07:30.120241 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:30.120194 2564 generic.go:358] "Generic (PLEG): container finished" podID="4be91063-338a-4590-91c6-e7f1f24f8f5c" containerID="540116d6d83eb17826a18559b94c6f6fd8a9cd0f0ef5d78496f8bf133a8532ff" exitCode=2 Apr 16 18:07:30.120452 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:30.120242 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76b678f8c-w4zwk" event={"ID":"4be91063-338a-4590-91c6-e7f1f24f8f5c","Type":"ContainerDied","Data":"540116d6d83eb17826a18559b94c6f6fd8a9cd0f0ef5d78496f8bf133a8532ff"} Apr 16 18:07:30.120452 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:30.120264 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76b678f8c-w4zwk" event={"ID":"4be91063-338a-4590-91c6-e7f1f24f8f5c","Type":"ContainerDied","Data":"83064644ef322abb92ebde905e6088727d61740e6dfd3d2a8da28f23b529c52e"} Apr 16 18:07:30.120452 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:30.120288 2564 scope.go:117] "RemoveContainer" 
containerID="540116d6d83eb17826a18559b94c6f6fd8a9cd0f0ef5d78496f8bf133a8532ff" Apr 16 18:07:30.120452 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:30.120299 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76b678f8c-w4zwk" Apr 16 18:07:30.128510 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:30.128419 2564 scope.go:117] "RemoveContainer" containerID="540116d6d83eb17826a18559b94c6f6fd8a9cd0f0ef5d78496f8bf133a8532ff" Apr 16 18:07:30.128765 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:07:30.128738 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"540116d6d83eb17826a18559b94c6f6fd8a9cd0f0ef5d78496f8bf133a8532ff\": container with ID starting with 540116d6d83eb17826a18559b94c6f6fd8a9cd0f0ef5d78496f8bf133a8532ff not found: ID does not exist" containerID="540116d6d83eb17826a18559b94c6f6fd8a9cd0f0ef5d78496f8bf133a8532ff" Apr 16 18:07:30.128851 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:30.128777 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"540116d6d83eb17826a18559b94c6f6fd8a9cd0f0ef5d78496f8bf133a8532ff"} err="failed to get container status \"540116d6d83eb17826a18559b94c6f6fd8a9cd0f0ef5d78496f8bf133a8532ff\": rpc error: code = NotFound desc = could not find container \"540116d6d83eb17826a18559b94c6f6fd8a9cd0f0ef5d78496f8bf133a8532ff\": container with ID starting with 540116d6d83eb17826a18559b94c6f6fd8a9cd0f0ef5d78496f8bf133a8532ff not found: ID does not exist" Apr 16 18:07:30.145878 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:30.145853 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76b678f8c-w4zwk"] Apr 16 18:07:30.154969 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:30.154948 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-76b678f8c-w4zwk"] Apr 16 18:07:30.191869 ip-10-0-131-193 
kubenswrapper[2564]: I0416 18:07:30.191841 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4be91063-338a-4590-91c6-e7f1f24f8f5c" path="/var/lib/kubelet/pods/4be91063-338a-4590-91c6-e7f1f24f8f5c/volumes" Apr 16 18:07:54.063663 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:54.063630 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log" Apr 16 18:07:54.064195 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:54.063638 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log" Apr 16 18:07:54.072070 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:07:54.071940 2564 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 18:08:45.069240 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:45.069164 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm"] Apr 16 18:08:45.069714 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:45.069487 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4be91063-338a-4590-91c6-e7f1f24f8f5c" containerName="console" Apr 16 18:08:45.069714 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:45.069502 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="4be91063-338a-4590-91c6-e7f1f24f8f5c" containerName="console" Apr 16 18:08:45.069714 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:45.069557 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="4be91063-338a-4590-91c6-e7f1f24f8f5c" containerName="console" Apr 16 18:08:45.071335 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:45.071320 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm" Apr 16 18:08:45.074159 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:45.074133 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 18:08:45.074265 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:45.074171 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 18:08:45.074265 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:45.074133 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-vjzm2\"" Apr 16 18:08:45.083222 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:45.083201 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm"] Apr 16 18:08:45.117113 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:45.117077 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/632c0959-6046-4b82-9aba-4a1c7300406c-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm\" (UID: \"632c0959-6046-4b82-9aba-4a1c7300406c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm" Apr 16 18:08:45.117113 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:45.117116 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rld68\" (UniqueName: \"kubernetes.io/projected/632c0959-6046-4b82-9aba-4a1c7300406c-kube-api-access-rld68\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm\" (UID: \"632c0959-6046-4b82-9aba-4a1c7300406c\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm" Apr 16 18:08:45.117332 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:45.117144 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/632c0959-6046-4b82-9aba-4a1c7300406c-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm\" (UID: \"632c0959-6046-4b82-9aba-4a1c7300406c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm" Apr 16 18:08:45.217870 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:45.217835 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rld68\" (UniqueName: \"kubernetes.io/projected/632c0959-6046-4b82-9aba-4a1c7300406c-kube-api-access-rld68\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm\" (UID: \"632c0959-6046-4b82-9aba-4a1c7300406c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm" Apr 16 18:08:45.218042 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:45.217879 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/632c0959-6046-4b82-9aba-4a1c7300406c-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm\" (UID: \"632c0959-6046-4b82-9aba-4a1c7300406c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm" Apr 16 18:08:45.218042 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:45.217938 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/632c0959-6046-4b82-9aba-4a1c7300406c-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm\" (UID: \"632c0959-6046-4b82-9aba-4a1c7300406c\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm" Apr 16 18:08:45.218274 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:45.218254 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/632c0959-6046-4b82-9aba-4a1c7300406c-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm\" (UID: \"632c0959-6046-4b82-9aba-4a1c7300406c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm" Apr 16 18:08:45.218327 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:45.218290 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/632c0959-6046-4b82-9aba-4a1c7300406c-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm\" (UID: \"632c0959-6046-4b82-9aba-4a1c7300406c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm" Apr 16 18:08:45.226983 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:45.226956 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rld68\" (UniqueName: \"kubernetes.io/projected/632c0959-6046-4b82-9aba-4a1c7300406c-kube-api-access-rld68\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm\" (UID: \"632c0959-6046-4b82-9aba-4a1c7300406c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm" Apr 16 18:08:45.380922 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:45.380839 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm" Apr 16 18:08:45.500237 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:45.500214 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm"] Apr 16 18:08:45.502091 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:08:45.502056 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod632c0959_6046_4b82_9aba_4a1c7300406c.slice/crio-77271cc95d412ce3d46fedd8109abc2b356a330470022a9238af467b0df4cbd7 WatchSource:0}: Error finding container 77271cc95d412ce3d46fedd8109abc2b356a330470022a9238af467b0df4cbd7: Status 404 returned error can't find the container with id 77271cc95d412ce3d46fedd8109abc2b356a330470022a9238af467b0df4cbd7 Apr 16 18:08:45.503937 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:45.503921 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:08:46.319847 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:46.319807 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm" event={"ID":"632c0959-6046-4b82-9aba-4a1c7300406c","Type":"ContainerStarted","Data":"77271cc95d412ce3d46fedd8109abc2b356a330470022a9238af467b0df4cbd7"} Apr 16 18:08:50.332048 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:50.332013 2564 generic.go:358] "Generic (PLEG): container finished" podID="632c0959-6046-4b82-9aba-4a1c7300406c" containerID="cb4695583e52f0f6cf9f0e118898f0731c41a437f6d4d6f43b936c8fd036c199" exitCode=0 Apr 16 18:08:50.332397 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:50.332099 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm" 
event={"ID":"632c0959-6046-4b82-9aba-4a1c7300406c","Type":"ContainerDied","Data":"cb4695583e52f0f6cf9f0e118898f0731c41a437f6d4d6f43b936c8fd036c199"} Apr 16 18:08:52.340076 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:52.340044 2564 generic.go:358] "Generic (PLEG): container finished" podID="632c0959-6046-4b82-9aba-4a1c7300406c" containerID="b429b4e40d6c811961e365c18353385a4ededc86856904b36240c3aadad2392f" exitCode=0 Apr 16 18:08:52.340433 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:52.340104 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm" event={"ID":"632c0959-6046-4b82-9aba-4a1c7300406c","Type":"ContainerDied","Data":"b429b4e40d6c811961e365c18353385a4ededc86856904b36240c3aadad2392f"} Apr 16 18:08:58.359675 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:58.359644 2564 generic.go:358] "Generic (PLEG): container finished" podID="632c0959-6046-4b82-9aba-4a1c7300406c" containerID="82abedd565ef9e9bbc71dcc079930ec099d25bf24dd30b722ca5c2ad909db86c" exitCode=0 Apr 16 18:08:58.360036 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:58.359696 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm" event={"ID":"632c0959-6046-4b82-9aba-4a1c7300406c","Type":"ContainerDied","Data":"82abedd565ef9e9bbc71dcc079930ec099d25bf24dd30b722ca5c2ad909db86c"} Apr 16 18:08:59.476236 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:59.476215 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm" Apr 16 18:08:59.538831 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:59.538801 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/632c0959-6046-4b82-9aba-4a1c7300406c-bundle\") pod \"632c0959-6046-4b82-9aba-4a1c7300406c\" (UID: \"632c0959-6046-4b82-9aba-4a1c7300406c\") " Apr 16 18:08:59.538995 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:59.538858 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/632c0959-6046-4b82-9aba-4a1c7300406c-util\") pod \"632c0959-6046-4b82-9aba-4a1c7300406c\" (UID: \"632c0959-6046-4b82-9aba-4a1c7300406c\") " Apr 16 18:08:59.538995 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:59.538903 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rld68\" (UniqueName: \"kubernetes.io/projected/632c0959-6046-4b82-9aba-4a1c7300406c-kube-api-access-rld68\") pod \"632c0959-6046-4b82-9aba-4a1c7300406c\" (UID: \"632c0959-6046-4b82-9aba-4a1c7300406c\") " Apr 16 18:08:59.539352 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:59.539328 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/632c0959-6046-4b82-9aba-4a1c7300406c-bundle" (OuterVolumeSpecName: "bundle") pod "632c0959-6046-4b82-9aba-4a1c7300406c" (UID: "632c0959-6046-4b82-9aba-4a1c7300406c"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:08:59.541097 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:59.541069 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/632c0959-6046-4b82-9aba-4a1c7300406c-kube-api-access-rld68" (OuterVolumeSpecName: "kube-api-access-rld68") pod "632c0959-6046-4b82-9aba-4a1c7300406c" (UID: "632c0959-6046-4b82-9aba-4a1c7300406c"). InnerVolumeSpecName "kube-api-access-rld68". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:08:59.544006 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:59.543979 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/632c0959-6046-4b82-9aba-4a1c7300406c-util" (OuterVolumeSpecName: "util") pod "632c0959-6046-4b82-9aba-4a1c7300406c" (UID: "632c0959-6046-4b82-9aba-4a1c7300406c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:08:59.639846 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:59.639760 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/632c0959-6046-4b82-9aba-4a1c7300406c-bundle\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\"" Apr 16 18:08:59.639846 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:59.639793 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/632c0959-6046-4b82-9aba-4a1c7300406c-util\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\"" Apr 16 18:08:59.639846 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:08:59.639806 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rld68\" (UniqueName: \"kubernetes.io/projected/632c0959-6046-4b82-9aba-4a1c7300406c-kube-api-access-rld68\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\"" Apr 16 18:09:00.366887 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:00.366857 2564 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm" Apr 16 18:09:00.367070 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:00.366856 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2skpm" event={"ID":"632c0959-6046-4b82-9aba-4a1c7300406c","Type":"ContainerDied","Data":"77271cc95d412ce3d46fedd8109abc2b356a330470022a9238af467b0df4cbd7"} Apr 16 18:09:00.367070 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:00.366963 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77271cc95d412ce3d46fedd8109abc2b356a330470022a9238af467b0df4cbd7" Apr 16 18:09:07.105579 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:07.105544 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96jx9"] Apr 16 18:09:07.106063 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:07.105805 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="632c0959-6046-4b82-9aba-4a1c7300406c" containerName="extract" Apr 16 18:09:07.106063 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:07.105816 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="632c0959-6046-4b82-9aba-4a1c7300406c" containerName="extract" Apr 16 18:09:07.106063 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:07.105825 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="632c0959-6046-4b82-9aba-4a1c7300406c" containerName="pull" Apr 16 18:09:07.106063 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:07.105831 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="632c0959-6046-4b82-9aba-4a1c7300406c" containerName="pull" Apr 16 18:09:07.106063 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:07.105845 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="632c0959-6046-4b82-9aba-4a1c7300406c" containerName="util" Apr 16 18:09:07.106063 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:07.105850 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="632c0959-6046-4b82-9aba-4a1c7300406c" containerName="util" Apr 16 18:09:07.106063 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:07.105906 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="632c0959-6046-4b82-9aba-4a1c7300406c" containerName="extract" Apr 16 18:09:07.112324 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:07.112307 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96jx9" Apr 16 18:09:07.115856 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:07.115832 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 18:09:07.116059 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:07.116040 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-ccf97\"" Apr 16 18:09:07.116158 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:07.116102 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 18:09:07.116556 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:07.116543 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 18:09:07.133065 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:07.133042 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96jx9"] Apr 16 18:09:07.197085 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:07.197048 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/secret/aea37cc6-303d-4941-8182-debdb9142d71-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-96jx9\" (UID: \"aea37cc6-303d-4941-8182-debdb9142d71\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96jx9" Apr 16 18:09:07.197251 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:07.197096 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pnzs\" (UniqueName: \"kubernetes.io/projected/aea37cc6-303d-4941-8182-debdb9142d71-kube-api-access-4pnzs\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-96jx9\" (UID: \"aea37cc6-303d-4941-8182-debdb9142d71\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96jx9" Apr 16 18:09:07.298260 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:07.298224 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/aea37cc6-303d-4941-8182-debdb9142d71-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-96jx9\" (UID: \"aea37cc6-303d-4941-8182-debdb9142d71\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96jx9" Apr 16 18:09:07.298381 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:07.298291 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4pnzs\" (UniqueName: \"kubernetes.io/projected/aea37cc6-303d-4941-8182-debdb9142d71-kube-api-access-4pnzs\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-96jx9\" (UID: \"aea37cc6-303d-4941-8182-debdb9142d71\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96jx9" Apr 16 18:09:07.300603 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:07.300573 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/aea37cc6-303d-4941-8182-debdb9142d71-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-96jx9\" (UID: 
\"aea37cc6-303d-4941-8182-debdb9142d71\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96jx9" Apr 16 18:09:07.307208 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:07.307180 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pnzs\" (UniqueName: \"kubernetes.io/projected/aea37cc6-303d-4941-8182-debdb9142d71-kube-api-access-4pnzs\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-96jx9\" (UID: \"aea37cc6-303d-4941-8182-debdb9142d71\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96jx9" Apr 16 18:09:07.421532 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:07.421433 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96jx9" Apr 16 18:09:07.543323 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:07.543301 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96jx9"] Apr 16 18:09:07.545650 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:09:07.545625 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaea37cc6_303d_4941_8182_debdb9142d71.slice/crio-662caa4e701015f1d93dc00230cb20864b4cb6c91db00e727fe7a2bf154c626c WatchSource:0}: Error finding container 662caa4e701015f1d93dc00230cb20864b4cb6c91db00e727fe7a2bf154c626c: Status 404 returned error can't find the container with id 662caa4e701015f1d93dc00230cb20864b4cb6c91db00e727fe7a2bf154c626c Apr 16 18:09:08.389900 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:08.389856 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96jx9" event={"ID":"aea37cc6-303d-4941-8182-debdb9142d71","Type":"ContainerStarted","Data":"662caa4e701015f1d93dc00230cb20864b4cb6c91db00e727fe7a2bf154c626c"} Apr 16 18:09:11.229219 ip-10-0-131-193 kubenswrapper[2564]: 
I0416 18:09:11.229184 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-f8mdk"] Apr 16 18:09:11.232305 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.232281 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-f8mdk" Apr 16 18:09:11.234862 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.234840 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 18:09:11.234862 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.234852 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-59gck\"" Apr 16 18:09:11.235012 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.234842 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 16 18:09:11.241139 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.241116 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-f8mdk"] Apr 16 18:09:11.333619 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.333587 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/47a53de2-a20e-485a-b17c-c9b1ea16e963-certificates\") pod \"keda-operator-ffbb595cb-f8mdk\" (UID: \"47a53de2-a20e-485a-b17c-c9b1ea16e963\") " pod="openshift-keda/keda-operator-ffbb595cb-f8mdk" Apr 16 18:09:11.333808 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.333649 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2p98\" (UniqueName: \"kubernetes.io/projected/47a53de2-a20e-485a-b17c-c9b1ea16e963-kube-api-access-x2p98\") pod \"keda-operator-ffbb595cb-f8mdk\" (UID: \"47a53de2-a20e-485a-b17c-c9b1ea16e963\") " 
pod="openshift-keda/keda-operator-ffbb595cb-f8mdk" Apr 16 18:09:11.333808 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.333745 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/47a53de2-a20e-485a-b17c-c9b1ea16e963-cabundle0\") pod \"keda-operator-ffbb595cb-f8mdk\" (UID: \"47a53de2-a20e-485a-b17c-c9b1ea16e963\") " pod="openshift-keda/keda-operator-ffbb595cb-f8mdk" Apr 16 18:09:11.401654 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.401596 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96jx9" event={"ID":"aea37cc6-303d-4941-8182-debdb9142d71","Type":"ContainerStarted","Data":"6441bf364b09c0eb6c375d1a6f984e9bf063b0c83a0d3560ddb44086080306e9"} Apr 16 18:09:11.401826 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.401725 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96jx9" Apr 16 18:09:11.423497 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.423446 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96jx9" podStartSLOduration=1.277273145 podStartE2EDuration="4.423427491s" podCreationTimestamp="2026-04-16 18:09:07 +0000 UTC" firstStartedPulling="2026-04-16 18:09:07.54739574 +0000 UTC m=+373.918069626" lastFinishedPulling="2026-04-16 18:09:10.693550083 +0000 UTC m=+377.064223972" observedRunningTime="2026-04-16 18:09:11.422748533 +0000 UTC m=+377.793422439" watchObservedRunningTime="2026-04-16 18:09:11.423427491 +0000 UTC m=+377.794101399" Apr 16 18:09:11.434188 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.434152 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2p98\" (UniqueName: 
\"kubernetes.io/projected/47a53de2-a20e-485a-b17c-c9b1ea16e963-kube-api-access-x2p98\") pod \"keda-operator-ffbb595cb-f8mdk\" (UID: \"47a53de2-a20e-485a-b17c-c9b1ea16e963\") " pod="openshift-keda/keda-operator-ffbb595cb-f8mdk" Apr 16 18:09:11.434355 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.434205 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/47a53de2-a20e-485a-b17c-c9b1ea16e963-cabundle0\") pod \"keda-operator-ffbb595cb-f8mdk\" (UID: \"47a53de2-a20e-485a-b17c-c9b1ea16e963\") " pod="openshift-keda/keda-operator-ffbb595cb-f8mdk" Apr 16 18:09:11.434355 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.434279 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/47a53de2-a20e-485a-b17c-c9b1ea16e963-certificates\") pod \"keda-operator-ffbb595cb-f8mdk\" (UID: \"47a53de2-a20e-485a-b17c-c9b1ea16e963\") " pod="openshift-keda/keda-operator-ffbb595cb-f8mdk" Apr 16 18:09:11.434465 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:11.434397 2564 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 16 18:09:11.434465 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:11.434415 2564 secret.go:281] references non-existent secret key: ca.crt Apr 16 18:09:11.434465 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:11.434426 2564 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 18:09:11.434465 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:11.434441 2564 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-f8mdk: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 16 18:09:11.434695 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:11.434505 2564 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/47a53de2-a20e-485a-b17c-c9b1ea16e963-certificates podName:47a53de2-a20e-485a-b17c-c9b1ea16e963 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:11.934485747 +0000 UTC m=+378.305159640 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/47a53de2-a20e-485a-b17c-c9b1ea16e963-certificates") pod "keda-operator-ffbb595cb-f8mdk" (UID: "47a53de2-a20e-485a-b17c-c9b1ea16e963") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 16 18:09:11.435530 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.435325 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/47a53de2-a20e-485a-b17c-c9b1ea16e963-cabundle0\") pod \"keda-operator-ffbb595cb-f8mdk\" (UID: \"47a53de2-a20e-485a-b17c-c9b1ea16e963\") " pod="openshift-keda/keda-operator-ffbb595cb-f8mdk" Apr 16 18:09:11.446176 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.446151 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2p98\" (UniqueName: \"kubernetes.io/projected/47a53de2-a20e-485a-b17c-c9b1ea16e963-kube-api-access-x2p98\") pod \"keda-operator-ffbb595cb-f8mdk\" (UID: \"47a53de2-a20e-485a-b17c-c9b1ea16e963\") " pod="openshift-keda/keda-operator-ffbb595cb-f8mdk" Apr 16 18:09:11.506934 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.506761 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-kvp2n"] Apr 16 18:09:11.510366 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.510343 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kvp2n" Apr 16 18:09:11.513587 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.513567 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 16 18:09:11.540600 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.540576 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-kvp2n"] Apr 16 18:09:11.636456 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.636410 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/30fca5e7-9feb-49e8-bda1-9054c60cc8f5-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-kvp2n\" (UID: \"30fca5e7-9feb-49e8-bda1-9054c60cc8f5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kvp2n" Apr 16 18:09:11.636632 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.636476 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxrc2\" (UniqueName: \"kubernetes.io/projected/30fca5e7-9feb-49e8-bda1-9054c60cc8f5-kube-api-access-mxrc2\") pod \"keda-metrics-apiserver-7c9f485588-kvp2n\" (UID: \"30fca5e7-9feb-49e8-bda1-9054c60cc8f5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kvp2n" Apr 16 18:09:11.636632 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.636586 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/30fca5e7-9feb-49e8-bda1-9054c60cc8f5-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kvp2n\" (UID: \"30fca5e7-9feb-49e8-bda1-9054c60cc8f5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kvp2n" Apr 16 18:09:11.699153 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.699113 2564 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-keda/keda-admission-cf49989db-78ncd"] Apr 16 18:09:11.702534 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.702509 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-78ncd" Apr 16 18:09:11.705054 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.705034 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 16 18:09:11.711276 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.711256 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-78ncd"] Apr 16 18:09:11.737198 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.737173 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxrc2\" (UniqueName: \"kubernetes.io/projected/30fca5e7-9feb-49e8-bda1-9054c60cc8f5-kube-api-access-mxrc2\") pod \"keda-metrics-apiserver-7c9f485588-kvp2n\" (UID: \"30fca5e7-9feb-49e8-bda1-9054c60cc8f5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kvp2n" Apr 16 18:09:11.737334 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.737216 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/30fca5e7-9feb-49e8-bda1-9054c60cc8f5-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kvp2n\" (UID: \"30fca5e7-9feb-49e8-bda1-9054c60cc8f5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kvp2n" Apr 16 18:09:11.737334 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.737283 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/30fca5e7-9feb-49e8-bda1-9054c60cc8f5-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-kvp2n\" (UID: \"30fca5e7-9feb-49e8-bda1-9054c60cc8f5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kvp2n" Apr 16 18:09:11.737434 
ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:11.737423 2564 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:09:11.737488 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:11.737443 2564 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:09:11.737488 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:11.737477 2564 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-kvp2n: references non-existent secret key: tls.crt Apr 16 18:09:11.737584 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:11.737556 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30fca5e7-9feb-49e8-bda1-9054c60cc8f5-certificates podName:30fca5e7-9feb-49e8-bda1-9054c60cc8f5 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:12.237537421 +0000 UTC m=+378.608211310 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/30fca5e7-9feb-49e8-bda1-9054c60cc8f5-certificates") pod "keda-metrics-apiserver-7c9f485588-kvp2n" (UID: "30fca5e7-9feb-49e8-bda1-9054c60cc8f5") : references non-existent secret key: tls.crt Apr 16 18:09:11.737662 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.737605 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/30fca5e7-9feb-49e8-bda1-9054c60cc8f5-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-kvp2n\" (UID: \"30fca5e7-9feb-49e8-bda1-9054c60cc8f5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kvp2n" Apr 16 18:09:11.748408 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.748381 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxrc2\" (UniqueName: \"kubernetes.io/projected/30fca5e7-9feb-49e8-bda1-9054c60cc8f5-kube-api-access-mxrc2\") pod 
\"keda-metrics-apiserver-7c9f485588-kvp2n\" (UID: \"30fca5e7-9feb-49e8-bda1-9054c60cc8f5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kvp2n" Apr 16 18:09:11.837782 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.837676 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ff91b178-6325-45e5-82d2-3ece72ef4d82-certificates\") pod \"keda-admission-cf49989db-78ncd\" (UID: \"ff91b178-6325-45e5-82d2-3ece72ef4d82\") " pod="openshift-keda/keda-admission-cf49989db-78ncd" Apr 16 18:09:11.837782 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.837743 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcs2x\" (UniqueName: \"kubernetes.io/projected/ff91b178-6325-45e5-82d2-3ece72ef4d82-kube-api-access-hcs2x\") pod \"keda-admission-cf49989db-78ncd\" (UID: \"ff91b178-6325-45e5-82d2-3ece72ef4d82\") " pod="openshift-keda/keda-admission-cf49989db-78ncd" Apr 16 18:09:11.938274 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.938221 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/47a53de2-a20e-485a-b17c-c9b1ea16e963-certificates\") pod \"keda-operator-ffbb595cb-f8mdk\" (UID: \"47a53de2-a20e-485a-b17c-c9b1ea16e963\") " pod="openshift-keda/keda-operator-ffbb595cb-f8mdk" Apr 16 18:09:11.938457 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:11.938369 2564 secret.go:281] references non-existent secret key: ca.crt Apr 16 18:09:11.938457 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.938385 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ff91b178-6325-45e5-82d2-3ece72ef4d82-certificates\") pod \"keda-admission-cf49989db-78ncd\" (UID: \"ff91b178-6325-45e5-82d2-3ece72ef4d82\") " 
pod="openshift-keda/keda-admission-cf49989db-78ncd" Apr 16 18:09:11.938457 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.938433 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hcs2x\" (UniqueName: \"kubernetes.io/projected/ff91b178-6325-45e5-82d2-3ece72ef4d82-kube-api-access-hcs2x\") pod \"keda-admission-cf49989db-78ncd\" (UID: \"ff91b178-6325-45e5-82d2-3ece72ef4d82\") " pod="openshift-keda/keda-admission-cf49989db-78ncd" Apr 16 18:09:11.938457 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:11.938390 2564 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 18:09:11.938457 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:11.938459 2564 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-f8mdk: references non-existent secret key: ca.crt Apr 16 18:09:11.938728 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:11.938521 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/47a53de2-a20e-485a-b17c-c9b1ea16e963-certificates podName:47a53de2-a20e-485a-b17c-c9b1ea16e963 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:12.93850238 +0000 UTC m=+379.309176268 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/47a53de2-a20e-485a-b17c-c9b1ea16e963-certificates") pod "keda-operator-ffbb595cb-f8mdk" (UID: "47a53de2-a20e-485a-b17c-c9b1ea16e963") : references non-existent secret key: ca.crt Apr 16 18:09:11.940770 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.940746 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ff91b178-6325-45e5-82d2-3ece72ef4d82-certificates\") pod \"keda-admission-cf49989db-78ncd\" (UID: \"ff91b178-6325-45e5-82d2-3ece72ef4d82\") " pod="openshift-keda/keda-admission-cf49989db-78ncd" Apr 16 18:09:11.950019 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:11.949991 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcs2x\" (UniqueName: \"kubernetes.io/projected/ff91b178-6325-45e5-82d2-3ece72ef4d82-kube-api-access-hcs2x\") pod \"keda-admission-cf49989db-78ncd\" (UID: \"ff91b178-6325-45e5-82d2-3ece72ef4d82\") " pod="openshift-keda/keda-admission-cf49989db-78ncd" Apr 16 18:09:12.012572 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:12.012528 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-78ncd" Apr 16 18:09:12.164454 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:12.164420 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-78ncd"] Apr 16 18:09:12.168043 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:09:12.168017 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff91b178_6325_45e5_82d2_3ece72ef4d82.slice/crio-a14d378f4e61e1cda6e8e3513b7a94dfc433635ba9d6543bd9ebae3fd635944c WatchSource:0}: Error finding container a14d378f4e61e1cda6e8e3513b7a94dfc433635ba9d6543bd9ebae3fd635944c: Status 404 returned error can't find the container with id a14d378f4e61e1cda6e8e3513b7a94dfc433635ba9d6543bd9ebae3fd635944c Apr 16 18:09:12.241504 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:12.241462 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/30fca5e7-9feb-49e8-bda1-9054c60cc8f5-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kvp2n\" (UID: \"30fca5e7-9feb-49e8-bda1-9054c60cc8f5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kvp2n" Apr 16 18:09:12.241972 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:12.241575 2564 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:09:12.241972 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:12.241591 2564 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:09:12.241972 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:12.241622 2564 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-kvp2n: references non-existent secret key: tls.crt Apr 16 18:09:12.241972 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:12.241677 2564 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/30fca5e7-9feb-49e8-bda1-9054c60cc8f5-certificates podName:30fca5e7-9feb-49e8-bda1-9054c60cc8f5 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:13.241657291 +0000 UTC m=+379.612331179 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/30fca5e7-9feb-49e8-bda1-9054c60cc8f5-certificates") pod "keda-metrics-apiserver-7c9f485588-kvp2n" (UID: "30fca5e7-9feb-49e8-bda1-9054c60cc8f5") : references non-existent secret key: tls.crt Apr 16 18:09:12.407433 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:12.407341 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-78ncd" event={"ID":"ff91b178-6325-45e5-82d2-3ece72ef4d82","Type":"ContainerStarted","Data":"a14d378f4e61e1cda6e8e3513b7a94dfc433635ba9d6543bd9ebae3fd635944c"} Apr 16 18:09:12.948667 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:12.948631 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/47a53de2-a20e-485a-b17c-c9b1ea16e963-certificates\") pod \"keda-operator-ffbb595cb-f8mdk\" (UID: \"47a53de2-a20e-485a-b17c-c9b1ea16e963\") " pod="openshift-keda/keda-operator-ffbb595cb-f8mdk" Apr 16 18:09:12.948860 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:12.948783 2564 secret.go:281] references non-existent secret key: ca.crt Apr 16 18:09:12.948860 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:12.948807 2564 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 18:09:12.948860 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:12.948819 2564 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-f8mdk: references non-existent secret key: ca.crt Apr 16 18:09:12.949005 ip-10-0-131-193 kubenswrapper[2564]: E0416 
18:09:12.948885 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/47a53de2-a20e-485a-b17c-c9b1ea16e963-certificates podName:47a53de2-a20e-485a-b17c-c9b1ea16e963 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:14.948867033 +0000 UTC m=+381.319540941 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/47a53de2-a20e-485a-b17c-c9b1ea16e963-certificates") pod "keda-operator-ffbb595cb-f8mdk" (UID: "47a53de2-a20e-485a-b17c-c9b1ea16e963") : references non-existent secret key: ca.crt Apr 16 18:09:13.251587 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:13.251569 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/30fca5e7-9feb-49e8-bda1-9054c60cc8f5-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kvp2n\" (UID: \"30fca5e7-9feb-49e8-bda1-9054c60cc8f5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kvp2n" Apr 16 18:09:13.251853 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:13.251731 2564 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:09:13.251853 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:13.251753 2564 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:09:13.251853 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:13.251780 2564 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-kvp2n: references non-existent secret key: tls.crt Apr 16 18:09:13.251853 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:13.251847 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30fca5e7-9feb-49e8-bda1-9054c60cc8f5-certificates podName:30fca5e7-9feb-49e8-bda1-9054c60cc8f5 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:09:15.25182607 +0000 UTC m=+381.622499956 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/30fca5e7-9feb-49e8-bda1-9054c60cc8f5-certificates") pod "keda-metrics-apiserver-7c9f485588-kvp2n" (UID: "30fca5e7-9feb-49e8-bda1-9054c60cc8f5") : references non-existent secret key: tls.crt Apr 16 18:09:13.411411 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:13.411375 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-78ncd" event={"ID":"ff91b178-6325-45e5-82d2-3ece72ef4d82","Type":"ContainerStarted","Data":"2cf9c9f5f1d8aa93bee2ca547e7ebfafdd28cfe8dd712130838be17754a88586"} Apr 16 18:09:13.411547 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:13.411508 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-78ncd" Apr 16 18:09:13.427920 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:13.427870 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-78ncd" podStartSLOduration=1.357388291 podStartE2EDuration="2.427824932s" podCreationTimestamp="2026-04-16 18:09:11 +0000 UTC" firstStartedPulling="2026-04-16 18:09:12.169713724 +0000 UTC m=+378.540387609" lastFinishedPulling="2026-04-16 18:09:13.240150351 +0000 UTC m=+379.610824250" observedRunningTime="2026-04-16 18:09:13.427676085 +0000 UTC m=+379.798349996" watchObservedRunningTime="2026-04-16 18:09:13.427824932 +0000 UTC m=+379.798498838" Apr 16 18:09:14.963743 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:14.963703 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/47a53de2-a20e-485a-b17c-c9b1ea16e963-certificates\") pod \"keda-operator-ffbb595cb-f8mdk\" (UID: \"47a53de2-a20e-485a-b17c-c9b1ea16e963\") " 
pod="openshift-keda/keda-operator-ffbb595cb-f8mdk" Apr 16 18:09:14.964097 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:14.963862 2564 secret.go:281] references non-existent secret key: ca.crt Apr 16 18:09:14.964097 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:14.963882 2564 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 18:09:14.964097 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:14.963892 2564 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-f8mdk: references non-existent secret key: ca.crt Apr 16 18:09:14.964097 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:14.963945 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/47a53de2-a20e-485a-b17c-c9b1ea16e963-certificates podName:47a53de2-a20e-485a-b17c-c9b1ea16e963 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:18.963931015 +0000 UTC m=+385.334604900 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/47a53de2-a20e-485a-b17c-c9b1ea16e963-certificates") pod "keda-operator-ffbb595cb-f8mdk" (UID: "47a53de2-a20e-485a-b17c-c9b1ea16e963") : references non-existent secret key: ca.crt Apr 16 18:09:15.265929 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:15.265849 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/30fca5e7-9feb-49e8-bda1-9054c60cc8f5-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kvp2n\" (UID: \"30fca5e7-9feb-49e8-bda1-9054c60cc8f5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kvp2n" Apr 16 18:09:15.266056 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:15.265974 2564 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:09:15.266056 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:15.265986 2564 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:09:15.266056 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:15.266004 2564 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-kvp2n: references non-existent secret key: tls.crt Apr 16 18:09:15.266056 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:09:15.266052 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30fca5e7-9feb-49e8-bda1-9054c60cc8f5-certificates podName:30fca5e7-9feb-49e8-bda1-9054c60cc8f5 nodeName:}" failed. No retries permitted until 2026-04-16 18:09:19.266040204 +0000 UTC m=+385.636714089 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/30fca5e7-9feb-49e8-bda1-9054c60cc8f5-certificates") pod "keda-metrics-apiserver-7c9f485588-kvp2n" (UID: "30fca5e7-9feb-49e8-bda1-9054c60cc8f5") : references non-existent secret key: tls.crt Apr 16 18:09:19.001003 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:19.000969 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/47a53de2-a20e-485a-b17c-c9b1ea16e963-certificates\") pod \"keda-operator-ffbb595cb-f8mdk\" (UID: \"47a53de2-a20e-485a-b17c-c9b1ea16e963\") " pod="openshift-keda/keda-operator-ffbb595cb-f8mdk" Apr 16 18:09:19.003317 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:19.003296 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/47a53de2-a20e-485a-b17c-c9b1ea16e963-certificates\") pod \"keda-operator-ffbb595cb-f8mdk\" (UID: \"47a53de2-a20e-485a-b17c-c9b1ea16e963\") " pod="openshift-keda/keda-operator-ffbb595cb-f8mdk" Apr 16 18:09:19.044169 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:19.044131 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-f8mdk" Apr 16 18:09:19.162336 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:19.162315 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-f8mdk"] Apr 16 18:09:19.164868 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:09:19.164843 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47a53de2_a20e_485a_b17c_c9b1ea16e963.slice/crio-6c8991fb60ce3c935036e14ed17b73860c8f6fe7497e9bf04ab53001f0962412 WatchSource:0}: Error finding container 6c8991fb60ce3c935036e14ed17b73860c8f6fe7497e9bf04ab53001f0962412: Status 404 returned error can't find the container with id 6c8991fb60ce3c935036e14ed17b73860c8f6fe7497e9bf04ab53001f0962412 Apr 16 18:09:19.303480 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:19.303384 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/30fca5e7-9feb-49e8-bda1-9054c60cc8f5-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kvp2n\" (UID: \"30fca5e7-9feb-49e8-bda1-9054c60cc8f5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kvp2n" Apr 16 18:09:19.305964 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:19.305944 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/30fca5e7-9feb-49e8-bda1-9054c60cc8f5-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kvp2n\" (UID: \"30fca5e7-9feb-49e8-bda1-9054c60cc8f5\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kvp2n" Apr 16 18:09:19.320726 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:19.320699 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kvp2n" Apr 16 18:09:19.434153 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:19.434066 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-f8mdk" event={"ID":"47a53de2-a20e-485a-b17c-c9b1ea16e963","Type":"ContainerStarted","Data":"6c8991fb60ce3c935036e14ed17b73860c8f6fe7497e9bf04ab53001f0962412"} Apr 16 18:09:19.436085 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:19.436025 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-kvp2n"] Apr 16 18:09:19.438495 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:09:19.438468 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30fca5e7_9feb_49e8_bda1_9054c60cc8f5.slice/crio-1efcd16310980f1edc548187279dc34c3ece76949f3196b6debb3553b5390a0e WatchSource:0}: Error finding container 1efcd16310980f1edc548187279dc34c3ece76949f3196b6debb3553b5390a0e: Status 404 returned error can't find the container with id 1efcd16310980f1edc548187279dc34c3ece76949f3196b6debb3553b5390a0e Apr 16 18:09:20.438897 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:20.438851 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kvp2n" event={"ID":"30fca5e7-9feb-49e8-bda1-9054c60cc8f5","Type":"ContainerStarted","Data":"1efcd16310980f1edc548187279dc34c3ece76949f3196b6debb3553b5390a0e"} Apr 16 18:09:22.446912 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:22.446859 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kvp2n" event={"ID":"30fca5e7-9feb-49e8-bda1-9054c60cc8f5","Type":"ContainerStarted","Data":"15f89a06aabfb2d9ccad8d49fcdebf76412a90c4ffecc250581cc75215410a02"} Apr 16 18:09:22.447319 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:22.446990 2564 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kvp2n" Apr 16 18:09:22.470707 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:22.470651 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kvp2n" podStartSLOduration=9.101175122 podStartE2EDuration="11.470633458s" podCreationTimestamp="2026-04-16 18:09:11 +0000 UTC" firstStartedPulling="2026-04-16 18:09:19.439652593 +0000 UTC m=+385.810326484" lastFinishedPulling="2026-04-16 18:09:21.809110931 +0000 UTC m=+388.179784820" observedRunningTime="2026-04-16 18:09:22.46864658 +0000 UTC m=+388.839320488" watchObservedRunningTime="2026-04-16 18:09:22.470633458 +0000 UTC m=+388.841307366" Apr 16 18:09:24.453405 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:24.453371 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-f8mdk" event={"ID":"47a53de2-a20e-485a-b17c-c9b1ea16e963","Type":"ContainerStarted","Data":"e36f53a7a2450006041b86ce80aa50fa5f9ee3ed761e3a6fc97068cec6d87893"} Apr 16 18:09:24.453843 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:24.453491 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-f8mdk" Apr 16 18:09:24.475247 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:24.475201 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-f8mdk" podStartSLOduration=8.468392197 podStartE2EDuration="13.475186155s" podCreationTimestamp="2026-04-16 18:09:11 +0000 UTC" firstStartedPulling="2026-04-16 18:09:19.166258376 +0000 UTC m=+385.536932261" lastFinishedPulling="2026-04-16 18:09:24.173052329 +0000 UTC m=+390.543726219" observedRunningTime="2026-04-16 18:09:24.473995841 +0000 UTC m=+390.844669748" watchObservedRunningTime="2026-04-16 18:09:24.475186155 +0000 UTC m=+390.845860061" Apr 16 18:09:32.409774 
ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:32.409745 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-96jx9" Apr 16 18:09:33.454103 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:33.454075 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kvp2n" Apr 16 18:09:34.417115 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:34.417080 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-78ncd" Apr 16 18:09:45.458557 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:09:45.458522 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-f8mdk" Apr 16 18:10:17.121834 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:17.121756 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-85965f96-stmbt"] Apr 16 18:10:17.124760 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:17.124740 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85965f96-stmbt" Apr 16 18:10:17.158221 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:17.158196 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85965f96-stmbt"] Apr 16 18:10:17.226755 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:17.226719 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8099687f-0289-46d4-86fc-cad287925308-trusted-ca-bundle\") pod \"console-85965f96-stmbt\" (UID: \"8099687f-0289-46d4-86fc-cad287925308\") " pod="openshift-console/console-85965f96-stmbt" Apr 16 18:10:17.226755 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:17.226754 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8099687f-0289-46d4-86fc-cad287925308-oauth-serving-cert\") pod \"console-85965f96-stmbt\" (UID: \"8099687f-0289-46d4-86fc-cad287925308\") " pod="openshift-console/console-85965f96-stmbt" Apr 16 18:10:17.226966 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:17.226856 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8099687f-0289-46d4-86fc-cad287925308-console-config\") pod \"console-85965f96-stmbt\" (UID: \"8099687f-0289-46d4-86fc-cad287925308\") " pod="openshift-console/console-85965f96-stmbt" Apr 16 18:10:17.226966 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:17.226909 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8099687f-0289-46d4-86fc-cad287925308-console-serving-cert\") pod \"console-85965f96-stmbt\" (UID: \"8099687f-0289-46d4-86fc-cad287925308\") " pod="openshift-console/console-85965f96-stmbt" Apr 16 18:10:17.226966 
ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:17.226941 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8099687f-0289-46d4-86fc-cad287925308-console-oauth-config\") pod \"console-85965f96-stmbt\" (UID: \"8099687f-0289-46d4-86fc-cad287925308\") " pod="openshift-console/console-85965f96-stmbt" Apr 16 18:10:17.227116 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:17.227036 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8099687f-0289-46d4-86fc-cad287925308-service-ca\") pod \"console-85965f96-stmbt\" (UID: \"8099687f-0289-46d4-86fc-cad287925308\") " pod="openshift-console/console-85965f96-stmbt" Apr 16 18:10:17.227116 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:17.227087 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28s8w\" (UniqueName: \"kubernetes.io/projected/8099687f-0289-46d4-86fc-cad287925308-kube-api-access-28s8w\") pod \"console-85965f96-stmbt\" (UID: \"8099687f-0289-46d4-86fc-cad287925308\") " pod="openshift-console/console-85965f96-stmbt" Apr 16 18:10:17.328137 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:17.328107 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8099687f-0289-46d4-86fc-cad287925308-trusted-ca-bundle\") pod \"console-85965f96-stmbt\" (UID: \"8099687f-0289-46d4-86fc-cad287925308\") " pod="openshift-console/console-85965f96-stmbt" Apr 16 18:10:17.328314 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:17.328142 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8099687f-0289-46d4-86fc-cad287925308-oauth-serving-cert\") pod \"console-85965f96-stmbt\" (UID: 
\"8099687f-0289-46d4-86fc-cad287925308\") " pod="openshift-console/console-85965f96-stmbt" Apr 16 18:10:17.328314 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:17.328165 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8099687f-0289-46d4-86fc-cad287925308-console-config\") pod \"console-85965f96-stmbt\" (UID: \"8099687f-0289-46d4-86fc-cad287925308\") " pod="openshift-console/console-85965f96-stmbt" Apr 16 18:10:17.328314 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:17.328191 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8099687f-0289-46d4-86fc-cad287925308-console-serving-cert\") pod \"console-85965f96-stmbt\" (UID: \"8099687f-0289-46d4-86fc-cad287925308\") " pod="openshift-console/console-85965f96-stmbt" Apr 16 18:10:17.328314 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:17.328223 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8099687f-0289-46d4-86fc-cad287925308-console-oauth-config\") pod \"console-85965f96-stmbt\" (UID: \"8099687f-0289-46d4-86fc-cad287925308\") " pod="openshift-console/console-85965f96-stmbt" Apr 16 18:10:17.328314 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:17.328256 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8099687f-0289-46d4-86fc-cad287925308-service-ca\") pod \"console-85965f96-stmbt\" (UID: \"8099687f-0289-46d4-86fc-cad287925308\") " pod="openshift-console/console-85965f96-stmbt" Apr 16 18:10:17.328314 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:17.328284 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-28s8w\" (UniqueName: 
\"kubernetes.io/projected/8099687f-0289-46d4-86fc-cad287925308-kube-api-access-28s8w\") pod \"console-85965f96-stmbt\" (UID: \"8099687f-0289-46d4-86fc-cad287925308\") " pod="openshift-console/console-85965f96-stmbt" Apr 16 18:10:17.329014 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:17.328988 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8099687f-0289-46d4-86fc-cad287925308-console-config\") pod \"console-85965f96-stmbt\" (UID: \"8099687f-0289-46d4-86fc-cad287925308\") " pod="openshift-console/console-85965f96-stmbt" Apr 16 18:10:17.329132 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:17.328979 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8099687f-0289-46d4-86fc-cad287925308-oauth-serving-cert\") pod \"console-85965f96-stmbt\" (UID: \"8099687f-0289-46d4-86fc-cad287925308\") " pod="openshift-console/console-85965f96-stmbt" Apr 16 18:10:17.329132 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:17.329044 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8099687f-0289-46d4-86fc-cad287925308-service-ca\") pod \"console-85965f96-stmbt\" (UID: \"8099687f-0289-46d4-86fc-cad287925308\") " pod="openshift-console/console-85965f96-stmbt" Apr 16 18:10:17.329212 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:17.329136 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8099687f-0289-46d4-86fc-cad287925308-trusted-ca-bundle\") pod \"console-85965f96-stmbt\" (UID: \"8099687f-0289-46d4-86fc-cad287925308\") " pod="openshift-console/console-85965f96-stmbt" Apr 16 18:10:17.330662 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:17.330642 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/8099687f-0289-46d4-86fc-cad287925308-console-oauth-config\") pod \"console-85965f96-stmbt\" (UID: \"8099687f-0289-46d4-86fc-cad287925308\") " pod="openshift-console/console-85965f96-stmbt" Apr 16 18:10:17.330763 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:17.330747 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8099687f-0289-46d4-86fc-cad287925308-console-serving-cert\") pod \"console-85965f96-stmbt\" (UID: \"8099687f-0289-46d4-86fc-cad287925308\") " pod="openshift-console/console-85965f96-stmbt" Apr 16 18:10:17.339832 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:17.339803 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-28s8w\" (UniqueName: \"kubernetes.io/projected/8099687f-0289-46d4-86fc-cad287925308-kube-api-access-28s8w\") pod \"console-85965f96-stmbt\" (UID: \"8099687f-0289-46d4-86fc-cad287925308\") " pod="openshift-console/console-85965f96-stmbt" Apr 16 18:10:17.433870 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:17.433789 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85965f96-stmbt" Apr 16 18:10:17.554289 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:17.553840 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85965f96-stmbt"] Apr 16 18:10:17.614276 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:17.614250 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85965f96-stmbt" event={"ID":"8099687f-0289-46d4-86fc-cad287925308","Type":"ContainerStarted","Data":"3db8c6e3a000c36a7833bf07be63b3d01653315713b51f16b875329ce61e589e"} Apr 16 18:10:18.618885 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:18.618847 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85965f96-stmbt" event={"ID":"8099687f-0289-46d4-86fc-cad287925308","Type":"ContainerStarted","Data":"99806b8d624a68844ca7e204433599bba144562537a6a957260918e74864df01"} Apr 16 18:10:18.655514 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:18.655461 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-85965f96-stmbt" podStartSLOduration=1.655446692 podStartE2EDuration="1.655446692s" podCreationTimestamp="2026-04-16 18:10:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:10:18.653158484 +0000 UTC m=+445.023832392" watchObservedRunningTime="2026-04-16 18:10:18.655446692 +0000 UTC m=+445.026120599" Apr 16 18:10:20.563183 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:20.563148 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-qj26m"] Apr 16 18:10:20.566475 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:20.566458 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-qj26m" Apr 16 18:10:20.569131 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:20.569106 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 16 18:10:20.569243 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:20.569145 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 18:10:20.569243 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:20.569166 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-6p7jb\"" Apr 16 18:10:20.570318 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:20.570305 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 18:10:20.576323 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:20.576297 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-qj26m"] Apr 16 18:10:20.604447 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:20.604417 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-8twjm"] Apr 16 18:10:20.607964 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:20.607941 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-8twjm" Apr 16 18:10:20.610669 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:20.610646 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-h7q2z\"" Apr 16 18:10:20.610770 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:20.610650 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 18:10:20.617523 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:20.617501 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-8twjm"] Apr 16 18:10:20.656557 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:20.656530 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1e60c750-bf58-443d-b273-ebd1bf61057b-data\") pod \"seaweedfs-86cc847c5c-8twjm\" (UID: \"1e60c750-bf58-443d-b273-ebd1bf61057b\") " pod="kserve/seaweedfs-86cc847c5c-8twjm" Apr 16 18:10:20.656734 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:20.656582 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrhvf\" (UniqueName: \"kubernetes.io/projected/1e60c750-bf58-443d-b273-ebd1bf61057b-kube-api-access-rrhvf\") pod \"seaweedfs-86cc847c5c-8twjm\" (UID: \"1e60c750-bf58-443d-b273-ebd1bf61057b\") " pod="kserve/seaweedfs-86cc847c5c-8twjm" Apr 16 18:10:20.656734 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:20.656714 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bedd424-08a9-430a-88d4-30cd9ed2d0cf-cert\") pod \"kserve-controller-manager-659c8cbdc-qj26m\" (UID: \"3bedd424-08a9-430a-88d4-30cd9ed2d0cf\") " pod="kserve/kserve-controller-manager-659c8cbdc-qj26m" Apr 16 18:10:20.656828 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:20.656763 2564 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87zpk\" (UniqueName: \"kubernetes.io/projected/3bedd424-08a9-430a-88d4-30cd9ed2d0cf-kube-api-access-87zpk\") pod \"kserve-controller-manager-659c8cbdc-qj26m\" (UID: \"3bedd424-08a9-430a-88d4-30cd9ed2d0cf\") " pod="kserve/kserve-controller-manager-659c8cbdc-qj26m" Apr 16 18:10:20.757738 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:20.757706 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bedd424-08a9-430a-88d4-30cd9ed2d0cf-cert\") pod \"kserve-controller-manager-659c8cbdc-qj26m\" (UID: \"3bedd424-08a9-430a-88d4-30cd9ed2d0cf\") " pod="kserve/kserve-controller-manager-659c8cbdc-qj26m" Apr 16 18:10:20.757908 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:20.757756 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87zpk\" (UniqueName: \"kubernetes.io/projected/3bedd424-08a9-430a-88d4-30cd9ed2d0cf-kube-api-access-87zpk\") pod \"kserve-controller-manager-659c8cbdc-qj26m\" (UID: \"3bedd424-08a9-430a-88d4-30cd9ed2d0cf\") " pod="kserve/kserve-controller-manager-659c8cbdc-qj26m" Apr 16 18:10:20.757908 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:20.757779 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1e60c750-bf58-443d-b273-ebd1bf61057b-data\") pod \"seaweedfs-86cc847c5c-8twjm\" (UID: \"1e60c750-bf58-443d-b273-ebd1bf61057b\") " pod="kserve/seaweedfs-86cc847c5c-8twjm" Apr 16 18:10:20.757908 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:20.757803 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rrhvf\" (UniqueName: \"kubernetes.io/projected/1e60c750-bf58-443d-b273-ebd1bf61057b-kube-api-access-rrhvf\") pod \"seaweedfs-86cc847c5c-8twjm\" (UID: \"1e60c750-bf58-443d-b273-ebd1bf61057b\") " 
pod="kserve/seaweedfs-86cc847c5c-8twjm" Apr 16 18:10:20.758185 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:20.758163 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/1e60c750-bf58-443d-b273-ebd1bf61057b-data\") pod \"seaweedfs-86cc847c5c-8twjm\" (UID: \"1e60c750-bf58-443d-b273-ebd1bf61057b\") " pod="kserve/seaweedfs-86cc847c5c-8twjm" Apr 16 18:10:20.759990 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:20.759964 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bedd424-08a9-430a-88d4-30cd9ed2d0cf-cert\") pod \"kserve-controller-manager-659c8cbdc-qj26m\" (UID: \"3bedd424-08a9-430a-88d4-30cd9ed2d0cf\") " pod="kserve/kserve-controller-manager-659c8cbdc-qj26m" Apr 16 18:10:20.766215 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:20.766193 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrhvf\" (UniqueName: \"kubernetes.io/projected/1e60c750-bf58-443d-b273-ebd1bf61057b-kube-api-access-rrhvf\") pod \"seaweedfs-86cc847c5c-8twjm\" (UID: \"1e60c750-bf58-443d-b273-ebd1bf61057b\") " pod="kserve/seaweedfs-86cc847c5c-8twjm" Apr 16 18:10:20.766465 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:20.766447 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87zpk\" (UniqueName: \"kubernetes.io/projected/3bedd424-08a9-430a-88d4-30cd9ed2d0cf-kube-api-access-87zpk\") pod \"kserve-controller-manager-659c8cbdc-qj26m\" (UID: \"3bedd424-08a9-430a-88d4-30cd9ed2d0cf\") " pod="kserve/kserve-controller-manager-659c8cbdc-qj26m" Apr 16 18:10:20.877794 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:20.877712 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-qj26m" Apr 16 18:10:20.919461 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:20.919429 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-8twjm" Apr 16 18:10:21.009569 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:21.009544 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-qj26m"] Apr 16 18:10:21.012038 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:10:21.012008 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bedd424_08a9_430a_88d4_30cd9ed2d0cf.slice/crio-1f78aa8237ffa5c6682b25917932809291e3dc85f15c4ae808316f918dad4995 WatchSource:0}: Error finding container 1f78aa8237ffa5c6682b25917932809291e3dc85f15c4ae808316f918dad4995: Status 404 returned error can't find the container with id 1f78aa8237ffa5c6682b25917932809291e3dc85f15c4ae808316f918dad4995 Apr 16 18:10:21.052476 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:21.052452 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-8twjm"] Apr 16 18:10:21.054549 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:10:21.054518 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e60c750_bf58_443d_b273_ebd1bf61057b.slice/crio-ff0e927f35a438e4bae294fa12b7972e2b0486812cf82e7db198dc6db97f757e WatchSource:0}: Error finding container ff0e927f35a438e4bae294fa12b7972e2b0486812cf82e7db198dc6db97f757e: Status 404 returned error can't find the container with id ff0e927f35a438e4bae294fa12b7972e2b0486812cf82e7db198dc6db97f757e Apr 16 18:10:21.631294 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:21.631257 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-8twjm" 
event={"ID":"1e60c750-bf58-443d-b273-ebd1bf61057b","Type":"ContainerStarted","Data":"ff0e927f35a438e4bae294fa12b7972e2b0486812cf82e7db198dc6db97f757e"} Apr 16 18:10:21.633280 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:21.633247 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-qj26m" event={"ID":"3bedd424-08a9-430a-88d4-30cd9ed2d0cf","Type":"ContainerStarted","Data":"1f78aa8237ffa5c6682b25917932809291e3dc85f15c4ae808316f918dad4995"} Apr 16 18:10:24.644909 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:24.644877 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-qj26m" event={"ID":"3bedd424-08a9-430a-88d4-30cd9ed2d0cf","Type":"ContainerStarted","Data":"2e6755dbb367902728cc00a025b42a0d96e13e27201e471454aa13ee47c7e31d"} Apr 16 18:10:24.645214 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:24.645033 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-659c8cbdc-qj26m" Apr 16 18:10:24.663436 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:24.663390 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-659c8cbdc-qj26m" podStartSLOduration=1.170865638 podStartE2EDuration="4.663373981s" podCreationTimestamp="2026-04-16 18:10:20 +0000 UTC" firstStartedPulling="2026-04-16 18:10:21.01347223 +0000 UTC m=+447.384146115" lastFinishedPulling="2026-04-16 18:10:24.505980569 +0000 UTC m=+450.876654458" observedRunningTime="2026-04-16 18:10:24.661738347 +0000 UTC m=+451.032412255" watchObservedRunningTime="2026-04-16 18:10:24.663373981 +0000 UTC m=+451.034047889" Apr 16 18:10:25.649804 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:25.649765 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-8twjm" 
event={"ID":"1e60c750-bf58-443d-b273-ebd1bf61057b","Type":"ContainerStarted","Data":"293b2651ed2fe655ff8d1e5daed62ae4782e89bef6e1eec2413d4c146b374ec8"} Apr 16 18:10:25.650186 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:25.649886 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-8twjm" Apr 16 18:10:25.667757 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:25.667716 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-8twjm" podStartSLOduration=2.084599436 podStartE2EDuration="5.667700995s" podCreationTimestamp="2026-04-16 18:10:20 +0000 UTC" firstStartedPulling="2026-04-16 18:10:21.05580587 +0000 UTC m=+447.426479756" lastFinishedPulling="2026-04-16 18:10:24.638907426 +0000 UTC m=+451.009581315" observedRunningTime="2026-04-16 18:10:25.666421221 +0000 UTC m=+452.037095129" watchObservedRunningTime="2026-04-16 18:10:25.667700995 +0000 UTC m=+452.038374904" Apr 16 18:10:27.434762 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:27.434724 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-85965f96-stmbt" Apr 16 18:10:27.435209 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:27.434772 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-85965f96-stmbt" Apr 16 18:10:27.439426 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:27.439408 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-85965f96-stmbt" Apr 16 18:10:27.660294 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:27.660267 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-85965f96-stmbt" Apr 16 18:10:27.707715 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:27.707687 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c6bf65995-f66zj"] Apr 16 
18:10:31.656066 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:31.656037 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-8twjm" Apr 16 18:10:52.731680 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:52.731633 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-c6bf65995-f66zj" podUID="8163b196-f081-4019-94b4-c6f33d33e641" containerName="console" containerID="cri-o://d0ee0b2026b7ea051114bf5497f680d4a825502d9d3a9a1768e4f35632499bc9" gracePeriod=15 Apr 16 18:10:52.972013 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:52.971990 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c6bf65995-f66zj_8163b196-f081-4019-94b4-c6f33d33e641/console/0.log" Apr 16 18:10:52.972118 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:52.972053 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c6bf65995-f66zj" Apr 16 18:10:53.033800 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.033723 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8163b196-f081-4019-94b4-c6f33d33e641-service-ca\") pod \"8163b196-f081-4019-94b4-c6f33d33e641\" (UID: \"8163b196-f081-4019-94b4-c6f33d33e641\") " Apr 16 18:10:53.033800 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.033761 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8163b196-f081-4019-94b4-c6f33d33e641-oauth-serving-cert\") pod \"8163b196-f081-4019-94b4-c6f33d33e641\" (UID: \"8163b196-f081-4019-94b4-c6f33d33e641\") " Apr 16 18:10:53.033800 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.033781 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v7l6\" (UniqueName: 
\"kubernetes.io/projected/8163b196-f081-4019-94b4-c6f33d33e641-kube-api-access-7v7l6\") pod \"8163b196-f081-4019-94b4-c6f33d33e641\" (UID: \"8163b196-f081-4019-94b4-c6f33d33e641\") "
Apr 16 18:10:53.033800 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.033799 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8163b196-f081-4019-94b4-c6f33d33e641-console-config\") pod \"8163b196-f081-4019-94b4-c6f33d33e641\" (UID: \"8163b196-f081-4019-94b4-c6f33d33e641\") "
Apr 16 18:10:53.034084 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.033929 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8163b196-f081-4019-94b4-c6f33d33e641-trusted-ca-bundle\") pod \"8163b196-f081-4019-94b4-c6f33d33e641\" (UID: \"8163b196-f081-4019-94b4-c6f33d33e641\") "
Apr 16 18:10:53.034084 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.033967 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8163b196-f081-4019-94b4-c6f33d33e641-console-serving-cert\") pod \"8163b196-f081-4019-94b4-c6f33d33e641\" (UID: \"8163b196-f081-4019-94b4-c6f33d33e641\") "
Apr 16 18:10:53.034084 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.034067 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8163b196-f081-4019-94b4-c6f33d33e641-console-oauth-config\") pod \"8163b196-f081-4019-94b4-c6f33d33e641\" (UID: \"8163b196-f081-4019-94b4-c6f33d33e641\") "
Apr 16 18:10:53.034229 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.034199 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8163b196-f081-4019-94b4-c6f33d33e641-service-ca" (OuterVolumeSpecName: "service-ca") pod "8163b196-f081-4019-94b4-c6f33d33e641" (UID: "8163b196-f081-4019-94b4-c6f33d33e641"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:10:53.034291 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.034226 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8163b196-f081-4019-94b4-c6f33d33e641-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8163b196-f081-4019-94b4-c6f33d33e641" (UID: "8163b196-f081-4019-94b4-c6f33d33e641"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:10:53.034343 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.034292 2564 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8163b196-f081-4019-94b4-c6f33d33e641-service-ca\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\""
Apr 16 18:10:53.034343 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.034312 2564 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8163b196-f081-4019-94b4-c6f33d33e641-oauth-serving-cert\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\""
Apr 16 18:10:53.034429 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.034388 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8163b196-f081-4019-94b4-c6f33d33e641-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8163b196-f081-4019-94b4-c6f33d33e641" (UID: "8163b196-f081-4019-94b4-c6f33d33e641"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:10:53.034514 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.034491 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8163b196-f081-4019-94b4-c6f33d33e641-console-config" (OuterVolumeSpecName: "console-config") pod "8163b196-f081-4019-94b4-c6f33d33e641" (UID: "8163b196-f081-4019-94b4-c6f33d33e641"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:10:53.036047 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.036014 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8163b196-f081-4019-94b4-c6f33d33e641-kube-api-access-7v7l6" (OuterVolumeSpecName: "kube-api-access-7v7l6") pod "8163b196-f081-4019-94b4-c6f33d33e641" (UID: "8163b196-f081-4019-94b4-c6f33d33e641"). InnerVolumeSpecName "kube-api-access-7v7l6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:10:53.036148 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.036094 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8163b196-f081-4019-94b4-c6f33d33e641-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8163b196-f081-4019-94b4-c6f33d33e641" (UID: "8163b196-f081-4019-94b4-c6f33d33e641"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:10:53.036148 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.036107 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8163b196-f081-4019-94b4-c6f33d33e641-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8163b196-f081-4019-94b4-c6f33d33e641" (UID: "8163b196-f081-4019-94b4-c6f33d33e641"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:10:53.135115 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.135072 2564 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8163b196-f081-4019-94b4-c6f33d33e641-console-oauth-config\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\""
Apr 16 18:10:53.135115 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.135112 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7v7l6\" (UniqueName: \"kubernetes.io/projected/8163b196-f081-4019-94b4-c6f33d33e641-kube-api-access-7v7l6\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\""
Apr 16 18:10:53.135115 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.135123 2564 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8163b196-f081-4019-94b4-c6f33d33e641-console-config\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\""
Apr 16 18:10:53.135115 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.135132 2564 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8163b196-f081-4019-94b4-c6f33d33e641-trusted-ca-bundle\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\""
Apr 16 18:10:53.135366 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.135140 2564 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8163b196-f081-4019-94b4-c6f33d33e641-console-serving-cert\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\""
Apr 16 18:10:53.735979 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.735949 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c6bf65995-f66zj_8163b196-f081-4019-94b4-c6f33d33e641/console/0.log"
Apr 16 18:10:53.736376 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.736042 2564 generic.go:358] "Generic (PLEG): container finished" podID="8163b196-f081-4019-94b4-c6f33d33e641" containerID="d0ee0b2026b7ea051114bf5497f680d4a825502d9d3a9a1768e4f35632499bc9" exitCode=2
Apr 16 18:10:53.736376 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.736110 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c6bf65995-f66zj"
Apr 16 18:10:53.736376 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.736127 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c6bf65995-f66zj" event={"ID":"8163b196-f081-4019-94b4-c6f33d33e641","Type":"ContainerDied","Data":"d0ee0b2026b7ea051114bf5497f680d4a825502d9d3a9a1768e4f35632499bc9"}
Apr 16 18:10:53.736376 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.736164 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c6bf65995-f66zj" event={"ID":"8163b196-f081-4019-94b4-c6f33d33e641","Type":"ContainerDied","Data":"cb25139db46f33270659982a572292b3eeeac8400c083190574f7306ca162dfa"}
Apr 16 18:10:53.736376 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.736179 2564 scope.go:117] "RemoveContainer" containerID="d0ee0b2026b7ea051114bf5497f680d4a825502d9d3a9a1768e4f35632499bc9"
Apr 16 18:10:53.744341 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.744325 2564 scope.go:117] "RemoveContainer" containerID="d0ee0b2026b7ea051114bf5497f680d4a825502d9d3a9a1768e4f35632499bc9"
Apr 16 18:10:53.744578 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:10:53.744561 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0ee0b2026b7ea051114bf5497f680d4a825502d9d3a9a1768e4f35632499bc9\": container with ID starting with d0ee0b2026b7ea051114bf5497f680d4a825502d9d3a9a1768e4f35632499bc9 not found: ID does not exist" containerID="d0ee0b2026b7ea051114bf5497f680d4a825502d9d3a9a1768e4f35632499bc9"
Apr 16 18:10:53.744678 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.744586 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0ee0b2026b7ea051114bf5497f680d4a825502d9d3a9a1768e4f35632499bc9"} err="failed to get container status \"d0ee0b2026b7ea051114bf5497f680d4a825502d9d3a9a1768e4f35632499bc9\": rpc error: code = NotFound desc = could not find container \"d0ee0b2026b7ea051114bf5497f680d4a825502d9d3a9a1768e4f35632499bc9\": container with ID starting with d0ee0b2026b7ea051114bf5497f680d4a825502d9d3a9a1768e4f35632499bc9 not found: ID does not exist"
Apr 16 18:10:53.757195 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.757175 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c6bf65995-f66zj"]
Apr 16 18:10:53.761216 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:53.761197 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-c6bf65995-f66zj"]
Apr 16 18:10:54.192179 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:54.192139 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8163b196-f081-4019-94b4-c6f33d33e641" path="/var/lib/kubelet/pods/8163b196-f081-4019-94b4-c6f33d33e641/volumes"
Apr 16 18:10:55.445624 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.445567 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-qj26m"]
Apr 16 18:10:55.446062 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.445813 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-659c8cbdc-qj26m" podUID="3bedd424-08a9-430a-88d4-30cd9ed2d0cf" containerName="manager" containerID="cri-o://2e6755dbb367902728cc00a025b42a0d96e13e27201e471454aa13ee47c7e31d" gracePeriod=10
Apr 16 18:10:55.450991 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.450966 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-659c8cbdc-qj26m"
Apr 16 18:10:55.489502 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.489473 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-66ts7"]
Apr 16 18:10:55.489868 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.489851 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8163b196-f081-4019-94b4-c6f33d33e641" containerName="console"
Apr 16 18:10:55.489937 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.489871 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="8163b196-f081-4019-94b4-c6f33d33e641" containerName="console"
Apr 16 18:10:55.489937 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.489930 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="8163b196-f081-4019-94b4-c6f33d33e641" containerName="console"
Apr 16 18:10:55.505451 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.505417 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-66ts7"
Apr 16 18:10:55.509979 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.509954 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-66ts7"]
Apr 16 18:10:55.553515 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.553485 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jnr6\" (UniqueName: \"kubernetes.io/projected/e42f2e09-d3c3-4ddd-8502-43a100894102-kube-api-access-5jnr6\") pod \"kserve-controller-manager-659c8cbdc-66ts7\" (UID: \"e42f2e09-d3c3-4ddd-8502-43a100894102\") " pod="kserve/kserve-controller-manager-659c8cbdc-66ts7"
Apr 16 18:10:55.553678 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.553531 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e42f2e09-d3c3-4ddd-8502-43a100894102-cert\") pod \"kserve-controller-manager-659c8cbdc-66ts7\" (UID: \"e42f2e09-d3c3-4ddd-8502-43a100894102\") " pod="kserve/kserve-controller-manager-659c8cbdc-66ts7"
Apr 16 18:10:55.654088 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.654061 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jnr6\" (UniqueName: \"kubernetes.io/projected/e42f2e09-d3c3-4ddd-8502-43a100894102-kube-api-access-5jnr6\") pod \"kserve-controller-manager-659c8cbdc-66ts7\" (UID: \"e42f2e09-d3c3-4ddd-8502-43a100894102\") " pod="kserve/kserve-controller-manager-659c8cbdc-66ts7"
Apr 16 18:10:55.654225 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.654106 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e42f2e09-d3c3-4ddd-8502-43a100894102-cert\") pod \"kserve-controller-manager-659c8cbdc-66ts7\" (UID: \"e42f2e09-d3c3-4ddd-8502-43a100894102\") " pod="kserve/kserve-controller-manager-659c8cbdc-66ts7"
Apr 16 18:10:55.656540 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.656515 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e42f2e09-d3c3-4ddd-8502-43a100894102-cert\") pod \"kserve-controller-manager-659c8cbdc-66ts7\" (UID: \"e42f2e09-d3c3-4ddd-8502-43a100894102\") " pod="kserve/kserve-controller-manager-659c8cbdc-66ts7"
Apr 16 18:10:55.664256 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.664230 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jnr6\" (UniqueName: \"kubernetes.io/projected/e42f2e09-d3c3-4ddd-8502-43a100894102-kube-api-access-5jnr6\") pod \"kserve-controller-manager-659c8cbdc-66ts7\" (UID: \"e42f2e09-d3c3-4ddd-8502-43a100894102\") " pod="kserve/kserve-controller-manager-659c8cbdc-66ts7"
Apr 16 18:10:55.700088 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.700040 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-qj26m"
Apr 16 18:10:55.744124 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.744081 2564 generic.go:358] "Generic (PLEG): container finished" podID="3bedd424-08a9-430a-88d4-30cd9ed2d0cf" containerID="2e6755dbb367902728cc00a025b42a0d96e13e27201e471454aa13ee47c7e31d" exitCode=0
Apr 16 18:10:55.744275 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.744144 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-qj26m"
Apr 16 18:10:55.744275 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.744144 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-qj26m" event={"ID":"3bedd424-08a9-430a-88d4-30cd9ed2d0cf","Type":"ContainerDied","Data":"2e6755dbb367902728cc00a025b42a0d96e13e27201e471454aa13ee47c7e31d"}
Apr 16 18:10:55.744275 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.744261 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-qj26m" event={"ID":"3bedd424-08a9-430a-88d4-30cd9ed2d0cf","Type":"ContainerDied","Data":"1f78aa8237ffa5c6682b25917932809291e3dc85f15c4ae808316f918dad4995"}
Apr 16 18:10:55.744424 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.744284 2564 scope.go:117] "RemoveContainer" containerID="2e6755dbb367902728cc00a025b42a0d96e13e27201e471454aa13ee47c7e31d"
Apr 16 18:10:55.751948 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.751931 2564 scope.go:117] "RemoveContainer" containerID="2e6755dbb367902728cc00a025b42a0d96e13e27201e471454aa13ee47c7e31d"
Apr 16 18:10:55.752207 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:10:55.752189 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e6755dbb367902728cc00a025b42a0d96e13e27201e471454aa13ee47c7e31d\": container with ID starting with 2e6755dbb367902728cc00a025b42a0d96e13e27201e471454aa13ee47c7e31d not found: ID does not exist" containerID="2e6755dbb367902728cc00a025b42a0d96e13e27201e471454aa13ee47c7e31d"
Apr 16 18:10:55.752268 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.752215 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e6755dbb367902728cc00a025b42a0d96e13e27201e471454aa13ee47c7e31d"} err="failed to get container status \"2e6755dbb367902728cc00a025b42a0d96e13e27201e471454aa13ee47c7e31d\": rpc error: code = NotFound desc = could not find container \"2e6755dbb367902728cc00a025b42a0d96e13e27201e471454aa13ee47c7e31d\": container with ID starting with 2e6755dbb367902728cc00a025b42a0d96e13e27201e471454aa13ee47c7e31d not found: ID does not exist"
Apr 16 18:10:55.754468 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.754445 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bedd424-08a9-430a-88d4-30cd9ed2d0cf-cert\") pod \"3bedd424-08a9-430a-88d4-30cd9ed2d0cf\" (UID: \"3bedd424-08a9-430a-88d4-30cd9ed2d0cf\") "
Apr 16 18:10:55.754565 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.754536 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87zpk\" (UniqueName: \"kubernetes.io/projected/3bedd424-08a9-430a-88d4-30cd9ed2d0cf-kube-api-access-87zpk\") pod \"3bedd424-08a9-430a-88d4-30cd9ed2d0cf\" (UID: \"3bedd424-08a9-430a-88d4-30cd9ed2d0cf\") "
Apr 16 18:10:55.756441 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.756418 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bedd424-08a9-430a-88d4-30cd9ed2d0cf-cert" (OuterVolumeSpecName: "cert") pod "3bedd424-08a9-430a-88d4-30cd9ed2d0cf" (UID: "3bedd424-08a9-430a-88d4-30cd9ed2d0cf"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:10:55.756765 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.756745 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bedd424-08a9-430a-88d4-30cd9ed2d0cf-kube-api-access-87zpk" (OuterVolumeSpecName: "kube-api-access-87zpk") pod "3bedd424-08a9-430a-88d4-30cd9ed2d0cf" (UID: "3bedd424-08a9-430a-88d4-30cd9ed2d0cf"). InnerVolumeSpecName "kube-api-access-87zpk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:10:55.855840 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.855809 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-87zpk\" (UniqueName: \"kubernetes.io/projected/3bedd424-08a9-430a-88d4-30cd9ed2d0cf-kube-api-access-87zpk\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\""
Apr 16 18:10:55.855840 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.855837 2564 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bedd424-08a9-430a-88d4-30cd9ed2d0cf-cert\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\""
Apr 16 18:10:55.872202 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.872180 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-66ts7"
Apr 16 18:10:55.996860 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:55.996790 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-66ts7"]
Apr 16 18:10:55.999997 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:10:55.999964 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode42f2e09_d3c3_4ddd_8502_43a100894102.slice/crio-838b17ab79f32d91451927386ecd36983fc68b5ea01eb0e19d7d0ac1df9c09b1 WatchSource:0}: Error finding container 838b17ab79f32d91451927386ecd36983fc68b5ea01eb0e19d7d0ac1df9c09b1: Status 404 returned error can't find the container with id 838b17ab79f32d91451927386ecd36983fc68b5ea01eb0e19d7d0ac1df9c09b1
Apr 16 18:10:56.069982 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:56.069957 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-qj26m"]
Apr 16 18:10:56.075071 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:56.075049 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-qj26m"]
Apr 16 18:10:56.192828 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:56.192802 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bedd424-08a9-430a-88d4-30cd9ed2d0cf" path="/var/lib/kubelet/pods/3bedd424-08a9-430a-88d4-30cd9ed2d0cf/volumes"
Apr 16 18:10:56.749024 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:56.748992 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-66ts7" event={"ID":"e42f2e09-d3c3-4ddd-8502-43a100894102","Type":"ContainerStarted","Data":"8d31de5e52d621ef0c49d8cb45a8ae918daed1291f3f22aea156e76dc2679cdd"}
Apr 16 18:10:56.749024 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:56.749028 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-66ts7" event={"ID":"e42f2e09-d3c3-4ddd-8502-43a100894102","Type":"ContainerStarted","Data":"838b17ab79f32d91451927386ecd36983fc68b5ea01eb0e19d7d0ac1df9c09b1"}
Apr 16 18:10:56.749422 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:56.749109 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-659c8cbdc-66ts7"
Apr 16 18:10:56.772095 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:10:56.772055 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-659c8cbdc-66ts7" podStartSLOduration=1.351741592 podStartE2EDuration="1.772042372s" podCreationTimestamp="2026-04-16 18:10:55 +0000 UTC" firstStartedPulling="2026-04-16 18:10:56.001251748 +0000 UTC m=+482.371925633" lastFinishedPulling="2026-04-16 18:10:56.421552528 +0000 UTC m=+482.792226413" observedRunningTime="2026-04-16 18:10:56.770792817 +0000 UTC m=+483.141466724" watchObservedRunningTime="2026-04-16 18:10:56.772042372 +0000 UTC m=+483.142716280"
Apr 16 18:11:27.757755 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:27.757726 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-659c8cbdc-66ts7"
Apr 16 18:11:29.068285 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.068245 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-5lmbt"]
Apr 16 18:11:29.068794 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.068753 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3bedd424-08a9-430a-88d4-30cd9ed2d0cf" containerName="manager"
Apr 16 18:11:29.068794 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.068772 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bedd424-08a9-430a-88d4-30cd9ed2d0cf" containerName="manager"
Apr 16 18:11:29.068906 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.068844 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="3bedd424-08a9-430a-88d4-30cd9ed2d0cf" containerName="manager"
Apr 16 18:11:29.070494 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.070474 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-5lmbt"
Apr 16 18:11:29.094910 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.094881 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 16 18:11:29.095037 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.094927 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-x62bf\""
Apr 16 18:11:29.117480 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.117450 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-5lmbt"]
Apr 16 18:11:29.168078 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.168050 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-rsfcr"]
Apr 16 18:11:29.169997 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.169982 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-rsfcr"
Apr 16 18:11:29.175208 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.175187 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-k455r\""
Apr 16 18:11:29.175436 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.175423 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 16 18:11:29.189375 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.189354 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-rsfcr"]
Apr 16 18:11:29.208137 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.208116 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cc75215f-425f-46f9-bd94-f15eba5f1cac-tls-certs\") pod \"model-serving-api-86f7b4b499-5lmbt\" (UID: \"cc75215f-425f-46f9-bd94-f15eba5f1cac\") " pod="kserve/model-serving-api-86f7b4b499-5lmbt"
Apr 16 18:11:29.208269 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.208146 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8fbs\" (UniqueName: \"kubernetes.io/projected/cc75215f-425f-46f9-bd94-f15eba5f1cac-kube-api-access-x8fbs\") pod \"model-serving-api-86f7b4b499-5lmbt\" (UID: \"cc75215f-425f-46f9-bd94-f15eba5f1cac\") " pod="kserve/model-serving-api-86f7b4b499-5lmbt"
Apr 16 18:11:29.308972 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.308940 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cc75215f-425f-46f9-bd94-f15eba5f1cac-tls-certs\") pod \"model-serving-api-86f7b4b499-5lmbt\" (UID: \"cc75215f-425f-46f9-bd94-f15eba5f1cac\") " pod="kserve/model-serving-api-86f7b4b499-5lmbt"
Apr 16 18:11:29.308972 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.308977 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8fbs\" (UniqueName: \"kubernetes.io/projected/cc75215f-425f-46f9-bd94-f15eba5f1cac-kube-api-access-x8fbs\") pod \"model-serving-api-86f7b4b499-5lmbt\" (UID: \"cc75215f-425f-46f9-bd94-f15eba5f1cac\") " pod="kserve/model-serving-api-86f7b4b499-5lmbt"
Apr 16 18:11:29.309203 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.309004 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74040014-c443-4f97-a807-12a7f7988df3-cert\") pod \"odh-model-controller-696fc77849-rsfcr\" (UID: \"74040014-c443-4f97-a807-12a7f7988df3\") " pod="kserve/odh-model-controller-696fc77849-rsfcr"
Apr 16 18:11:29.309203 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.309022 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8scz\" (UniqueName: \"kubernetes.io/projected/74040014-c443-4f97-a807-12a7f7988df3-kube-api-access-b8scz\") pod \"odh-model-controller-696fc77849-rsfcr\" (UID: \"74040014-c443-4f97-a807-12a7f7988df3\") " pod="kserve/odh-model-controller-696fc77849-rsfcr"
Apr 16 18:11:29.311482 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.311449 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cc75215f-425f-46f9-bd94-f15eba5f1cac-tls-certs\") pod \"model-serving-api-86f7b4b499-5lmbt\" (UID: \"cc75215f-425f-46f9-bd94-f15eba5f1cac\") " pod="kserve/model-serving-api-86f7b4b499-5lmbt"
Apr 16 18:11:29.320016 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.319960 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8fbs\" (UniqueName: \"kubernetes.io/projected/cc75215f-425f-46f9-bd94-f15eba5f1cac-kube-api-access-x8fbs\") pod \"model-serving-api-86f7b4b499-5lmbt\" (UID: \"cc75215f-425f-46f9-bd94-f15eba5f1cac\") " pod="kserve/model-serving-api-86f7b4b499-5lmbt"
Apr 16 18:11:29.380357 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.380330 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-5lmbt"
Apr 16 18:11:29.409762 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.409731 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74040014-c443-4f97-a807-12a7f7988df3-cert\") pod \"odh-model-controller-696fc77849-rsfcr\" (UID: \"74040014-c443-4f97-a807-12a7f7988df3\") " pod="kserve/odh-model-controller-696fc77849-rsfcr"
Apr 16 18:11:29.409888 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.409776 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8scz\" (UniqueName: \"kubernetes.io/projected/74040014-c443-4f97-a807-12a7f7988df3-kube-api-access-b8scz\") pod \"odh-model-controller-696fc77849-rsfcr\" (UID: \"74040014-c443-4f97-a807-12a7f7988df3\") " pod="kserve/odh-model-controller-696fc77849-rsfcr"
Apr 16 18:11:29.411996 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.411974 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74040014-c443-4f97-a807-12a7f7988df3-cert\") pod \"odh-model-controller-696fc77849-rsfcr\" (UID: \"74040014-c443-4f97-a807-12a7f7988df3\") " pod="kserve/odh-model-controller-696fc77849-rsfcr"
Apr 16 18:11:29.422884 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.422856 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8scz\" (UniqueName: \"kubernetes.io/projected/74040014-c443-4f97-a807-12a7f7988df3-kube-api-access-b8scz\") pod \"odh-model-controller-696fc77849-rsfcr\" (UID: \"74040014-c443-4f97-a807-12a7f7988df3\") " pod="kserve/odh-model-controller-696fc77849-rsfcr"
Apr 16 18:11:29.479421 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.479382 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-rsfcr"
Apr 16 18:11:29.518167 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.518138 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-5lmbt"]
Apr 16 18:11:29.523263 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:11:29.523228 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc75215f_425f_46f9_bd94_f15eba5f1cac.slice/crio-bc562a6310c5d22de41d30e23be8982ed28732febf418f88f8fbf2740127a3c0 WatchSource:0}: Error finding container bc562a6310c5d22de41d30e23be8982ed28732febf418f88f8fbf2740127a3c0: Status 404 returned error can't find the container with id bc562a6310c5d22de41d30e23be8982ed28732febf418f88f8fbf2740127a3c0
Apr 16 18:11:29.642285 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.642214 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-rsfcr"]
Apr 16 18:11:29.645013 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:11:29.644986 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74040014_c443_4f97_a807_12a7f7988df3.slice/crio-783c1320035426ab2c91507b8d0fc6a95c9ec50a1d969dd672acab36f6c50c37 WatchSource:0}: Error finding container 783c1320035426ab2c91507b8d0fc6a95c9ec50a1d969dd672acab36f6c50c37: Status 404 returned error can't find the container with id 783c1320035426ab2c91507b8d0fc6a95c9ec50a1d969dd672acab36f6c50c37
Apr 16 18:11:29.852998 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.852959 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-rsfcr" event={"ID":"74040014-c443-4f97-a807-12a7f7988df3","Type":"ContainerStarted","Data":"783c1320035426ab2c91507b8d0fc6a95c9ec50a1d969dd672acab36f6c50c37"}
Apr 16 18:11:29.854175 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:29.854144 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-5lmbt" event={"ID":"cc75215f-425f-46f9-bd94-f15eba5f1cac","Type":"ContainerStarted","Data":"bc562a6310c5d22de41d30e23be8982ed28732febf418f88f8fbf2740127a3c0"}
Apr 16 18:11:32.865594 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:32.865550 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-rsfcr" event={"ID":"74040014-c443-4f97-a807-12a7f7988df3","Type":"ContainerStarted","Data":"5c85d15a3c49054fde86eafe0657e8ff547a63c89718f366ee20809d657e1592"}
Apr 16 18:11:32.866080 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:32.865868 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-rsfcr"
Apr 16 18:11:32.867019 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:32.866992 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-5lmbt" event={"ID":"cc75215f-425f-46f9-bd94-f15eba5f1cac","Type":"ContainerStarted","Data":"9f6f31c600ee017f4d8db604746c573684c623c7968e2d4a97706c8e4f4be8a8"}
Apr 16 18:11:32.867127 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:32.867103 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-5lmbt"
Apr 16 18:11:32.896448 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:32.896402 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-rsfcr" podStartSLOduration=1.5081318609999999 podStartE2EDuration="3.896388265s" podCreationTimestamp="2026-04-16 18:11:29 +0000 UTC" firstStartedPulling="2026-04-16 18:11:29.646330411 +0000 UTC m=+516.017004300" lastFinishedPulling="2026-04-16 18:11:32.034586819 +0000 UTC m=+518.405260704" observedRunningTime="2026-04-16 18:11:32.896221006 +0000 UTC m=+519.266894925" watchObservedRunningTime="2026-04-16 18:11:32.896388265 +0000 UTC m=+519.267062166"
Apr 16 18:11:32.950370 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:32.950322 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-5lmbt" podStartSLOduration=2.700115258 podStartE2EDuration="3.95031024s" podCreationTimestamp="2026-04-16 18:11:29 +0000 UTC" firstStartedPulling="2026-04-16 18:11:29.525540941 +0000 UTC m=+515.896214837" lastFinishedPulling="2026-04-16 18:11:30.775735922 +0000 UTC m=+517.146409819" observedRunningTime="2026-04-16 18:11:32.939132889 +0000 UTC m=+519.309806795" watchObservedRunningTime="2026-04-16 18:11:32.95031024 +0000 UTC m=+519.320984162"
Apr 16 18:11:43.872712 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:43.872635 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-rsfcr"
Apr 16 18:11:43.874782 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:43.874765 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-5lmbt"
Apr 16 18:11:56.200286 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:56.200254 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-n4z2q"]
Apr 16 18:11:56.204898 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:56.204876 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-n4z2q"
Apr 16 18:11:56.207943 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:56.207925 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\""
Apr 16 18:11:56.219385 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:56.219365 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-n4z2q"]
Apr 16 18:11:56.233252 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:56.233228 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-n4z2q\" (UID: \"3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-n4z2q"
Apr 16 18:11:56.233348 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:56.233266 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqwsq\" (UniqueName: \"kubernetes.io/projected/3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6-kube-api-access-rqwsq\") pod \"seaweedfs-tls-custom-ddd4dbfd-n4z2q\" (UID: \"3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-n4z2q"
Apr 16 18:11:56.333809 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:56.333772 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-n4z2q\" (UID: \"3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-n4z2q"
Apr 16 18:11:56.333809 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:56.333818 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqwsq\" (UniqueName:
\"kubernetes.io/projected/3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6-kube-api-access-rqwsq\") pod \"seaweedfs-tls-custom-ddd4dbfd-n4z2q\" (UID: \"3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-n4z2q" Apr 16 18:11:56.334125 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:56.334104 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-n4z2q\" (UID: \"3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-n4z2q" Apr 16 18:11:56.347578 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:56.347545 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqwsq\" (UniqueName: \"kubernetes.io/projected/3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6-kube-api-access-rqwsq\") pod \"seaweedfs-tls-custom-ddd4dbfd-n4z2q\" (UID: \"3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-n4z2q" Apr 16 18:11:56.513812 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:56.513723 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-n4z2q" Apr 16 18:11:56.643358 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:56.643324 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-n4z2q"] Apr 16 18:11:56.648511 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:11:56.648484 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e45a826_bd9f_4ab8_83b8_0c56c5cdc5c6.slice/crio-47e40b2f855285d108ab7ccdaca468eaab02e9333ee81a6f5d435fde9aab20c9 WatchSource:0}: Error finding container 47e40b2f855285d108ab7ccdaca468eaab02e9333ee81a6f5d435fde9aab20c9: Status 404 returned error can't find the container with id 47e40b2f855285d108ab7ccdaca468eaab02e9333ee81a6f5d435fde9aab20c9 Apr 16 18:11:56.949808 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:56.949774 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-n4z2q" event={"ID":"3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6","Type":"ContainerStarted","Data":"47e40b2f855285d108ab7ccdaca468eaab02e9333ee81a6f5d435fde9aab20c9"} Apr 16 18:11:57.954212 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:57.954169 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-n4z2q" event={"ID":"3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6","Type":"ContainerStarted","Data":"eee32aab20714c2edc1cd5ab3d73adb57d9396248db1e018db90c7c78614f036"} Apr 16 18:11:59.949931 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:59.949880 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-n4z2q" podStartSLOduration=3.605672781 podStartE2EDuration="3.949863272s" podCreationTimestamp="2026-04-16 18:11:56 +0000 UTC" firstStartedPulling="2026-04-16 18:11:56.649813163 +0000 UTC m=+543.020487048" lastFinishedPulling="2026-04-16 18:11:56.99400365 +0000 UTC m=+543.364677539" 
observedRunningTime="2026-04-16 18:11:57.994495266 +0000 UTC m=+544.365169187" watchObservedRunningTime="2026-04-16 18:11:59.949863272 +0000 UTC m=+546.320537179" Apr 16 18:11:59.950311 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:59.949998 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-n4z2q"] Apr 16 18:11:59.960343 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:11:59.960313 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-n4z2q" podUID="3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6" containerName="seaweedfs-tls-custom" containerID="cri-o://eee32aab20714c2edc1cd5ab3d73adb57d9396248db1e018db90c7c78614f036" gracePeriod=30 Apr 16 18:12:27.706477 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:27.706454 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-n4z2q" Apr 16 18:12:27.788028 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:27.787952 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqwsq\" (UniqueName: \"kubernetes.io/projected/3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6-kube-api-access-rqwsq\") pod \"3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6\" (UID: \"3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6\") " Apr 16 18:12:27.788028 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:27.787990 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6-data\") pod \"3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6\" (UID: \"3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6\") " Apr 16 18:12:27.789247 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:27.789183 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6-data" (OuterVolumeSpecName: "data") pod "3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6" (UID: 
"3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:12:27.790048 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:27.790015 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6-kube-api-access-rqwsq" (OuterVolumeSpecName: "kube-api-access-rqwsq") pod "3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6" (UID: "3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6"). InnerVolumeSpecName "kube-api-access-rqwsq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:12:27.888600 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:27.888552 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rqwsq\" (UniqueName: \"kubernetes.io/projected/3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6-kube-api-access-rqwsq\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\"" Apr 16 18:12:27.888600 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:27.888595 2564 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6-data\") on node \"ip-10-0-131-193.ec2.internal\" DevicePath \"\"" Apr 16 18:12:28.048680 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:28.048573 2564 generic.go:358] "Generic (PLEG): container finished" podID="3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6" containerID="eee32aab20714c2edc1cd5ab3d73adb57d9396248db1e018db90c7c78614f036" exitCode=0 Apr 16 18:12:28.048680 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:28.048651 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-n4z2q" Apr 16 18:12:28.048680 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:28.048665 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-n4z2q" event={"ID":"3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6","Type":"ContainerDied","Data":"eee32aab20714c2edc1cd5ab3d73adb57d9396248db1e018db90c7c78614f036"} Apr 16 18:12:28.048969 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:28.048704 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-n4z2q" event={"ID":"3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6","Type":"ContainerDied","Data":"47e40b2f855285d108ab7ccdaca468eaab02e9333ee81a6f5d435fde9aab20c9"} Apr 16 18:12:28.048969 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:28.048724 2564 scope.go:117] "RemoveContainer" containerID="eee32aab20714c2edc1cd5ab3d73adb57d9396248db1e018db90c7c78614f036" Apr 16 18:12:28.058409 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:28.058395 2564 scope.go:117] "RemoveContainer" containerID="eee32aab20714c2edc1cd5ab3d73adb57d9396248db1e018db90c7c78614f036" Apr 16 18:12:28.058678 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:12:28.058658 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eee32aab20714c2edc1cd5ab3d73adb57d9396248db1e018db90c7c78614f036\": container with ID starting with eee32aab20714c2edc1cd5ab3d73adb57d9396248db1e018db90c7c78614f036 not found: ID does not exist" containerID="eee32aab20714c2edc1cd5ab3d73adb57d9396248db1e018db90c7c78614f036" Apr 16 18:12:28.058733 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:28.058686 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eee32aab20714c2edc1cd5ab3d73adb57d9396248db1e018db90c7c78614f036"} err="failed to get container status \"eee32aab20714c2edc1cd5ab3d73adb57d9396248db1e018db90c7c78614f036\": rpc error: code = 
NotFound desc = could not find container \"eee32aab20714c2edc1cd5ab3d73adb57d9396248db1e018db90c7c78614f036\": container with ID starting with eee32aab20714c2edc1cd5ab3d73adb57d9396248db1e018db90c7c78614f036 not found: ID does not exist" Apr 16 18:12:28.080573 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:28.080542 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-n4z2q"] Apr 16 18:12:28.086062 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:28.086038 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-n4z2q"] Apr 16 18:12:28.163623 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:28.163579 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-x4hgf"] Apr 16 18:12:28.163897 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:28.163885 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6" containerName="seaweedfs-tls-custom" Apr 16 18:12:28.163938 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:28.163899 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6" containerName="seaweedfs-tls-custom" Apr 16 18:12:28.163975 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:28.163960 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6" containerName="seaweedfs-tls-custom" Apr 16 18:12:28.171601 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:28.171577 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-x4hgf" Apr 16 18:12:28.176529 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:28.176504 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 16 18:12:28.176810 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:28.176788 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\"" Apr 16 18:12:28.192873 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:28.192838 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6" path="/var/lib/kubelet/pods/3e45a826-bd9f-4ab8-83b8-0c56c5cdc5c6/volumes" Apr 16 18:12:28.193107 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:28.193094 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-x4hgf"] Apr 16 18:12:28.291003 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:28.290967 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/b9c61acb-ec6a-4724-8c37-65e282508129-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-x4hgf\" (UID: \"b9c61acb-ec6a-4724-8c37-65e282508129\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-x4hgf" Apr 16 18:12:28.291164 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:28.291076 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwf47\" (UniqueName: \"kubernetes.io/projected/b9c61acb-ec6a-4724-8c37-65e282508129-kube-api-access-cwf47\") pod \"seaweedfs-tls-custom-5c88b85bb7-x4hgf\" (UID: \"b9c61acb-ec6a-4724-8c37-65e282508129\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-x4hgf" Apr 16 18:12:28.291164 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:28.291120 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b9c61acb-ec6a-4724-8c37-65e282508129-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-x4hgf\" (UID: \"b9c61acb-ec6a-4724-8c37-65e282508129\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-x4hgf" Apr 16 18:12:28.392435 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:28.392356 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwf47\" (UniqueName: \"kubernetes.io/projected/b9c61acb-ec6a-4724-8c37-65e282508129-kube-api-access-cwf47\") pod \"seaweedfs-tls-custom-5c88b85bb7-x4hgf\" (UID: \"b9c61acb-ec6a-4724-8c37-65e282508129\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-x4hgf" Apr 16 18:12:28.392435 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:28.392394 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b9c61acb-ec6a-4724-8c37-65e282508129-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-x4hgf\" (UID: \"b9c61acb-ec6a-4724-8c37-65e282508129\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-x4hgf" Apr 16 18:12:28.392435 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:28.392414 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/b9c61acb-ec6a-4724-8c37-65e282508129-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-x4hgf\" (UID: \"b9c61acb-ec6a-4724-8c37-65e282508129\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-x4hgf" Apr 16 18:12:28.392790 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:28.392768 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b9c61acb-ec6a-4724-8c37-65e282508129-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-x4hgf\" (UID: \"b9c61acb-ec6a-4724-8c37-65e282508129\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-x4hgf" Apr 16 
18:12:28.394847 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:28.394819 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/b9c61acb-ec6a-4724-8c37-65e282508129-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-x4hgf\" (UID: \"b9c61acb-ec6a-4724-8c37-65e282508129\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-x4hgf" Apr 16 18:12:28.405878 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:28.405854 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwf47\" (UniqueName: \"kubernetes.io/projected/b9c61acb-ec6a-4724-8c37-65e282508129-kube-api-access-cwf47\") pod \"seaweedfs-tls-custom-5c88b85bb7-x4hgf\" (UID: \"b9c61acb-ec6a-4724-8c37-65e282508129\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-x4hgf" Apr 16 18:12:28.480007 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:28.479974 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-x4hgf" Apr 16 18:12:28.614107 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:28.613855 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-x4hgf"] Apr 16 18:12:28.616440 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:12:28.616413 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9c61acb_ec6a_4724_8c37_65e282508129.slice/crio-29a63872fcf4a7659eb38af8f396361531ae22108797b71bd99069a755aadec4 WatchSource:0}: Error finding container 29a63872fcf4a7659eb38af8f396361531ae22108797b71bd99069a755aadec4: Status 404 returned error can't find the container with id 29a63872fcf4a7659eb38af8f396361531ae22108797b71bd99069a755aadec4 Apr 16 18:12:29.054104 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:29.054063 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-x4hgf" 
event={"ID":"b9c61acb-ec6a-4724-8c37-65e282508129","Type":"ContainerStarted","Data":"a08608d8e617c955bd63dd0c498fd7ef8e0496597e67769d4d61db59f01bb686"} Apr 16 18:12:29.054104 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:29.054098 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-x4hgf" event={"ID":"b9c61acb-ec6a-4724-8c37-65e282508129","Type":"ContainerStarted","Data":"29a63872fcf4a7659eb38af8f396361531ae22108797b71bd99069a755aadec4"} Apr 16 18:12:29.080403 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:29.080342 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-x4hgf" podStartSLOduration=0.840543025 podStartE2EDuration="1.080328525s" podCreationTimestamp="2026-04-16 18:12:28 +0000 UTC" firstStartedPulling="2026-04-16 18:12:28.617682613 +0000 UTC m=+574.988356499" lastFinishedPulling="2026-04-16 18:12:28.857468111 +0000 UTC m=+575.228141999" observedRunningTime="2026-04-16 18:12:29.078625542 +0000 UTC m=+575.449299439" watchObservedRunningTime="2026-04-16 18:12:29.080328525 +0000 UTC m=+575.451002431" Apr 16 18:12:38.156290 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:38.156253 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-psn84"] Apr 16 18:12:38.158380 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:38.158364 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-psn84" Apr 16 18:12:38.161189 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:38.161161 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 16 18:12:38.161447 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:38.161428 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\"" Apr 16 18:12:38.171352 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:38.171323 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-psn84"] Apr 16 18:12:38.278105 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:38.278070 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7wjq\" (UniqueName: \"kubernetes.io/projected/f93aae0e-4773-4e6f-8278-16e7e83f5b98-kube-api-access-m7wjq\") pod \"seaweedfs-tls-serving-7fd5766db9-psn84\" (UID: \"f93aae0e-4773-4e6f-8278-16e7e83f5b98\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-psn84" Apr 16 18:12:38.278294 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:38.278112 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/f93aae0e-4773-4e6f-8278-16e7e83f5b98-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-psn84\" (UID: \"f93aae0e-4773-4e6f-8278-16e7e83f5b98\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-psn84" Apr 16 18:12:38.278294 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:38.278222 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f93aae0e-4773-4e6f-8278-16e7e83f5b98-data\") pod \"seaweedfs-tls-serving-7fd5766db9-psn84\" (UID: \"f93aae0e-4773-4e6f-8278-16e7e83f5b98\") " 
pod="kserve/seaweedfs-tls-serving-7fd5766db9-psn84" Apr 16 18:12:38.379022 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:38.378985 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/f93aae0e-4773-4e6f-8278-16e7e83f5b98-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-psn84\" (UID: \"f93aae0e-4773-4e6f-8278-16e7e83f5b98\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-psn84" Apr 16 18:12:38.379184 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:38.379047 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f93aae0e-4773-4e6f-8278-16e7e83f5b98-data\") pod \"seaweedfs-tls-serving-7fd5766db9-psn84\" (UID: \"f93aae0e-4773-4e6f-8278-16e7e83f5b98\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-psn84" Apr 16 18:12:38.379184 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:38.379091 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7wjq\" (UniqueName: \"kubernetes.io/projected/f93aae0e-4773-4e6f-8278-16e7e83f5b98-kube-api-access-m7wjq\") pod \"seaweedfs-tls-serving-7fd5766db9-psn84\" (UID: \"f93aae0e-4773-4e6f-8278-16e7e83f5b98\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-psn84" Apr 16 18:12:38.379505 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:38.379484 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f93aae0e-4773-4e6f-8278-16e7e83f5b98-data\") pod \"seaweedfs-tls-serving-7fd5766db9-psn84\" (UID: \"f93aae0e-4773-4e6f-8278-16e7e83f5b98\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-psn84" Apr 16 18:12:38.381384 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:38.381362 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-serving\" (UniqueName: 
\"kubernetes.io/projected/f93aae0e-4773-4e6f-8278-16e7e83f5b98-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-psn84\" (UID: \"f93aae0e-4773-4e6f-8278-16e7e83f5b98\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-psn84" Apr 16 18:12:38.398958 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:38.398931 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7wjq\" (UniqueName: \"kubernetes.io/projected/f93aae0e-4773-4e6f-8278-16e7e83f5b98-kube-api-access-m7wjq\") pod \"seaweedfs-tls-serving-7fd5766db9-psn84\" (UID: \"f93aae0e-4773-4e6f-8278-16e7e83f5b98\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-psn84" Apr 16 18:12:38.469525 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:38.469498 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-psn84" Apr 16 18:12:38.603520 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:38.603497 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-psn84"] Apr 16 18:12:38.605526 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:12:38.605497 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf93aae0e_4773_4e6f_8278_16e7e83f5b98.slice/crio-0eb72a822d77ae4cae975abaf731fb456873623b92359048845bae7772434c38 WatchSource:0}: Error finding container 0eb72a822d77ae4cae975abaf731fb456873623b92359048845bae7772434c38: Status 404 returned error can't find the container with id 0eb72a822d77ae4cae975abaf731fb456873623b92359048845bae7772434c38 Apr 16 18:12:39.088013 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:39.087975 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-psn84" event={"ID":"f93aae0e-4773-4e6f-8278-16e7e83f5b98","Type":"ContainerStarted","Data":"1cfa3f57457a2d7d526a249110283bd70407f9b8ab21c6a7e67cabfda47ee3ef"} Apr 16 18:12:39.088013 
ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:39.088013 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-psn84" event={"ID":"f93aae0e-4773-4e6f-8278-16e7e83f5b98","Type":"ContainerStarted","Data":"0eb72a822d77ae4cae975abaf731fb456873623b92359048845bae7772434c38"} Apr 16 18:12:39.117804 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:39.117754 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-psn84" podStartSLOduration=0.838316512 podStartE2EDuration="1.117739291s" podCreationTimestamp="2026-04-16 18:12:38 +0000 UTC" firstStartedPulling="2026-04-16 18:12:38.606650379 +0000 UTC m=+584.977324264" lastFinishedPulling="2026-04-16 18:12:38.886073157 +0000 UTC m=+585.256747043" observedRunningTime="2026-04-16 18:12:39.116567111 +0000 UTC m=+585.487241017" watchObservedRunningTime="2026-04-16 18:12:39.117739291 +0000 UTC m=+585.488413197" Apr 16 18:12:54.088937 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:54.088905 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log" Apr 16 18:12:54.089800 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:12:54.089782 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log" Apr 16 18:15:56.959400 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:15:56.959360 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-5blzb"] Apr 16 18:15:56.962657 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:15:56.962633 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-5blzb" Apr 16 18:15:56.965885 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:15:56.965864 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-89kqm\"" Apr 16 18:15:56.971190 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:15:56.971163 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-5blzb"] Apr 16 18:15:56.973154 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:15:56.973135 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-5blzb" Apr 16 18:15:57.106561 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:15:57.106538 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-5blzb"] Apr 16 18:15:57.109227 ip-10-0-131-193 kubenswrapper[2564]: W0416 18:15:57.109196 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bb995a8_f33d_4d59_855b_f0009c36392f.slice/crio-2ee81e6a181368f815fd71b7cc5a9e8bc515def52001549fd373114242b7aea3 WatchSource:0}: Error finding container 2ee81e6a181368f815fd71b7cc5a9e8bc515def52001549fd373114242b7aea3: Status 404 returned error can't find the container with id 2ee81e6a181368f815fd71b7cc5a9e8bc515def52001549fd373114242b7aea3 Apr 16 18:15:57.110922 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:15:57.110905 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:15:57.706685 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:15:57.706645 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-5blzb" 
event={"ID":"3bb995a8-f33d-4d59-855b-f0009c36392f","Type":"ContainerStarted","Data":"2ee81e6a181368f815fd71b7cc5a9e8bc515def52001549fd373114242b7aea3"} Apr 16 18:15:58.710852 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:15:58.710808 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-5blzb" event={"ID":"3bb995a8-f33d-4d59-855b-f0009c36392f","Type":"ContainerStarted","Data":"dd9a6577e50192c4efb137f6fe56eb01e9e9dbe4c3a1912c255a6ee172bc8a37"} Apr 16 18:15:58.711314 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:15:58.711003 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-5blzb" Apr 16 18:15:58.712695 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:15:58.712674 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-5blzb" Apr 16 18:15:58.727693 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:15:58.727650 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-5blzb" podStartSLOduration=1.7579722690000001 podStartE2EDuration="2.727634697s" podCreationTimestamp="2026-04-16 18:15:56 +0000 UTC" firstStartedPulling="2026-04-16 18:15:57.111027121 +0000 UTC m=+783.481701006" lastFinishedPulling="2026-04-16 18:15:58.080689546 +0000 UTC m=+784.451363434" observedRunningTime="2026-04-16 18:15:58.72706187 +0000 UTC m=+785.097735777" watchObservedRunningTime="2026-04-16 18:15:58.727634697 +0000 UTC m=+785.098308604" Apr 16 18:17:32.034975 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:17:32.034941 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-7f66cccfb6-5blzb_3bb995a8-f33d-4d59-855b-f0009c36392f/kserve-container/0.log" Apr 16 18:17:32.347004 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:17:32.346917 2564 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-5blzb"] Apr 16 18:17:32.347190 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:17:32.347166 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-5blzb" podUID="3bb995a8-f33d-4d59-855b-f0009c36392f" containerName="kserve-container" containerID="cri-o://dd9a6577e50192c4efb137f6fe56eb01e9e9dbe4c3a1912c255a6ee172bc8a37" gracePeriod=30 Apr 16 18:17:32.597766 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:17:32.597709 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-5blzb" Apr 16 18:17:33.007236 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:17:33.007197 2564 generic.go:358] "Generic (PLEG): container finished" podID="3bb995a8-f33d-4d59-855b-f0009c36392f" containerID="dd9a6577e50192c4efb137f6fe56eb01e9e9dbe4c3a1912c255a6ee172bc8a37" exitCode=2 Apr 16 18:17:33.007416 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:17:33.007253 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-5blzb" Apr 16 18:17:33.007416 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:17:33.007285 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-5blzb" event={"ID":"3bb995a8-f33d-4d59-855b-f0009c36392f","Type":"ContainerDied","Data":"dd9a6577e50192c4efb137f6fe56eb01e9e9dbe4c3a1912c255a6ee172bc8a37"} Apr 16 18:17:33.007416 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:17:33.007333 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-5blzb" event={"ID":"3bb995a8-f33d-4d59-855b-f0009c36392f","Type":"ContainerDied","Data":"2ee81e6a181368f815fd71b7cc5a9e8bc515def52001549fd373114242b7aea3"} Apr 16 18:17:33.007416 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:17:33.007350 2564 scope.go:117] "RemoveContainer" containerID="dd9a6577e50192c4efb137f6fe56eb01e9e9dbe4c3a1912c255a6ee172bc8a37" Apr 16 18:17:33.016029 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:17:33.016011 2564 scope.go:117] "RemoveContainer" containerID="dd9a6577e50192c4efb137f6fe56eb01e9e9dbe4c3a1912c255a6ee172bc8a37" Apr 16 18:17:33.016276 ip-10-0-131-193 kubenswrapper[2564]: E0416 18:17:33.016257 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd9a6577e50192c4efb137f6fe56eb01e9e9dbe4c3a1912c255a6ee172bc8a37\": container with ID starting with dd9a6577e50192c4efb137f6fe56eb01e9e9dbe4c3a1912c255a6ee172bc8a37 not found: ID does not exist" containerID="dd9a6577e50192c4efb137f6fe56eb01e9e9dbe4c3a1912c255a6ee172bc8a37" Apr 16 18:17:33.016326 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:17:33.016285 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd9a6577e50192c4efb137f6fe56eb01e9e9dbe4c3a1912c255a6ee172bc8a37"} err="failed to get container status 
\"dd9a6577e50192c4efb137f6fe56eb01e9e9dbe4c3a1912c255a6ee172bc8a37\": rpc error: code = NotFound desc = could not find container \"dd9a6577e50192c4efb137f6fe56eb01e9e9dbe4c3a1912c255a6ee172bc8a37\": container with ID starting with dd9a6577e50192c4efb137f6fe56eb01e9e9dbe4c3a1912c255a6ee172bc8a37 not found: ID does not exist" Apr 16 18:17:33.031121 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:17:33.031096 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-5blzb"] Apr 16 18:17:33.034406 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:17:33.034385 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-5blzb"] Apr 16 18:17:34.194250 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:17:34.194215 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bb995a8-f33d-4d59-855b-f0009c36392f" path="/var/lib/kubelet/pods/3bb995a8-f33d-4d59-855b-f0009c36392f/volumes" Apr 16 18:17:54.109313 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:17:54.109235 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log" Apr 16 18:17:54.113855 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:17:54.113578 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log" Apr 16 18:22:54.134435 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:22:54.134408 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log" Apr 16 18:22:54.139872 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:22:54.139849 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log" Apr 16 18:27:54.153977 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:27:54.153941 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log" Apr 16 18:27:54.160993 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:27:54.160974 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log" Apr 16 18:32:54.175114 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:32:54.175083 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log" Apr 16 18:32:54.182351 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:32:54.182327 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log" Apr 16 18:37:54.196682 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:37:54.196655 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log" Apr 16 18:37:54.203287 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:37:54.203266 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log" Apr 16 18:42:54.217104 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:42:54.217076 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log" Apr 16 18:42:54.224343 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:42:54.224323 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log" Apr 16 18:47:54.237788 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:47:54.237667 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log" Apr 16 18:47:54.244965 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:47:54.244944 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log" Apr 16 18:52:54.258913 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:52:54.258789 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log" Apr 16 18:52:54.265577 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:52:54.265557 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log" Apr 16 18:57:54.278288 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:57:54.278187 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log" Apr 16 18:57:54.290711 ip-10-0-131-193 kubenswrapper[2564]: I0416 18:57:54.290692 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log" Apr 16 19:02:54.298983 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:02:54.298877 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log" Apr 16 19:02:54.313848 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:02:54.313822 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log" Apr 16 19:07:26.194113 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:26.194087 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-s8vvh_c81b1e66-a5cc-48eb-b28a-83101de54fcf/global-pull-secret-syncer/0.log" Apr 16 19:07:26.310527 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:26.310496 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-xgk8q_6d32ccfb-d1d4-4972-9eae-bd16e530ce56/konnectivity-agent/0.log" Apr 16 19:07:26.372214 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:26.372186 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-193.ec2.internal_5c150e8aee5bd9091901cfeb87f90560/haproxy/0.log" Apr 16 19:07:30.418500 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:30.418476 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-z5wsv_76cd3b9c-5d86-460e-a147-c0872c568583/node-exporter/0.log" Apr 16 19:07:30.448483 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:30.448456 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-z5wsv_76cd3b9c-5d86-460e-a147-c0872c568583/kube-rbac-proxy/0.log" Apr 16 19:07:30.474463 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:30.474439 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-z5wsv_76cd3b9c-5d86-460e-a147-c0872c568583/init-textfile/0.log" Apr 16 19:07:30.854001 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:30.853971 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-lbk7p_d929efe8-7163-4d2b-96b5-d313e3e8d987/prometheus-operator/0.log" Apr 16 19:07:30.877453 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:30.877425 2564 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-lbk7p_d929efe8-7163-4d2b-96b5-d313e3e8d987/kube-rbac-proxy/0.log" Apr 16 19:07:30.904934 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:30.904914 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-h49fn_41432f2c-9579-47bd-9382-9700b101f44d/prometheus-operator-admission-webhook/0.log" Apr 16 19:07:32.355860 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:32.355830 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-pffzm_4aea7302-70a3-484c-bdf8-4a745df33639/networking-console-plugin/0.log" Apr 16 19:07:33.153599 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:33.153574 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85965f96-stmbt_8099687f-0289-46d4-86fc-cad287925308/console/0.log" Apr 16 19:07:33.193624 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:33.193579 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-9vgql_e4fac77e-76ed-47ba-a915-a3fad29536cf/download-server/0.log" Apr 16 19:07:33.334790 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:33.334753 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-s228v/perf-node-gather-daemonset-6sjkj"] Apr 16 19:07:33.335106 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:33.335092 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3bb995a8-f33d-4d59-855b-f0009c36392f" containerName="kserve-container" Apr 16 19:07:33.335152 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:33.335109 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bb995a8-f33d-4d59-855b-f0009c36392f" containerName="kserve-container" Apr 16 19:07:33.335186 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:33.335171 2564 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="3bb995a8-f33d-4d59-855b-f0009c36392f" containerName="kserve-container" Apr 16 19:07:33.338205 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:33.338185 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-s228v/perf-node-gather-daemonset-6sjkj" Apr 16 19:07:33.342280 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:33.342258 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-s228v\"/\"kube-root-ca.crt\"" Apr 16 19:07:33.342375 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:33.342297 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-s228v\"/\"default-dockercfg-8h7sl\"" Apr 16 19:07:33.342440 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:33.342415 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-s228v\"/\"openshift-service-ca.crt\"" Apr 16 19:07:33.348740 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:33.348715 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-s228v/perf-node-gather-daemonset-6sjkj"] Apr 16 19:07:33.426424 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:33.426339 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dab814a6-ee2c-445c-9aad-35f837180267-sys\") pod \"perf-node-gather-daemonset-6sjkj\" (UID: \"dab814a6-ee2c-445c-9aad-35f837180267\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-6sjkj" Apr 16 19:07:33.426424 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:33.426391 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/dab814a6-ee2c-445c-9aad-35f837180267-podres\") pod \"perf-node-gather-daemonset-6sjkj\" (UID: 
\"dab814a6-ee2c-445c-9aad-35f837180267\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-6sjkj" Apr 16 19:07:33.426847 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:33.426452 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6z69\" (UniqueName: \"kubernetes.io/projected/dab814a6-ee2c-445c-9aad-35f837180267-kube-api-access-p6z69\") pod \"perf-node-gather-daemonset-6sjkj\" (UID: \"dab814a6-ee2c-445c-9aad-35f837180267\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-6sjkj" Apr 16 19:07:33.426847 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:33.426513 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/dab814a6-ee2c-445c-9aad-35f837180267-proc\") pod \"perf-node-gather-daemonset-6sjkj\" (UID: \"dab814a6-ee2c-445c-9aad-35f837180267\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-6sjkj" Apr 16 19:07:33.426847 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:33.426561 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dab814a6-ee2c-445c-9aad-35f837180267-lib-modules\") pod \"perf-node-gather-daemonset-6sjkj\" (UID: \"dab814a6-ee2c-445c-9aad-35f837180267\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-6sjkj" Apr 16 19:07:33.526990 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:33.526957 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/dab814a6-ee2c-445c-9aad-35f837180267-proc\") pod \"perf-node-gather-daemonset-6sjkj\" (UID: \"dab814a6-ee2c-445c-9aad-35f837180267\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-6sjkj" Apr 16 19:07:33.527203 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:33.527002 2564 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dab814a6-ee2c-445c-9aad-35f837180267-lib-modules\") pod \"perf-node-gather-daemonset-6sjkj\" (UID: \"dab814a6-ee2c-445c-9aad-35f837180267\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-6sjkj" Apr 16 19:07:33.527203 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:33.527049 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dab814a6-ee2c-445c-9aad-35f837180267-sys\") pod \"perf-node-gather-daemonset-6sjkj\" (UID: \"dab814a6-ee2c-445c-9aad-35f837180267\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-6sjkj" Apr 16 19:07:33.527203 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:33.527099 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/dab814a6-ee2c-445c-9aad-35f837180267-podres\") pod \"perf-node-gather-daemonset-6sjkj\" (UID: \"dab814a6-ee2c-445c-9aad-35f837180267\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-6sjkj" Apr 16 19:07:33.527203 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:33.527100 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/dab814a6-ee2c-445c-9aad-35f837180267-proc\") pod \"perf-node-gather-daemonset-6sjkj\" (UID: \"dab814a6-ee2c-445c-9aad-35f837180267\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-6sjkj" Apr 16 19:07:33.527203 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:33.527125 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6z69\" (UniqueName: \"kubernetes.io/projected/dab814a6-ee2c-445c-9aad-35f837180267-kube-api-access-p6z69\") pod \"perf-node-gather-daemonset-6sjkj\" (UID: \"dab814a6-ee2c-445c-9aad-35f837180267\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-6sjkj" 
Apr 16 19:07:33.527203 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:33.527138 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dab814a6-ee2c-445c-9aad-35f837180267-sys\") pod \"perf-node-gather-daemonset-6sjkj\" (UID: \"dab814a6-ee2c-445c-9aad-35f837180267\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-6sjkj" Apr 16 19:07:33.527573 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:33.527213 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/dab814a6-ee2c-445c-9aad-35f837180267-podres\") pod \"perf-node-gather-daemonset-6sjkj\" (UID: \"dab814a6-ee2c-445c-9aad-35f837180267\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-6sjkj" Apr 16 19:07:33.527573 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:33.527222 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dab814a6-ee2c-445c-9aad-35f837180267-lib-modules\") pod \"perf-node-gather-daemonset-6sjkj\" (UID: \"dab814a6-ee2c-445c-9aad-35f837180267\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-6sjkj" Apr 16 19:07:33.544431 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:33.544398 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6z69\" (UniqueName: \"kubernetes.io/projected/dab814a6-ee2c-445c-9aad-35f837180267-kube-api-access-p6z69\") pod \"perf-node-gather-daemonset-6sjkj\" (UID: \"dab814a6-ee2c-445c-9aad-35f837180267\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-6sjkj" Apr 16 19:07:33.647473 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:33.647433 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-s228v/perf-node-gather-daemonset-6sjkj" Apr 16 19:07:33.770715 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:33.770690 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-s228v/perf-node-gather-daemonset-6sjkj"] Apr 16 19:07:33.773438 ip-10-0-131-193 kubenswrapper[2564]: W0416 19:07:33.773408 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddab814a6_ee2c_445c_9aad_35f837180267.slice/crio-6de0df421175780b77ba105c8fa045ce8e35b91e82cf35072647d1c8030704f3 WatchSource:0}: Error finding container 6de0df421175780b77ba105c8fa045ce8e35b91e82cf35072647d1c8030704f3: Status 404 returned error can't find the container with id 6de0df421175780b77ba105c8fa045ce8e35b91e82cf35072647d1c8030704f3 Apr 16 19:07:33.775022 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:33.775005 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:07:34.321312 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:34.321284 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2t59p_8abd6ace-6d1c-4823-ba9a-f0f8e17c654c/dns/0.log" Apr 16 19:07:34.347089 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:34.347062 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2t59p_8abd6ace-6d1c-4823-ba9a-f0f8e17c654c/kube-rbac-proxy/0.log" Apr 16 19:07:34.439850 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:34.439813 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s228v/perf-node-gather-daemonset-6sjkj" event={"ID":"dab814a6-ee2c-445c-9aad-35f837180267","Type":"ContainerStarted","Data":"d7a5d8f11f5acfb2ce3789d83d99f0b575fd252bc8c40168dbdd4ee4e80761c6"} Apr 16 19:07:34.439850 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:34.439853 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-s228v/perf-node-gather-daemonset-6sjkj" event={"ID":"dab814a6-ee2c-445c-9aad-35f837180267","Type":"ContainerStarted","Data":"6de0df421175780b77ba105c8fa045ce8e35b91e82cf35072647d1c8030704f3"} Apr 16 19:07:34.440256 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:34.439882 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-s228v/perf-node-gather-daemonset-6sjkj" Apr 16 19:07:34.458114 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:34.458064 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-s228v/perf-node-gather-daemonset-6sjkj" podStartSLOduration=1.458048634 podStartE2EDuration="1.458048634s" podCreationTimestamp="2026-04-16 19:07:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:07:34.456539886 +0000 UTC m=+3880.827213792" watchObservedRunningTime="2026-04-16 19:07:34.458048634 +0000 UTC m=+3880.828722538" Apr 16 19:07:34.498939 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:34.498910 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lk9pc_ee7d5ac7-7dde-40a2-97ed-71acf303ea5d/dns-node-resolver/0.log" Apr 16 19:07:34.993833 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:34.993785 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6984c869b6-m8kb6_a80ad2ed-b486-42ea-b4ae-c06c47c167f5/registry/0.log" Apr 16 19:07:35.073992 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:35.073966 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-q8z42_ab0cc5c6-ec31-4ed7-a4fa-9968260fc075/node-ca/0.log" Apr 16 19:07:36.143264 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:36.143236 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-2htvb_5a9c1147-51d3-4765-a356-5a32f7bc0b67/serve-healthcheck-canary/0.log" Apr 16 19:07:36.807500 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:36.807472 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wlbw7_83dbfa60-19ba-415c-9df9-d865b7b18be8/kube-rbac-proxy/0.log" Apr 16 19:07:36.832507 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:36.832479 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wlbw7_83dbfa60-19ba-415c-9df9-d865b7b18be8/exporter/0.log" Apr 16 19:07:36.857444 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:36.857419 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wlbw7_83dbfa60-19ba-415c-9df9-d865b7b18be8/extractor/0.log" Apr 16 19:07:38.921819 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:38.921788 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-659c8cbdc-66ts7_e42f2e09-d3c3-4ddd-8502-43a100894102/manager/0.log" Apr 16 19:07:39.059383 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:39.059350 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-5lmbt_cc75215f-425f-46f9-bd94-f15eba5f1cac/server/0.log" Apr 16 19:07:39.290188 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:39.290157 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-rsfcr_74040014-c443-4f97-a807-12a7f7988df3/manager/0.log" Apr 16 19:07:39.521938 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:39.521903 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-8twjm_1e60c750-bf58-443d-b273-ebd1bf61057b/seaweedfs/0.log" Apr 16 19:07:39.558895 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:39.558809 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-x4hgf_b9c61acb-ec6a-4724-8c37-65e282508129/seaweedfs-tls-custom/0.log" Apr 16 19:07:39.600151 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:39.600125 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-psn84_f93aae0e-4773-4e6f-8278-16e7e83f5b98/seaweedfs-tls-serving/0.log" Apr 16 19:07:40.453302 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:40.453278 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-s228v/perf-node-gather-daemonset-6sjkj" Apr 16 19:07:43.803075 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:43.803045 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-t72vw_7b731140-2291-4c78-96fd-b58bafe82dc6/migrator/0.log" Apr 16 19:07:43.827255 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:43.827231 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-t72vw_7b731140-2291-4c78-96fd-b58bafe82dc6/graceful-termination/0.log" Apr 16 19:07:45.144524 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:45.144445 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lf54m_4f6f788f-55c3-4a55-8674-eb821c1fc81e/kube-multus-additional-cni-plugins/0.log" Apr 16 19:07:45.169732 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:45.169709 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lf54m_4f6f788f-55c3-4a55-8674-eb821c1fc81e/egress-router-binary-copy/0.log" Apr 16 19:07:45.192974 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:45.192953 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lf54m_4f6f788f-55c3-4a55-8674-eb821c1fc81e/cni-plugins/0.log" Apr 16 19:07:45.217119 ip-10-0-131-193 
kubenswrapper[2564]: I0416 19:07:45.217097 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lf54m_4f6f788f-55c3-4a55-8674-eb821c1fc81e/bond-cni-plugin/0.log" Apr 16 19:07:45.241051 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:45.241024 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lf54m_4f6f788f-55c3-4a55-8674-eb821c1fc81e/routeoverride-cni/0.log" Apr 16 19:07:45.269244 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:45.269219 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lf54m_4f6f788f-55c3-4a55-8674-eb821c1fc81e/whereabouts-cni-bincopy/0.log" Apr 16 19:07:45.300084 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:45.300061 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lf54m_4f6f788f-55c3-4a55-8674-eb821c1fc81e/whereabouts-cni/0.log" Apr 16 19:07:45.774173 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:45.774145 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hwxs7_2bcf7a84-59f0-42fe-970c-4db3c9cf9a52/kube-multus/0.log" Apr 16 19:07:45.844081 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:45.844049 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2xjws_47436224-f5c2-417d-bfdc-61bfb9b8939a/network-metrics-daemon/0.log" Apr 16 19:07:45.870726 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:45.870698 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2xjws_47436224-f5c2-417d-bfdc-61bfb9b8939a/kube-rbac-proxy/0.log" Apr 16 19:07:47.516981 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:47.516953 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-controller/0.log" 
Apr 16 19:07:47.538045 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:47.538011 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/0.log" Apr 16 19:07:47.557350 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:47.557324 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovn-acl-logging/1.log" Apr 16 19:07:47.581667 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:47.581643 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/kube-rbac-proxy-node/0.log" Apr 16 19:07:47.605144 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:47.605119 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 19:07:47.626679 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:47.626660 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/northd/0.log" Apr 16 19:07:47.651651 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:47.651633 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/nbdb/0.log" Apr 16 19:07:47.674623 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:47.674589 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/sbdb/0.log" Apr 16 19:07:47.782704 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:47.782634 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxh6d_8aff675d-044c-4bf6-94c2-f3b57d9c5b7a/ovnkube-controller/0.log" Apr 16 
19:07:48.882605 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:48.882573 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-xddxf_fff8e593-e247-4d63-b01b-84beef483739/network-check-target-container/0.log" Apr 16 19:07:49.862330 ip-10-0-131-193 kubenswrapper[2564]: I0416 19:07:49.862306 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-q97cg_06425cb3-b57a-4b2a-80fc-58a19bb8366e/iptables-alerter/0.log"