Apr 16 13:58:52.560423 ip-10-0-130-98 systemd[1]: Starting Kubernetes Kubelet... Apr 16 13:58:52.949702 ip-10-0-130-98 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 16 13:58:52.949702 ip-10-0-130-98 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Apr 16 13:58:52.949702 ip-10-0-130-98 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 16 13:58:52.949702 ip-10-0-130-98 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Apr 16 13:58:52.949702 ip-10-0-130-98 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 16 13:58:52.954873 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.954781 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 16 13:58:52.960157 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.959954 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:58:52.960157 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960026 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:58:52.960157 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960033 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:58:52.960157 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960039 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:58:52.960157 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960045 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:58:52.960157 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960050 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:58:52.960157 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960057 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 13:58:52.960157 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960064 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 13:58:52.960157 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960070 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 13:58:52.960157 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960075 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:58:52.960157 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960079 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:58:52.960157 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960103 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:58:52.960157 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960109 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:58:52.960157 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960113 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:58:52.960157 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960117 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:58:52.960157 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960121 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:58:52.960157 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960126 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:58:52.960157 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960132 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:58:52.960157 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960136 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:58:52.960678 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960140 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:58:52.960678 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960144 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:58:52.960678 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960149 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:58:52.960678 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960153 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:58:52.960678 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960158 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:58:52.960678 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960167 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:58:52.960678 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960172 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:58:52.960678 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960176 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 13:58:52.960678 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960181 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:58:52.960678 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960186 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 
13:58:52.960678 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960201 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:58:52.960678 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960229 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 13:58:52.960678 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960235 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:58:52.960678 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960240 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:58:52.960678 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960243 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:58:52.960678 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960246 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:58:52.960678 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960249 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:58:52.960678 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960252 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 13:58:52.960678 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960254 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:58:52.960678 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960257 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:58:52.961209 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960260 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:58:52.961209 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960263 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:58:52.961209 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960267 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:58:52.961209 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960293 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:58:52.961209 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960385 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:58:52.961209 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960389 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 13:58:52.961209 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960393 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:58:52.961209 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960396 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:58:52.961209 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960400 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:58:52.961209 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960404 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:58:52.961209 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960408 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:58:52.961209 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960416 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:58:52.961209 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960421 2575 feature_gate.go:328] unrecognized feature gate: Example2 
Apr 16 13:58:52.961209 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960425 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:58:52.961209 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960430 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 13:58:52.961209 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960434 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 13:58:52.961209 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960438 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 13:58:52.961209 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960445 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 13:58:52.961209 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960451 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:58:52.961209 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960456 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:58:52.961696 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960461 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:58:52.961696 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960465 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:58:52.961696 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960470 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:58:52.961696 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960474 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:58:52.961696 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960478 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:58:52.961696 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960482 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:58:52.961696 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960486 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:58:52.961696 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960490 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:58:52.961696 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960494 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:58:52.961696 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960498 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:58:52.961696 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960503 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:58:52.961696 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960507 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:58:52.961696 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960511 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:58:52.961696 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960518 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:58:52.961696 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960522 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:58:52.961696 ip-10-0-130-98 kubenswrapper[2575]: 
W0416 13:58:52.960525 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:58:52.961696 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960529 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:58:52.961696 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960533 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:58:52.961696 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960538 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 13:58:52.961696 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960542 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:58:52.962186 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960546 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:58:52.962186 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960551 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:58:52.962186 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960554 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 13:58:52.962186 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960557 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:58:52.962186 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960560 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:58:52.962186 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960564 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:58:52.962186 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960567 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:58:52.962186 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960949 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:58:52.962186 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960955 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:58:52.962186 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960958 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:58:52.962186 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960960 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 13:58:52.962186 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960963 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:58:52.962186 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960966 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:58:52.962186 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960968 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:58:52.962186 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960971 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 13:58:52.962186 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960974 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:58:52.962186 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960977 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:58:52.962186 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960979 2575 feature_gate.go:328] unrecognized feature gate: 
CPMSMachineNamePrefix Apr 16 13:58:52.962186 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960982 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:58:52.962186 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960985 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:58:52.962668 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960987 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:58:52.962668 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960990 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:58:52.962668 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960993 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:58:52.962668 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960995 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:58:52.962668 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.960998 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:58:52.962668 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961001 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:58:52.962668 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961003 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:58:52.962668 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961006 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:58:52.962668 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961009 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:58:52.962668 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961011 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:58:52.962668 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961014 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:58:52.962668 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961018 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 13:58:52.962668 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961020 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:58:52.962668 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961023 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:58:52.962668 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961026 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:58:52.962668 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961028 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:58:52.962668 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961032 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:58:52.962668 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961035 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:58:52.962668 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961037 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:58:52.962668 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961040 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:58:52.963166 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961042 2575 
feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:58:52.963166 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961045 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:58:52.963166 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961048 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:58:52.963166 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961050 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:58:52.963166 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961053 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:58:52.963166 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961055 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:58:52.963166 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961058 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:58:52.963166 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961060 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:58:52.963166 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961063 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 13:58:52.963166 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961065 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:58:52.963166 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961068 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:58:52.963166 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961070 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:58:52.963166 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961073 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:58:52.963166 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961075 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:58:52.963166 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961078 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:58:52.963166 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961081 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 13:58:52.963166 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961083 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:58:52.963166 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961086 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:58:52.963166 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961088 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:58:52.963166 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961108 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:58:52.963647 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961112 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:58:52.963647 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961116 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:58:52.963647 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961121 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. 
It will be removed in a future release. Apr 16 13:58:52.963647 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961125 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:58:52.963647 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961130 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:58:52.963647 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961133 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 13:58:52.963647 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961137 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 13:58:52.963647 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961140 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:58:52.963647 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961143 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:58:52.963647 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961147 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:58:52.963647 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961150 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 13:58:52.963647 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961152 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:58:52.963647 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961155 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 13:58:52.963647 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961158 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:58:52.963647 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961161 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 13:58:52.963647 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961164 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:58:52.963647 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961167 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:58:52.963647 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961169 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 13:58:52.963647 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961172 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:58:52.964217 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961174 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:58:52.964217 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961177 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:58:52.964217 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961179 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:58:52.964217 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961182 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:58:52.964217 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961185 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:58:52.964217 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961187 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:58:52.964217 
ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961190 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:58:52.964217 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961192 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:58:52.964217 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961194 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:58:52.964217 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961197 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 13:58:52.964217 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961199 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 13:58:52.964217 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961202 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:58:52.964217 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961205 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:58:52.964217 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.961207 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:58:52.964217 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962611 2575 flags.go:64] FLAG: --address="0.0.0.0" Apr 16 13:58:52.964217 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962622 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 16 13:58:52.964217 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962630 2575 flags.go:64] FLAG: --anonymous-auth="true" Apr 16 13:58:52.964217 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962640 2575 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 16 13:58:52.964217 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962646 2575 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 16 13:58:52.964217 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962649 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 16 13:58:52.964217 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962654 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 16 13:58:52.964745 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962659 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 16 13:58:52.964745 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962663 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 16 13:58:52.964745 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962666 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 16 13:58:52.964745 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962670 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 16 13:58:52.964745 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962673 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 16 13:58:52.964745 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962676 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 16 13:58:52.964745 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962680 2575 flags.go:64] FLAG: --cgroup-root="" Apr 16 13:58:52.964745 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962683 2575 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 16 13:58:52.964745 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962686 2575 flags.go:64] FLAG: --client-ca-file="" Apr 16 13:58:52.964745 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962689 2575 
flags.go:64] FLAG: --cloud-config="" Apr 16 13:58:52.964745 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962692 2575 flags.go:64] FLAG: --cloud-provider="external" Apr 16 13:58:52.964745 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962695 2575 flags.go:64] FLAG: --cluster-dns="[]" Apr 16 13:58:52.964745 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962699 2575 flags.go:64] FLAG: --cluster-domain="" Apr 16 13:58:52.964745 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962702 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 16 13:58:52.964745 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962705 2575 flags.go:64] FLAG: --config-dir="" Apr 16 13:58:52.964745 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962708 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 16 13:58:52.964745 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962711 2575 flags.go:64] FLAG: --container-log-max-files="5" Apr 16 13:58:52.964745 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962715 2575 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 16 13:58:52.964745 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962718 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 16 13:58:52.964745 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962721 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 16 13:58:52.964745 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962725 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 16 13:58:52.964745 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962728 2575 flags.go:64] FLAG: --contention-profiling="false" Apr 16 13:58:52.964745 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962731 2575 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 16 13:58:52.964745 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962734 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 16 13:58:52.965341 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962737 2575 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 16 13:58:52.965341 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962740 2575 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 16 13:58:52.965341 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962745 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 16 13:58:52.965341 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962748 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 16 13:58:52.965341 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962751 2575 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 16 13:58:52.965341 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962754 2575 flags.go:64] FLAG: --enable-load-reader="false" Apr 16 13:58:52.965341 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962758 2575 flags.go:64] FLAG: --enable-server="true" Apr 16 13:58:52.965341 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962761 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 16 13:58:52.965341 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962766 2575 flags.go:64] FLAG: --event-burst="100" Apr 16 13:58:52.965341 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962770 2575 flags.go:64] FLAG: --event-qps="50" Apr 16 13:58:52.965341 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962773 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 16 13:58:52.965341 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962776 2575 
flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 16 13:58:52.965341 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962779 2575 flags.go:64] FLAG: --eviction-hard="" Apr 16 13:58:52.965341 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962783 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 16 13:58:52.965341 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962786 2575 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 16 13:58:52.965341 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962789 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 16 13:58:52.965341 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962792 2575 flags.go:64] FLAG: --eviction-soft="" Apr 16 13:58:52.965341 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962795 2575 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 13:58:52.965341 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962797 2575 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 13:58:52.965341 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962800 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 13:58:52.965341 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962803 2575 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 13:58:52.965341 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962806 2575 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 13:58:52.965341 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962810 2575 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 13:58:52.965341 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962813 2575 flags.go:64] FLAG: --feature-gates="" Apr 16 13:58:52.965341 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962821 2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 13:58:52.965944 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962824 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 13:58:52.965944 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962828 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 13:58:52.965944 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962832 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 13:58:52.965944 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962835 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 16 13:58:52.965944 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962838 2575 flags.go:64] FLAG: --help="false" Apr 16 13:58:52.965944 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962841 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-130-98.ec2.internal" Apr 16 13:58:52.965944 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962844 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 13:58:52.965944 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962847 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 13:58:52.965944 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962850 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 13:58:52.965944 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962854 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 13:58:52.965944 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962857 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 13:58:52.965944 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962861 2575 flags.go:64] 
FLAG: --image-gc-low-threshold="80" Apr 16 13:58:52.965944 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962864 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 13:58:52.965944 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962867 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 13:58:52.965944 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962871 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 13:58:52.965944 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962874 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 13:58:52.965944 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962877 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 13:58:52.965944 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962880 2575 flags.go:64] FLAG: --kube-reserved="" Apr 16 13:58:52.965944 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962882 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 13:58:52.965944 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962885 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 13:58:52.965944 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962889 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 13:58:52.965944 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962891 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 13:58:52.965944 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962894 2575 flags.go:64] FLAG: --lock-file="" Apr 16 13:58:52.965944 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962897 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 13:58:52.966575 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962900 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 13:58:52.966575 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962903 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 13:58:52.966575 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962912 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 13:58:52.966575 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962915 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 13:58:52.966575 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962919 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 13:58:52.966575 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962921 2575 flags.go:64] FLAG: --logging-format="text" Apr 16 13:58:52.966575 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962924 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 13:58:52.966575 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962928 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 13:58:52.966575 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962931 2575 flags.go:64] FLAG: --manifest-url="" Apr 16 13:58:52.966575 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962934 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 16 13:58:52.966575 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962938 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 13:58:52.966575 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962941 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 13:58:52.966575 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962945 2575 flags.go:64] FLAG: --max-pods="110" Apr 16 13:58:52.966575 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962948 2575 flags.go:64] FLAG: 
--maximum-dead-containers="-1" Apr 16 13:58:52.966575 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962951 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 13:58:52.966575 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962954 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 13:58:52.966575 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962957 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 13:58:52.966575 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962960 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 13:58:52.966575 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962963 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 13:58:52.966575 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962966 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 13:58:52.966575 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962974 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 13:58:52.966575 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962977 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 13:58:52.966575 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962980 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 13:58:52.966575 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962983 2575 flags.go:64] FLAG: --pod-cidr="" Apr 16 13:58:52.967167 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962986 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 13:58:52.967167 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962991 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 13:58:52.967167 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962995 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 13:58:52.967167 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.962998 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 16 13:58:52.967167 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963001 2575 flags.go:64] FLAG: --port="10250" Apr 16 13:58:52.967167 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963004 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 13:58:52.967167 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963006 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0cc192fb088b66b9e" Apr 16 13:58:52.967167 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963010 2575 flags.go:64] FLAG: --qos-reserved="" Apr 16 13:58:52.967167 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963013 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 16 13:58:52.967167 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963016 2575 flags.go:64] FLAG: --register-node="true" Apr 16 13:58:52.967167 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963019 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 16 13:58:52.967167 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963021 2575 flags.go:64] FLAG: --register-with-taints="" Apr 16 13:58:52.967167 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963025 2575 flags.go:64] FLAG: --registry-burst="10" Apr 16 13:58:52.967167 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963028 2575 flags.go:64] FLAG: --registry-qps="5" Apr 16 13:58:52.967167 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963030 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 16 13:58:52.967167 ip-10-0-130-98 
kubenswrapper[2575]: I0416 13:58:52.963033 2575 flags.go:64] FLAG: --reserved-memory="" Apr 16 13:58:52.967167 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963038 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 13:58:52.967167 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963041 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 13:58:52.967167 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963044 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 13:58:52.967167 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963047 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 13:58:52.967167 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963050 2575 flags.go:64] FLAG: --runonce="false" Apr 16 13:58:52.967167 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963053 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 13:58:52.967167 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963056 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 13:58:52.967167 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963059 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 16 13:58:52.967167 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963062 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 13:58:52.967800 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963065 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 13:58:52.967800 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963068 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 13:58:52.967800 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963071 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 13:58:52.967800 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963074 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 13:58:52.967800 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963077 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 13:58:52.967800 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963080 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 13:58:52.967800 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963083 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 13:58:52.967800 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963087 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 13:58:52.967800 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963105 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 13:58:52.967800 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963109 2575 flags.go:64] FLAG: --system-cgroups="" Apr 16 13:58:52.967800 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963112 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 13:58:52.967800 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963117 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 13:58:52.967800 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963121 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 16 13:58:52.967800 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963124 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 13:58:52.967800 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963128 2575 flags.go:64] FLAG: --tls-min-version="" Apr 16 13:58:52.967800 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963131 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 13:58:52.967800 ip-10-0-130-98 
kubenswrapper[2575]: I0416 13:58:52.963134 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 13:58:52.967800 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963137 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 13:58:52.967800 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963140 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 13:58:52.967800 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963143 2575 flags.go:64] FLAG: --v="2" Apr 16 13:58:52.967800 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963147 2575 flags.go:64] FLAG: --version="false" Apr 16 13:58:52.967800 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963151 2575 flags.go:64] FLAG: --vmodule="" Apr 16 13:58:52.967800 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963156 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 13:58:52.967800 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.963159 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 13:58:52.967800 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963253 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:58:52.968423 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963257 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:58:52.968423 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963260 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:58:52.968423 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963263 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:58:52.968423 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963266 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:58:52.968423 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963269 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:58:52.968423 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963273 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:58:52.968423 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963276 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 13:58:52.968423 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963279 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 13:58:52.968423 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963282 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:58:52.968423 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963285 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:58:52.968423 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963288 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:58:52.968423 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963290 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 13:58:52.968423 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963293 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:58:52.968423 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963296 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:58:52.968423 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963299 2575 feature_gate.go:328] unrecognized feature 
gate: NetworkDiagnosticsConfig Apr 16 13:58:52.968423 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963302 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 13:58:52.968423 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963305 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:58:52.968423 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963307 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:58:52.968423 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963310 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:58:52.968423 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963312 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:58:52.968909 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963315 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:58:52.968909 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963320 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:58:52.968909 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963323 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:58:52.968909 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963325 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:58:52.968909 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963328 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:58:52.968909 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963332 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 13:58:52.968909 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963336 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:58:52.968909 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963339 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 13:58:52.968909 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963342 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:58:52.968909 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963345 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:58:52.968909 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963348 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:58:52.968909 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963351 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:58:52.968909 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963353 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:58:52.968909 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963357 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:58:52.968909 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963359 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:58:52.968909 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963362 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 13:58:52.968909 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963364 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:58:52.968909 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963367 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:58:52.968909 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963370 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:58:52.969396 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963372 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:58:52.969396 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963374 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:58:52.969396 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963377 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:58:52.969396 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963383 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:58:52.969396 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963386 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:58:52.969396 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963388 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:58:52.969396 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963391 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:58:52.969396 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963394 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 13:58:52.969396 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963396 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:58:52.969396 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963399 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup 
Apr 16 13:58:52.969396 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963402 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:58:52.969396 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963405 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:58:52.969396 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963407 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:58:52.969396 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963410 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:58:52.969396 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963414 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 13:58:52.969396 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963417 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:58:52.969396 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963419 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:58:52.969396 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963422 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 13:58:52.969396 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963425 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:58:52.969396 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963427 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:58:52.969877 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963429 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 13:58:52.969877 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963432 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:58:52.969877 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963434 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:58:52.969877 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963437 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:58:52.969877 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963440 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:58:52.969877 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963443 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:58:52.969877 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963445 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:58:52.969877 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963448 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:58:52.969877 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963450 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 13:58:52.969877 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963453 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:58:52.969877 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963455 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:58:52.969877 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963458 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:58:52.969877 ip-10-0-130-98 
kubenswrapper[2575]: W0416 13:58:52.963460 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:58:52.969877 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963463 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:58:52.969877 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963465 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:58:52.969877 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963469 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 13:58:52.969877 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963472 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:58:52.969877 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963475 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:58:52.969877 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963477 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:58:52.969877 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963480 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:58:52.970373 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963482 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:58:52.970373 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963485 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:58:52.970373 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963488 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:58:52.970373 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963490 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:58:52.970373 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963494 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 13:58:52.970373 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.963498 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:58:52.970373 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.964016 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 13:58:52.970373 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.970258 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 13:58:52.970373 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.970275 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 13:58:52.970373 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970331 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
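
Each pass ends with a single authoritative `feature_gate.go:384` line giving the effective overrides; the warnings are noise, this 17-entry map is the outcome, and it repeats verbatim at every pass. A sketch that lifts the map out of such a line so passes can be compared programmatically — the line format is taken from this journal, not from any stable interface:

```go
package main

import (
	"fmt"
	"regexp"
	"strconv"
	"strings"
)

// parseGateSummary extracts the effective gate map from a
// `feature gates: {map[Name:bool ...]}` log line.
func parseGateSummary(line string) map[string]bool {
	m := regexp.MustCompile(`map\[(.*)\]`).FindStringSubmatch(line)
	gates := map[string]bool{}
	if m == nil {
		return gates
	}
	for _, kv := range strings.Fields(m[1]) {
		if name, val, ok := strings.Cut(kv, ":"); ok {
			b, _ := strconv.ParseBool(val)
			gates[name] = b
		}
	}
	return gates
}

func main() {
	line := `feature gates: {map[KMSv1:true NodeSwap:false ProcMountType:true]}`
	fmt.Println(parseGateSummary(line)) // map[KMSv1:true NodeSwap:false ProcMountType:true]
}
```
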
Apr 16 13:58:52.970373 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970339 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:58:52.970373 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970342 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:58:52.970373 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970345 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:58:52.970373 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970349 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:58:52.970373 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970352 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:58:52.970748 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970355 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:58:52.970748 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970357 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 13:58:52.970748 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970360 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:58:52.970748 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970363 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:58:52.970748 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970366 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 13:58:52.970748 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970368 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:58:52.970748 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970371 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 13:58:52.970748 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970374 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 13:58:52.970748 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970378 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:58:52.970748 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970381 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:58:52.970748 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970384 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:58:52.970748 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970387 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:58:52.970748 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970389 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 13:58:52.970748 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970392 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:58:52.970748 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970395 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 13:58:52.970748 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970397 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:58:52.970748 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970400 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:58:52.970748 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970402 2575 feature_gate.go:328] 
unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:58:52.970748 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970405 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:58:52.970748 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970407 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:58:52.971284 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970410 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:58:52.971284 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970413 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:58:52.971284 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970415 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:58:52.971284 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970418 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:58:52.971284 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970420 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:58:52.971284 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970424 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:58:52.971284 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970427 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:58:52.971284 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970430 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:58:52.971284 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970432 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 13:58:52.971284 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970435 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:58:52.971284 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970437 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:58:52.971284 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970440 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:58:52.971284 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970442 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:58:52.971284 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970446 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:58:52.971284 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970448 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:58:52.971284 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970451 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 13:58:52.971284 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970454 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:58:52.971284 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970456 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:58:52.971284 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970459 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:58:52.971744 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970461 2575 feature_gate.go:328] unrecognized feature gate: 
GCPCustomAPIEndpointsInstall Apr 16 13:58:52.971744 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970464 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:58:52.971744 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970467 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 13:58:52.971744 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970469 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:58:52.971744 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970472 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 13:58:52.971744 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970474 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:58:52.971744 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970477 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:58:52.971744 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970479 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:58:52.971744 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970482 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:58:52.971744 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970484 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:58:52.971744 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970487 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:58:52.971744 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970490 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:58:52.971744 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970492 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:58:52.971744 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970495 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:58:52.971744 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970499 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
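
On current kubelets these overrides belong in the KubeletConfiguration file rather than on deprecated flags. A hedged sketch of building such a configuration programmatically, assuming the `k8s.io/kubelet/config/v1beta1` types and `sigs.k8s.io/yaml` are available as module dependencies; the gate values mirror the effective map above and the reserved values mirror the nodeConfig entry logged further down:

```go
package main

import (
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	kubeletv1beta1 "k8s.io/kubelet/config/v1beta1"
	"sigs.k8s.io/yaml"
)

func main() {
	cfg := kubeletv1beta1.KubeletConfiguration{
		TypeMeta: metav1.TypeMeta{
			APIVersion: "kubelet.config.k8s.io/v1beta1",
			Kind:       "KubeletConfiguration",
		},
		CgroupDriver:  "systemd",
		StaticPodPath: "/etc/kubernetes/manifests",
		// Mirrors the SystemReserved block in the nodeConfig entry below.
		SystemReserved: map[string]string{
			"cpu":               "500m",
			"memory":            "1Gi",
			"ephemeral-storage": "1Gi",
		},
		// Explicit overrides corresponding to the effective gate map.
		FeatureGates: map[string]bool{
			"KMSv1":       true,
			"NodeSwap":    false,
			"ImageVolume": true,
		},
	}
	out, err := yaml.Marshal(cfg)
	if err != nil {
		panic(err)
	}
	fmt.Print(string(out))
}
```

Note that on this node the cgroup driver was actually negotiated from the CRI runtime (see the `server.go:1452` entry below), so a cgroupDriver setting in the file is illustrative rather than a description of this boot.
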
Apr 16 13:58:52.971744 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970503 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:58:52.971744 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970506 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:58:52.971744 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970508 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:58:52.971744 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970511 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:58:52.972215 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970514 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:58:52.972215 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970517 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:58:52.972215 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970520 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:58:52.972215 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970522 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:58:52.972215 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970526 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:58:52.972215 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970528 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:58:52.972215 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970531 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:58:52.972215 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970534 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:58:52.972215 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970536 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:58:52.972215 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970539 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:58:52.972215 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970542 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 13:58:52.972215 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970544 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:58:52.972215 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970547 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:58:52.972215 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970549 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:58:52.972215 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970552 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 13:58:52.972215 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970555 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:58:52.972215 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970557 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:58:52.972215 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970560 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:58:52.972215 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970562 2575 feature_gate.go:328] unrecognized feature gate: 
MachineConfigNodes Apr 16 13:58:52.972215 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970565 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:58:52.972215 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970568 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:58:52.972728 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970570 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:58:52.972728 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.970575 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 13:58:52.972728 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970675 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:58:52.972728 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970679 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 13:58:52.972728 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970682 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:58:52.972728 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970686 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:58:52.972728 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970688 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:58:52.972728 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970691 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:58:52.972728 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970694 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 13:58:52.972728 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970697 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:58:52.972728 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970699 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:58:52.972728 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970702 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:58:52.972728 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970711 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:58:52.972728 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970714 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:58:52.972728 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970716 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 13:58:52.973111 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970719 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 13:58:52.973111 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970722 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:58:52.973111 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970725 2575 
feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:58:52.973111 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970729 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 13:58:52.973111 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970732 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:58:52.973111 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970735 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:58:52.973111 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970738 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:58:52.973111 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970741 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:58:52.973111 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970743 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 13:58:52.973111 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970746 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:58:52.973111 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970749 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:58:52.973111 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970751 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 13:58:52.973111 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970754 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:58:52.973111 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970757 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:58:52.973111 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970760 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:58:52.973111 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970762 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:58:52.973111 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970765 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:58:52.973111 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970767 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:58:52.973111 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970770 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:58:52.973111 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970772 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:58:52.973588 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970775 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:58:52.973588 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970777 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:58:52.973588 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970780 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:58:52.973588 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970782 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:58:52.973588 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970785 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 
13:58:52.973588 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970787 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:58:52.973588 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970790 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:58:52.973588 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970792 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:58:52.973588 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970795 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:58:52.973588 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970797 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:58:52.973588 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970801 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:58:52.973588 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970803 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:58:52.973588 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970806 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:58:52.973588 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970809 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:58:52.973588 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970811 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:58:52.973588 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970814 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:58:52.973588 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970817 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:58:52.973588 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970819 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:58:52.973588 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970822 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:58:52.973588 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970824 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:58:52.974064 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970827 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:58:52.974064 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970829 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:58:52.974064 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970832 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 13:58:52.974064 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970835 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:58:52.974064 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970837 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 13:58:52.974064 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970840 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:58:52.974064 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970842 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:58:52.974064 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970845 2575 
feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 13:58:52.974064 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970847 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:58:52.974064 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970850 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:58:52.974064 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970853 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:58:52.974064 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970855 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:58:52.974064 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970859 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 13:58:52.974064 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970863 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:58:52.974064 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970866 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:58:52.974064 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970869 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:58:52.974064 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970872 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:58:52.974064 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970875 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:58:52.974064 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970878 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:58:52.974548 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970881 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:58:52.974548 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970883 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:58:52.974548 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970886 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 13:58:52.974548 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970889 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:58:52.974548 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970892 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:58:52.974548 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970894 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:58:52.974548 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970897 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:58:52.974548 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970899 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 13:58:52.974548 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970903 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:58:52.974548 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970905 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 13:58:52.974548 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970907 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 
13:58:52.974548 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970910 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:58:52.974548 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970912 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:58:52.974548 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:52.970915 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:58:52.974548 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.970919 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 13:58:52.974548 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.971584 2575 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 13:58:52.974928 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.974062 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 13:58:52.974928 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.974871 2575 server.go:1019] "Starting client certificate rotation" Apr 16 13:58:52.974986 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.974961 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 13:58:52.975016 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.975002 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 13:58:52.996731 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.996712 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 13:58:52.999065 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:52.999046 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 13:58:53.015964 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.015943 2575 log.go:25] "Validated CRI v1 runtime API" Apr 16 13:58:53.022451 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.022434 2575 log.go:25] "Validated CRI v1 image API" Apr 16 13:58:53.024385 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.024367 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 13:58:53.027522 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.027500 2575 fs.go:135] Filesystem UUIDs: map[382eeb73-cfe3-42c8-bc4d-f6f996c00ca4:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 883ec744-4b3d-48c1-b213-de374c67376b:/dev/nvme0n1p4] Apr 16 13:58:53.027599 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.027521 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} 
composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 13:58:53.029353 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.029338 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 13:58:53.033515 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.033411 2575 manager.go:217] Machine: {Timestamp:2026-04-16 13:58:53.031243247 +0000 UTC m=+0.367343061 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100180 MemoryCapacity:33164480512 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a3700ac404acce25b8f6365d53b24 SystemUUID:ec2a3700-ac40-4acc-e25b-8f6365d53b24 BootID:907b3c9b-922f-4e1f-b440-beab78ecebda Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582238208 Type:vfs Inodes:4048398 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:3a:36:48:ba:d5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:3a:36:48:ba:d5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:2e:3d:fa:4d:df:86 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164480512 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 13:58:53.033515 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.033515 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
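
The `manager.go:217` Machine entry above reports MemoryCapacity 33164480512 bytes; combined with the reservations in the nodeConfig entry that follows (SystemReserved memory 1Gi, hard-eviction memory.available 100Mi, KubeReserved null), the node-allocatable arithmetic works out as below — a worked example of the standard allocatable formula, not output from this node:

```go
package main

import "fmt"

// Node allocatable under the kubelet's node-allocatable model:
// allocatable = capacity - kubeReserved - systemReserved - hardEvictionThreshold.
// Values are taken from the Machine and nodeConfig entries in this log;
// kubeReserved is null here and contributes nothing.
func main() {
	const (
		gi              = 1 << 30
		mi              = 1 << 20
		capacity        = 33164480512 // MemoryCapacity from manager.go:217
		systemReserved  = 1 * gi      // SystemReserved memory
		evictionHardMem = 100 * mi    // memory.available hard threshold
	)
	var allocatable int64 = capacity - systemReserved - evictionHardMem
	fmt.Printf("allocatable memory: %d bytes (~%.2f GiB)\n",
		allocatable, float64(allocatable)/gi) // 31985881088 bytes, ~29.79 GiB
}
```
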
Apr 16 13:58:53.033616 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.033592 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 13:58:53.035433 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.035413 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 13:58:53.035568 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.035437 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-98.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 13:58:53.035612 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.035577 2575 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 13:58:53.035612 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.035586 2575 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 13:58:53.035612 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.035600 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 13:58:53.037309 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.037298 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 13:58:53.039350 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.039340 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 16 13:58:53.039458 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.039450 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 13:58:53.042114 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.042105 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 16 13:58:53.042151 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.042123 2575 kubelet.go:386] "Adding static pod path" 
path="/etc/kubernetes/manifests" Apr 16 13:58:53.042151 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.042141 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 13:58:53.042151 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.042150 2575 kubelet.go:397] "Adding apiserver pod source" Apr 16 13:58:53.042229 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.042158 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 13:58:53.043319 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.043308 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 13:58:53.043356 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.043326 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 13:58:53.046239 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.046223 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-t5mv5" Apr 16 13:58:53.046621 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.046598 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 13:58:53.049057 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.049044 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 13:58:53.052880 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:53.052852 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 13:58:53.052963 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:53.052890 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-98.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 13:58:53.053495 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.053483 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 13:58:53.053541 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.053501 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 13:58:53.053541 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.053508 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 13:58:53.053541 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.053513 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 13:58:53.053541 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.053520 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 13:58:53.053541 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.053525 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 13:58:53.053541 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.053532 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 13:58:53.053755 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.053547 2575 plugins.go:616] "Loaded volume 
plugin" pluginName="kubernetes.io/downward-api" Apr 16 13:58:53.053755 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.053554 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 13:58:53.053755 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.053560 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 13:58:53.053755 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.053598 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 13:58:53.053755 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.053607 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 13:58:53.053755 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.053725 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-t5mv5" Apr 16 13:58:53.055896 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.055885 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 13:58:53.055896 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.055896 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 13:58:53.059336 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.059324 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 13:58:53.059380 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.059359 2575 server.go:1295] "Started kubelet" Apr 16 13:58:53.059474 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.059451 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 13:58:53.059531 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.059446 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 13:58:53.059531 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.059504 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 13:58:53.060332 ip-10-0-130-98 systemd[1]: Started Kubernetes Kubelet. Apr 16 13:58:53.060790 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.060528 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 13:58:53.066486 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.066464 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 16 13:58:53.069526 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:53.069505 2575 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 13:58:53.069822 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.069803 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 13:58:53.069918 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.069821 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 13:58:53.070943 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:53.070767 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-98.ec2.internal\" not found"
Apr 16 13:58:53.071011 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.070775 2575 factory.go:55] Registering systemd factory
Apr 16 13:58:53.071011 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.070958 2575 factory.go:223] Registration of the systemd container factory successfully
Apr 16 13:58:53.071164 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.071150 2575 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 13:58:53.071228 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.071214 2575 factory.go:153] Registering CRI-O factory
Apr 16 13:58:53.071228 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.071166 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 13:58:53.071313 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.071230 2575 factory.go:223] Registration of the crio container factory successfully
Apr 16 13:58:53.071313 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.071220 2575 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 13:58:53.071381 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.071334 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 13:58:53.071381 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.071357 2575 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 13:58:53.071381 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.071361 2575 factory.go:103] Registering Raw factory
Apr 16 13:58:53.071381 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.071366 2575 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 13:58:53.071381 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.071376 2575 manager.go:1196] Started watching for new ooms in manager
Apr 16 13:58:53.071739 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.071723 2575 manager.go:319] Starting recovery of all containers
Apr 16 13:58:53.072028 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.072009 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:58:53.073897 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.073879 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-98.ec2.internal" not found
Apr 16 13:58:53.075858 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:53.075833 2575 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-130-98.ec2.internal\" not found" node="ip-10-0-130-98.ec2.internal"
Apr 16 13:58:53.081202 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.081187 2575 manager.go:324] Recovery completed
Apr 16 13:58:53.085257 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.085245 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:58:53.088323 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.088309 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:58:53.088399 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.088333 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:58:53.088399 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.088346 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:58:53.088771 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.088755 2575 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 13:58:53.088771 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.088768 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 13:58:53.088880 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.088786 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 13:58:53.088977 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.088965 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-98.ec2.internal" not found
Apr 16 13:58:53.092908 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.092893 2575 policy_none.go:49] "None policy: Start"
Apr 16 13:58:53.092985 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.092914 2575 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 13:58:53.092985 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.092928 2575 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 13:58:53.147584 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.129635 2575 manager.go:341] "Starting Device Plugin manager"
Apr 16 13:58:53.147584 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:53.129668 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 13:58:53.147584 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.129678 2575 server.go:85] "Starting device plugin registration server"
Apr 16 13:58:53.147584 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.129889 2575 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 13:58:53.147584 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.129901 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 13:58:53.147584 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.129978 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 13:58:53.147584 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.130051 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 13:58:53.147584 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.130058 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 13:58:53.147584 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:53.131162 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 13:58:53.147584 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:53.131193 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-98.ec2.internal\" not found"
Apr 16 13:58:53.147912 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.147885 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-98.ec2.internal" not found
Apr 16 13:58:53.200349 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.200290 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 13:58:53.201479 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.201457 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 13:58:53.201576 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.201486 2575 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 13:58:53.201576 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.201509 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 13:58:53.201576 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.201519 2575 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 13:58:53.201576 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:53.201556 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 13:58:53.206891 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.206871 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:58:53.230527 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.230508 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:58:53.231452 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.231436 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:58:53.231501 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.231467 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:58:53.231501 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.231480 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:58:53.231575 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.231502 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-98.ec2.internal"
Apr 16 13:58:53.239083 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.239069 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-98.ec2.internal"
Apr 16 13:58:53.239137 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:53.239089 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-98.ec2.internal\": node \"ip-10-0-130-98.ec2.internal\" not found"
Apr 16 13:58:53.250021 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:53.250000 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-98.ec2.internal\" not found"
Apr 16 13:58:53.302003 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.301959 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-130-98.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal"]
Apr 16 13:58:53.302068 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.302035 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:58:53.303103 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.303075 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:58:53.303153 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.303116 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:58:53.303153 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.303126 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:58:53.305419 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.305408 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:58:53.305552 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.305537 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-98.ec2.internal"
Apr 16 13:58:53.305602 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.305566 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:58:53.306107 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.306077 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:58:53.306177 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.306128 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:58:53.306177 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.306144 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:58:53.306245 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.306081 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:58:53.306245 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.306200 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:58:53.306245 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.306214 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:58:53.308863 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.308845 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal"
Apr 16 13:58:53.308941 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.308876 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:58:53.309549 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.309530 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:58:53.309646 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.309557 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:58:53.309646 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.309568 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:58:53.330717 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:53.330696 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-98.ec2.internal\" not found" node="ip-10-0-130-98.ec2.internal"
Apr 16 13:58:53.335026 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:53.335011 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-98.ec2.internal\" not found" node="ip-10-0-130-98.ec2.internal"
Apr 16 13:58:53.350685 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:53.350661 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-98.ec2.internal\" not found"
Apr 16 13:58:53.373058 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.373027 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/067b94d6835fdbad399c48af24ac5253-config\") pod \"kube-apiserver-proxy-ip-10-0-130-98.ec2.internal\" (UID: \"067b94d6835fdbad399c48af24ac5253\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-98.ec2.internal"
Apr 16 13:58:53.373153 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.373060 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/04f07c727e9c02d9599f1c80512fcf38-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal\" (UID: \"04f07c727e9c02d9599f1c80512fcf38\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal"
Apr 16 13:58:53.373153 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.373089 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/04f07c727e9c02d9599f1c80512fcf38-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal\" (UID: \"04f07c727e9c02d9599f1c80512fcf38\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal"
Apr 16 13:58:53.451627 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:53.451554 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-98.ec2.internal\" not found"
Apr 16 13:58:53.473972 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.473951 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/067b94d6835fdbad399c48af24ac5253-config\") pod \"kube-apiserver-proxy-ip-10-0-130-98.ec2.internal\" (UID: \"067b94d6835fdbad399c48af24ac5253\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-98.ec2.internal"
Apr 16 13:58:53.474044 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.473976 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/04f07c727e9c02d9599f1c80512fcf38-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal\" (UID: \"04f07c727e9c02d9599f1c80512fcf38\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal"
Apr 16 13:58:53.474044 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.473993 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/04f07c727e9c02d9599f1c80512fcf38-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal\" (UID: \"04f07c727e9c02d9599f1c80512fcf38\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal"
Apr 16 13:58:53.474044 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.474030 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/04f07c727e9c02d9599f1c80512fcf38-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal\" (UID: \"04f07c727e9c02d9599f1c80512fcf38\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal"
Apr 16 13:58:53.474157 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.474059 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/067b94d6835fdbad399c48af24ac5253-config\") pod \"kube-apiserver-proxy-ip-10-0-130-98.ec2.internal\" (UID: \"067b94d6835fdbad399c48af24ac5253\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-98.ec2.internal"
Apr 16 13:58:53.474157 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.474067 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/04f07c727e9c02d9599f1c80512fcf38-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal\" (UID: \"04f07c727e9c02d9599f1c80512fcf38\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal"
Apr 16 13:58:53.552394 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:53.552358 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-98.ec2.internal\" not found"
Apr 16 13:58:53.633886 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.633862 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-98.ec2.internal"
Apr 16 13:58:53.638240 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.638221 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal"
Apr 16 13:58:53.652601 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:53.652581 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-98.ec2.internal\" not found"
Apr 16 13:58:53.753212 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:53.753138 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-98.ec2.internal\" not found"
Apr 16 13:58:53.853701 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:53.853671 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-98.ec2.internal\" not found"
Apr 16 13:58:53.913480 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.913458 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:58:53.941164 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.941143 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:58:53.970908 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.970878 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-98.ec2.internal"
Apr 16 13:58:53.975090 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.975076 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 13:58:53.975220 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.975204 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 13:58:53.975280 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.975215 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 13:58:53.975280 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.975223 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 13:58:53.975280 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:53.975238 2575 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://a204c63c2fbb3419fa8f72c3b490ac3f-d6beaa4520e874b8.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/kube-system/pods\": read tcp 10.0.130.98:48170->54.173.236.44:6443: use of closed network connection" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-98.ec2.internal"
Apr 16 13:58:53.975280 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.975261 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal"
Apr 16 13:58:53.975395 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.975249 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 13:58:53.991249 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:53.991231 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 13:58:54.042550 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.042498 2575 apiserver.go:52] "Watching apiserver"
Apr 16 13:58:54.049587 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.049569 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 13:58:54.049948 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.049930 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-8wwv4","openshift-dns/node-resolver-9gkrd","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal","openshift-multus/multus-additional-cni-plugins-g8chm","openshift-multus/multus-rts4v","openshift-multus/network-metrics-daemon-ckxrz","kube-system/konnectivity-agent-mt2qp","kube-system/kube-apiserver-proxy-ip-10-0-130-98.ec2.internal","openshift-image-registry/node-ca-54qjm","openshift-network-diagnostics/network-check-target-xtl4r","openshift-network-operator/iptables-alerter-r8xw2","openshift-ovn-kubernetes/ovnkube-node-cdqzf","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m77hq"]
Apr 16 13:58:54.054857 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.054837 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8wwv4"
Apr 16 13:58:54.054947 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.054932 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9gkrd"
Apr 16 13:58:54.056921 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.056895 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 13:53:53 +0000 UTC" deadline="2027-12-31 10:52:26.176324718 +0000 UTC"
Apr 16 13:58:54.056996 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.056921 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14972h53m32.119406635s"
Apr 16 13:58:54.057048 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.057033 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g8chm"
Apr 16 13:58:54.057364 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.057345 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-76pmw\""
Apr 16 13:58:54.057427 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.057378 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 13:58:54.057471 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.057378 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 13:58:54.057471 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.057461 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-9rq89\""
Apr 16 13:58:54.057536 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.057494 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 13:58:54.057593 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.057581 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 13:58:54.060358 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.060334 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 13:58:54.060663 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.060642 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 13:58:54.060731 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.060667 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 13:58:54.060731 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.060686 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rts4v"
Apr 16 13:58:54.060731 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.060706 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 13:58:54.060731 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.060713 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-hwxl9\""
Apr 16 13:58:54.060916 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.060852 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 13:58:54.062712 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.062696 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-jb5ss\""
Apr 16 13:58:54.062791 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.062712 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 13:58:54.063013 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.063001 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckxrz"
Apr 16 13:58:54.063076 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:54.063061 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckxrz" podUID="0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e"
Apr 16 13:58:54.065155 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.065142 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-mt2qp"
Apr 16 13:58:54.067181 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.067161 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 13:58:54.067280 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.067184 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 13:58:54.067280 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.067261 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-bpxb7\""
Apr 16 13:58:54.069429 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.069412 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-54qjm"
Apr 16 13:58:54.069513 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.069490 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtl4r"
Apr 16 13:58:54.069572 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:54.069531 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xtl4r" podUID="008eaa56-e238-402c-a6f7-ded4fd1a1572"
Apr 16 13:58:54.069900 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.069887 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 13:58:54.072145 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.072125 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 13:58:54.072251 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.072237 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-6gpk8\""
Apr 16 13:58:54.072303 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.072241 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 13:58:54.072303 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.072247 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 13:58:54.074250 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.074232 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-r8xw2"
Apr 16 13:58:54.076512 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.076489 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8gmv\" (UniqueName: \"kubernetes.io/projected/4c1f6aa1-4340-4463-8bc6-2ce795e54be0-kube-api-access-b8gmv\") pod \"node-ca-54qjm\" (UID: \"4c1f6aa1-4340-4463-8bc6-2ce795e54be0\") " pod="openshift-image-registry/node-ca-54qjm"
Apr 16 13:58:54.076599 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.076526 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z42w\" (UniqueName: \"kubernetes.io/projected/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-kube-api-access-6z42w\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v"
Apr 16 13:58:54.076599 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.076553 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 13:58:54.076599 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.076565 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 13:58:54.076599 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.076552 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4c1f6aa1-4340-4463-8bc6-2ce795e54be0-serviceca\") pod \"node-ca-54qjm\" (UID: \"4c1f6aa1-4340-4463-8bc6-2ce795e54be0\") " pod="openshift-image-registry/node-ca-54qjm"
Apr 16 13:58:54.076599 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.076595 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-pwlhx\""
Apr 16 13:58:54.076840 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.076602 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 13:58:54.076840 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.076607 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-etc-sysctl-conf\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4"
Apr 16 13:58:54.076840 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.076630 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-host\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4"
Apr 16 13:58:54.076840 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.076652 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e9cd8c72-8367-4db4-9cb0-b52863ecee83-hosts-file\") pod \"node-resolver-9gkrd\" (UID: \"e9cd8c72-8367-4db4-9cb0-b52863ecee83\") " pod="openshift-dns/node-resolver-9gkrd"
Apr 16 13:58:54.076840 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.076678 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6c2db70-a008-4b8a-b25c-881c2d8e9809-system-cni-dir\") pod \"multus-additional-cni-plugins-g8chm\" (UID: \"f6c2db70-a008-4b8a-b25c-881c2d8e9809\") " pod="openshift-multus/multus-additional-cni-plugins-g8chm"
Apr 16 13:58:54.076840 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.076724 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f6c2db70-a008-4b8a-b25c-881c2d8e9809-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g8chm\" (UID: \"f6c2db70-a008-4b8a-b25c-881c2d8e9809\") " pod="openshift-multus/multus-additional-cni-plugins-g8chm"
Apr 16 13:58:54.076840 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.076751 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-host-run-k8s-cni-cncf-io\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v"
Apr 16 13:58:54.076840 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.076805 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-etc-kubernetes\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v"
Apr 16 13:58:54.076840 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.076836 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs\") pod \"network-metrics-daemon-ckxrz\" (UID: \"0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e\") " pod="openshift-multus/network-metrics-daemon-ckxrz"
Apr 16 13:58:54.077202 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.076860 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-etc-modprobe-d\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4"
Apr 16 13:58:54.077202 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.076885 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-etc-sysctl-d\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4"
Apr 16 13:58:54.077202 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.076913 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-run\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4"
Apr 16 13:58:54.077202 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.076948 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f6c2db70-a008-4b8a-b25c-881c2d8e9809-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g8chm\" (UID: \"f6c2db70-a008-4b8a-b25c-881c2d8e9809\") " pod="openshift-multus/multus-additional-cni-plugins-g8chm"
Apr 16 13:58:54.077202 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.076966 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/50856c3f-d1b0-4ecc-9979-f3d585acf87b-agent-certs\") pod \"konnectivity-agent-mt2qp\" (UID: \"50856c3f-d1b0-4ecc-9979-f3d585acf87b\") " pod="kube-system/konnectivity-agent-mt2qp"
Apr 16 13:58:54.077202 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.076982 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-multus-cni-dir\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v"
Apr 16 13:58:54.077202 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077000 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-host-run-netns\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v"
Apr 16 13:58:54.077202 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077018 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-host-var-lib-cni-multus\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v"
Apr 16 13:58:54.077202 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077033 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-multus-daemon-config\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v"
Apr 16 13:58:54.077202 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077047 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6blns\" (UniqueName: \"kubernetes.io/projected/60a323f7-d2f9-401b-bb9d-cde57adebc7d-kube-api-access-6blns\") pod \"iptables-alerter-r8xw2\" (UID: \"60a323f7-d2f9-401b-bb9d-cde57adebc7d\") " pod="openshift-network-operator/iptables-alerter-r8xw2"
Apr 16 13:58:54.077202 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077066 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-etc-kubernetes\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4"
Apr 16 13:58:54.077202 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077082 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-etc-systemd\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4"
Apr 16 13:58:54.077202 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077130 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a9cea359-15c5-46e5-af4f-1f2198fc5b08-etc-tuned\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4"
Apr 16 13:58:54.077202 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077153 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-host-var-lib-cni-bin\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v"
Apr 16 13:58:54.077202 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077177 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-multus-socket-dir-parent\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v"
Apr 16 13:58:54.077202 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077192 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4szfz\" (UniqueName: \"kubernetes.io/projected/008eaa56-e238-402c-a6f7-ded4fd1a1572-kube-api-access-4szfz\") pod \"network-check-target-xtl4r\" (UID: \"008eaa56-e238-402c-a6f7-ded4fd1a1572\") " pod="openshift-network-diagnostics/network-check-target-xtl4r"
Apr 16 13:58:54.077914 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077226 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-var-lib-kubelet\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4"
Apr 16 13:58:54.077914 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077249 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e9cd8c72-8367-4db4-9cb0-b52863ecee83-tmp-dir\") pod \"node-resolver-9gkrd\" (UID: \"e9cd8c72-8367-4db4-9cb0-b52863ecee83\") " pod="openshift-dns/node-resolver-9gkrd"
Apr 16 13:58:54.077914 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077269 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjsqm\" (UniqueName: \"kubernetes.io/projected/e9cd8c72-8367-4db4-9cb0-b52863ecee83-kube-api-access-tjsqm\") pod \"node-resolver-9gkrd\" (UID: \"e9cd8c72-8367-4db4-9cb0-b52863ecee83\") " pod="openshift-dns/node-resolver-9gkrd"
Apr 16 13:58:54.077914 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077289 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-sys\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4"
Apr 16 13:58:54.077914 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077327 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-lib-modules\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4"
Apr 16 13:58:54.077914 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077354 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n696l\" (UniqueName: \"kubernetes.io/projected/a9cea359-15c5-46e5-af4f-1f2198fc5b08-kube-api-access-n696l\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4"
Apr 16 13:58:54.077914 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077389 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f6c2db70-a008-4b8a-b25c-881c2d8e9809-cni-binary-copy\") pod \"multus-additional-cni-plugins-g8chm\" (UID: \"f6c2db70-a008-4b8a-b25c-881c2d8e9809\") " pod="openshift-multus/multus-additional-cni-plugins-g8chm"
Apr 16 13:58:54.077914 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077398 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf"
Apr 16 13:58:54.077914 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077406 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5jcq\" (UniqueName: \"kubernetes.io/projected/f6c2db70-a008-4b8a-b25c-881c2d8e9809-kube-api-access-x5jcq\") pod \"multus-additional-cni-plugins-g8chm\" (UID: \"f6c2db70-a008-4b8a-b25c-881c2d8e9809\") " pod="openshift-multus/multus-additional-cni-plugins-g8chm"
Apr 16 13:58:54.077914 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077424 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-cni-binary-copy\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v"
Apr 16 13:58:54.077914 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077462 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-multus-conf-dir\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v"
Apr 16 13:58:54.077914 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077491 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/60a323f7-d2f9-401b-bb9d-cde57adebc7d-iptables-alerter-script\") pod \"iptables-alerter-r8xw2\" (UID: \"60a323f7-d2f9-401b-bb9d-cde57adebc7d\") " pod="openshift-network-operator/iptables-alerter-r8xw2"
Apr 16 13:58:54.077914 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077517 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/60a323f7-d2f9-401b-bb9d-cde57adebc7d-host-slash\") pod \"iptables-alerter-r8xw2\" (UID: \"60a323f7-d2f9-401b-bb9d-cde57adebc7d\") " pod="openshift-network-operator/iptables-alerter-r8xw2"
Apr 16 13:58:54.077914 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077549 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f6c2db70-a008-4b8a-b25c-881c2d8e9809-cnibin\") pod \"multus-additional-cni-plugins-g8chm\" (UID: \"f6c2db70-a008-4b8a-b25c-881c2d8e9809\") " pod="openshift-multus/multus-additional-cni-plugins-g8chm"
Apr 16 13:58:54.077914 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077576 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-os-release\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v"
Apr 16 13:58:54.077914 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077600 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-host-var-lib-kubelet\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v"
Apr 16 13:58:54.077914 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077629 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-etc-sysconfig\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4"
Apr 16 13:58:54.078786 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077652 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a9cea359-15c5-46e5-af4f-1f2198fc5b08-tmp\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4"
Apr 16 13:58:54.078786 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077676 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f6c2db70-a008-4b8a-b25c-881c2d8e9809-os-release\") pod \"multus-additional-cni-plugins-g8chm\" (UID: \"f6c2db70-a008-4b8a-b25c-881c2d8e9809\") " pod="openshift-multus/multus-additional-cni-plugins-g8chm"
Apr 16 13:58:54.078786 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077703 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f6c2db70-a008-4b8a-b25c-881c2d8e9809-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g8chm\" (UID: \"f6c2db70-a008-4b8a-b25c-881c2d8e9809\") " pod="openshift-multus/multus-additional-cni-plugins-g8chm"
Apr 16 13:58:54.078786 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077747 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/50856c3f-d1b0-4ecc-9979-f3d585acf87b-konnectivity-ca\") pod \"konnectivity-agent-mt2qp\" (UID: \"50856c3f-d1b0-4ecc-9979-f3d585acf87b\") " pod="kube-system/konnectivity-agent-mt2qp"
Apr 16 13:58:54.078786 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077788 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-system-cni-dir\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v"
Apr 16 13:58:54.078786 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077805 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-cnibin\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v"
Apr 16 13:58:54.078786 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077832 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-hostroot\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v"
Apr 16 13:58:54.078786 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077872 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-host-run-multus-certs\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v"
Apr 16 13:58:54.078786 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077908 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7szc\" (UniqueName: \"kubernetes.io/projected/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-kube-api-access-r7szc\") pod \"network-metrics-daemon-ckxrz\" (UID: \"0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e\") " pod="openshift-multus/network-metrics-daemon-ckxrz"
Apr 16 13:58:54.078786 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.077921 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c1f6aa1-4340-4463-8bc6-2ce795e54be0-host\") pod \"node-ca-54qjm\" (UID: \"4c1f6aa1-4340-4463-8bc6-2ce795e54be0\") " pod="openshift-image-registry/node-ca-54qjm"
Apr 16 13:58:54.079303 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.079289 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m77hq"
Apr 16 13:58:54.079674 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.079655 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 13:58:54.079750 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.079694 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 13:58:54.079921 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.079903 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-jp2kq\""
Apr 16 13:58:54.079986 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.079931 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 13:58:54.080579 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.080565 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 13:58:54.080631 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.080589 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 13:58:54.080631 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.080613 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 13:58:54.081338 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.081323 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 13:58:54.081408 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.081330 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 13:58:54.081839 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.081826 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-gcj25\""
Apr 16 13:58:54.081881 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.081831 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 13:58:54.084766 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.084752 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 13:58:54.103540 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.103516 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-wkfnz"
Apr 16 13:58:54.110909 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.110894 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-wkfnz"
Apr 16 13:58:54.172120 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.172084 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 13:58:54.178529 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.178511 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-etc-sysctl-conf\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4"
Apr 16 13:58:54.178606 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.178537 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6c2db70-a008-4b8a-b25c-881c2d8e9809-system-cni-dir\") pod \"multus-additional-cni-plugins-g8chm\" (UID: \"f6c2db70-a008-4b8a-b25c-881c2d8e9809\") " pod="openshift-multus/multus-additional-cni-plugins-g8chm"
Apr 16 13:58:54.178606 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.178557 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b8beb800-9073-4fd9-81db-5015390a964e-registration-dir\") pod \"aws-ebs-csi-driver-node-m77hq\" (UID: \"b8beb800-9073-4fd9-81db-5015390a964e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m77hq"
Apr 16 13:58:54.178606 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.178574 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6z42w\" (UniqueName: \"kubernetes.io/projected/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-kube-api-access-6z42w\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v"
Apr 16 13:58:54.178606 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.178594 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4c1f6aa1-4340-4463-8bc6-2ce795e54be0-serviceca\") pod \"node-ca-54qjm\" (UID: \"4c1f6aa1-4340-4463-8bc6-2ce795e54be0\") " pod="openshift-image-registry/node-ca-54qjm"
Apr 16 13:58:54.178746 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.178631 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6c2db70-a008-4b8a-b25c-881c2d8e9809-system-cni-dir\") pod \"multus-additional-cni-plugins-g8chm\" (UID: \"f6c2db70-a008-4b8a-b25c-881c2d8e9809\") " pod="openshift-multus/multus-additional-cni-plugins-g8chm"
Apr 16 13:58:54.178746 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.178631 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-systemd-units\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf"
Apr 16 13:58:54.178746 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.178675 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-host-run-netns\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf"
Apr 16 13:58:54.178746 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.178684 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-etc-sysctl-conf\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4"
Apr 16 13:58:54.178746 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.178700 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-run-openvswitch\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf"
Apr 16 13:58:54.178746 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.178716 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-node-log\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf"
Apr 16 13:58:54.178746 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.178732 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-host-cni-bin\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf"
Apr 16 13:58:54.178746 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.178746 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-host-cni-netd\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf"
Apr 16 13:58:54.179058 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.178766 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-host-run-k8s-cni-cncf-io\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v"
Apr 16 13:58:54.179058 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.178786 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs\") pod \"network-metrics-daemon-ckxrz\" (UID: \"0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e\") " pod="openshift-multus/network-metrics-daemon-ckxrz"
Apr 16 13:58:54.179058 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.178832 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-host-run-k8s-cni-cncf-io\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v"
Apr 16 13:58:54.179058 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:54.178861 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:58:54.179058 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.178865 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-run\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4"
Apr 16 13:58:54.179058 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.178889 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f6c2db70-a008-4b8a-b25c-881c2d8e9809-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g8chm\" (UID: \"f6c2db70-a008-4b8a-b25c-881c2d8e9809\") " pod="openshift-multus/multus-additional-cni-plugins-g8chm"
Apr 16 13:58:54.179058 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:54.178916 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs podName:0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e nodeName:}" failed. No retries permitted until 2026-04-16 13:58:54.678890435 +0000 UTC m=+2.014990235 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs") pod "network-metrics-daemon-ckxrz" (UID: "0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:58:54.179058 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.178937 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-run\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4"
Apr 16 13:58:54.179058 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.178939 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/50856c3f-d1b0-4ecc-9979-f3d585acf87b-agent-certs\") pod \"konnectivity-agent-mt2qp\" (UID: \"50856c3f-d1b0-4ecc-9979-f3d585acf87b\") " pod="kube-system/konnectivity-agent-mt2qp"
Apr 16 13:58:54.179058 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.178980 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-etc-openvswitch\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf"
Apr 16 13:58:54.179058 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179005 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf"
Apr 16 13:58:54.179058 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179030 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b8beb800-9073-4fd9-81db-5015390a964e-etc-selinux\") pod \"aws-ebs-csi-driver-node-m77hq\" (UID: \"b8beb800-9073-4fd9-81db-5015390a964e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m77hq"
Apr 16 13:58:54.179058 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179056 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-multus-cni-dir\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v"
Apr 16 13:58:54.179659 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179082 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-host-var-lib-cni-multus\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v"
Apr 16 13:58:54.179659 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179128 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-multus-daemon-config\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v"
Apr 16 13:58:54.179659 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179146 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-host-var-lib-cni-multus\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v"
Apr 16 13:58:54.179659 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179152 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-etc-systemd\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4"
Apr 16 13:58:54.179659 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179160 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-multus-cni-dir\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v"
Apr 16 13:58:54.179659 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179185 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tjsqm\" (UniqueName: \"kubernetes.io/projected/e9cd8c72-8367-4db4-9cb0-b52863ecee83-kube-api-access-tjsqm\") pod \"node-resolver-9gkrd\" (UID: \"e9cd8c72-8367-4db4-9cb0-b52863ecee83\") " pod="openshift-dns/node-resolver-9gkrd"
Apr 16 13:58:54.179659 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179191 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled.
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 13:58:54.179659 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179203 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-etc-systemd\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4" Apr 16 13:58:54.179659 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179284 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b8beb800-9073-4fd9-81db-5015390a964e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-m77hq\" (UID: \"b8beb800-9073-4fd9-81db-5015390a964e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m77hq" Apr 16 13:58:54.179659 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179314 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-host-var-lib-cni-bin\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v" Apr 16 13:58:54.179659 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179342 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-var-lib-kubelet\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4" Apr 16 13:58:54.179659 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179370 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z94r\" (UniqueName: \"kubernetes.io/projected/b8beb800-9073-4fd9-81db-5015390a964e-kube-api-access-9z94r\") pod \"aws-ebs-csi-driver-node-m77hq\" (UID: \"b8beb800-9073-4fd9-81db-5015390a964e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m77hq" Apr 16 13:58:54.179659 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179382 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-host-var-lib-cni-bin\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v" Apr 16 13:58:54.179659 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179397 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-sys\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4" Apr 16 13:58:54.179659 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179422 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-host-kubelet\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.179659 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179447 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-log-socket\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.179659 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179473 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-multus-conf-dir\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v" Apr 16 13:58:54.179659 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179498 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a2aa2b87-716c-4b0a-abde-a69d0c373e83-ovnkube-config\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.180462 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179500 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f6c2db70-a008-4b8a-b25c-881c2d8e9809-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g8chm\" (UID: \"f6c2db70-a008-4b8a-b25c-881c2d8e9809\") " pod="openshift-multus/multus-additional-cni-plugins-g8chm" Apr 16 13:58:54.180462 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179513 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4c1f6aa1-4340-4463-8bc6-2ce795e54be0-serviceca\") pod \"node-ca-54qjm\" (UID: \"4c1f6aa1-4340-4463-8bc6-2ce795e54be0\") " pod="openshift-image-registry/node-ca-54qjm" Apr 16 13:58:54.180462 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179521 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b8beb800-9073-4fd9-81db-5015390a964e-socket-dir\") pod \"aws-ebs-csi-driver-node-m77hq\" (UID: \"b8beb800-9073-4fd9-81db-5015390a964e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m77hq" Apr 16 13:58:54.180462 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179549 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-os-release\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v" Apr 16 13:58:54.180462 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179583 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a9cea359-15c5-46e5-af4f-1f2198fc5b08-tmp\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4" Apr 16 13:58:54.180462 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179646 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-os-release\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v" Apr 16 13:58:54.180462 ip-10-0-130-98 kubenswrapper[2575]: 
I0416 13:58:54.179645 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-sys\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4" Apr 16 13:58:54.180462 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179689 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-multus-conf-dir\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v" Apr 16 13:58:54.180462 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179712 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-var-lib-kubelet\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4" Apr 16 13:58:54.180462 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179728 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-system-cni-dir\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v" Apr 16 13:58:54.180462 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179764 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-cnibin\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v" Apr 16 13:58:54.180462 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179773 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-system-cni-dir\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v" Apr 16 13:58:54.180462 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179795 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-hostroot\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v" Apr 16 13:58:54.180462 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179817 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-cnibin\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v" Apr 16 13:58:54.180462 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179851 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r7szc\" (UniqueName: \"kubernetes.io/projected/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-kube-api-access-r7szc\") pod \"network-metrics-daemon-ckxrz\" (UID: \"0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e\") " pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 13:58:54.180462 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179869 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-multus-daemon-config\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v" Apr 16 13:58:54.180462 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179887 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c1f6aa1-4340-4463-8bc6-2ce795e54be0-host\") pod \"node-ca-54qjm\" (UID: \"4c1f6aa1-4340-4463-8bc6-2ce795e54be0\") " pod="openshift-image-registry/node-ca-54qjm" Apr 16 13:58:54.180462 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179853 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-hostroot\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v" Apr 16 13:58:54.181280 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179936 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8gmv\" (UniqueName: \"kubernetes.io/projected/4c1f6aa1-4340-4463-8bc6-2ce795e54be0-kube-api-access-b8gmv\") pod \"node-ca-54qjm\" (UID: \"4c1f6aa1-4340-4463-8bc6-2ce795e54be0\") " pod="openshift-image-registry/node-ca-54qjm" Apr 16 13:58:54.181280 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179945 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c1f6aa1-4340-4463-8bc6-2ce795e54be0-host\") pod \"node-ca-54qjm\" (UID: \"4c1f6aa1-4340-4463-8bc6-2ce795e54be0\") " pod="openshift-image-registry/node-ca-54qjm" Apr 16 13:58:54.181280 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.179984 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e9cd8c72-8367-4db4-9cb0-b52863ecee83-hosts-file\") pod \"node-resolver-9gkrd\" (UID: \"e9cd8c72-8367-4db4-9cb0-b52863ecee83\") " pod="openshift-dns/node-resolver-9gkrd" Apr 16 13:58:54.181280 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180134 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e9cd8c72-8367-4db4-9cb0-b52863ecee83-hosts-file\") pod \"node-resolver-9gkrd\" (UID: \"e9cd8c72-8367-4db4-9cb0-b52863ecee83\") " pod="openshift-dns/node-resolver-9gkrd" Apr 16 13:58:54.181280 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180143 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f6c2db70-a008-4b8a-b25c-881c2d8e9809-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g8chm\" (UID: \"f6c2db70-a008-4b8a-b25c-881c2d8e9809\") " pod="openshift-multus/multus-additional-cni-plugins-g8chm" Apr 16 13:58:54.181280 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180172 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-host\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4" Apr 16 13:58:54.181280 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180210 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-var-lib-openvswitch\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.181280 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180238 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-run-ovn\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.181280 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180263 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-host-run-ovn-kubernetes\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.181280 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180277 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-host\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4" Apr 16 13:58:54.181280 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180306 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a2aa2b87-716c-4b0a-abde-a69d0c373e83-ovnkube-script-lib\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.181280 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180351 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-etc-kubernetes\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v" Apr 16 13:58:54.181280 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180372 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-etc-modprobe-d\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4" Apr 16 13:58:54.181280 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180387 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-etc-sysctl-d\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4" Apr 16 13:58:54.181280 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180413 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b8beb800-9073-4fd9-81db-5015390a964e-sys-fs\") pod \"aws-ebs-csi-driver-node-m77hq\" (UID: \"b8beb800-9073-4fd9-81db-5015390a964e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m77hq" Apr 16 13:58:54.181280 ip-10-0-130-98 kubenswrapper[2575]: I0416 
13:58:54.180439 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-host-run-netns\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v" Apr 16 13:58:54.181280 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180447 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-etc-kubernetes\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v" Apr 16 13:58:54.182005 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180463 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6blns\" (UniqueName: \"kubernetes.io/projected/60a323f7-d2f9-401b-bb9d-cde57adebc7d-kube-api-access-6blns\") pod \"iptables-alerter-r8xw2\" (UID: \"60a323f7-d2f9-401b-bb9d-cde57adebc7d\") " pod="openshift-network-operator/iptables-alerter-r8xw2" Apr 16 13:58:54.182005 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180486 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-etc-modprobe-d\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4" Apr 16 13:58:54.182005 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180506 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-etc-sysctl-d\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4" Apr 16 13:58:54.182005 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180515 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-host-run-netns\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v" Apr 16 13:58:54.182005 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180510 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-etc-kubernetes\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4" Apr 16 13:58:54.182005 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180551 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f6c2db70-a008-4b8a-b25c-881c2d8e9809-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g8chm\" (UID: \"f6c2db70-a008-4b8a-b25c-881c2d8e9809\") " pod="openshift-multus/multus-additional-cni-plugins-g8chm" Apr 16 13:58:54.182005 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180577 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a9cea359-15c5-46e5-af4f-1f2198fc5b08-etc-tuned\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4" Apr 16 13:58:54.182005 ip-10-0-130-98 
kubenswrapper[2575]: I0416 13:58:54.180556 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-etc-kubernetes\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4" Apr 16 13:58:54.182005 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180608 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e9cd8c72-8367-4db4-9cb0-b52863ecee83-tmp-dir\") pod \"node-resolver-9gkrd\" (UID: \"e9cd8c72-8367-4db4-9cb0-b52863ecee83\") " pod="openshift-dns/node-resolver-9gkrd" Apr 16 13:58:54.182005 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180663 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f6c2db70-a008-4b8a-b25c-881c2d8e9809-cnibin\") pod \"multus-additional-cni-plugins-g8chm\" (UID: \"f6c2db70-a008-4b8a-b25c-881c2d8e9809\") " pod="openshift-multus/multus-additional-cni-plugins-g8chm" Apr 16 13:58:54.182005 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180706 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-multus-socket-dir-parent\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v" Apr 16 13:58:54.182005 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180733 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4szfz\" (UniqueName: \"kubernetes.io/projected/008eaa56-e238-402c-a6f7-ded4fd1a1572-kube-api-access-4szfz\") pod \"network-check-target-xtl4r\" (UID: \"008eaa56-e238-402c-a6f7-ded4fd1a1572\") " pod="openshift-network-diagnostics/network-check-target-xtl4r" Apr 16 13:58:54.182005 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180743 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f6c2db70-a008-4b8a-b25c-881c2d8e9809-cnibin\") pod \"multus-additional-cni-plugins-g8chm\" (UID: \"f6c2db70-a008-4b8a-b25c-881c2d8e9809\") " pod="openshift-multus/multus-additional-cni-plugins-g8chm" Apr 16 13:58:54.182005 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180829 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-multus-socket-dir-parent\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v" Apr 16 13:58:54.182005 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180888 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e9cd8c72-8367-4db4-9cb0-b52863ecee83-tmp-dir\") pod \"node-resolver-9gkrd\" (UID: \"e9cd8c72-8367-4db4-9cb0-b52863ecee83\") " pod="openshift-dns/node-resolver-9gkrd" Apr 16 13:58:54.182005 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180899 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f6c2db70-a008-4b8a-b25c-881c2d8e9809-cni-binary-copy\") pod \"multus-additional-cni-plugins-g8chm\" (UID: \"f6c2db70-a008-4b8a-b25c-881c2d8e9809\") " 
pod="openshift-multus/multus-additional-cni-plugins-g8chm" Apr 16 13:58:54.182005 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180937 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-host-slash\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.182676 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180963 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-run-systemd\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.182676 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.180986 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b8beb800-9073-4fd9-81db-5015390a964e-device-dir\") pod \"aws-ebs-csi-driver-node-m77hq\" (UID: \"b8beb800-9073-4fd9-81db-5015390a964e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m77hq" Apr 16 13:58:54.182676 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.181022 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-lib-modules\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4" Apr 16 13:58:54.182676 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.181072 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n696l\" (UniqueName: \"kubernetes.io/projected/a9cea359-15c5-46e5-af4f-1f2198fc5b08-kube-api-access-n696l\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4" Apr 16 13:58:54.182676 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.181114 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-lib-modules\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4" Apr 16 13:58:54.182676 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.181122 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x5jcq\" (UniqueName: \"kubernetes.io/projected/f6c2db70-a008-4b8a-b25c-881c2d8e9809-kube-api-access-x5jcq\") pod \"multus-additional-cni-plugins-g8chm\" (UID: \"f6c2db70-a008-4b8a-b25c-881c2d8e9809\") " pod="openshift-multus/multus-additional-cni-plugins-g8chm" Apr 16 13:58:54.182676 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.181154 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpfk8\" (UniqueName: \"kubernetes.io/projected/a2aa2b87-716c-4b0a-abde-a69d0c373e83-kube-api-access-vpfk8\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.182676 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.181209 2575 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-cni-binary-copy\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v" Apr 16 13:58:54.182676 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.181241 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/60a323f7-d2f9-401b-bb9d-cde57adebc7d-iptables-alerter-script\") pod \"iptables-alerter-r8xw2\" (UID: \"60a323f7-d2f9-401b-bb9d-cde57adebc7d\") " pod="openshift-network-operator/iptables-alerter-r8xw2" Apr 16 13:58:54.182676 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.181266 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/60a323f7-d2f9-401b-bb9d-cde57adebc7d-host-slash\") pod \"iptables-alerter-r8xw2\" (UID: \"60a323f7-d2f9-401b-bb9d-cde57adebc7d\") " pod="openshift-network-operator/iptables-alerter-r8xw2" Apr 16 13:58:54.182676 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.181296 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a2aa2b87-716c-4b0a-abde-a69d0c373e83-ovn-node-metrics-cert\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.182676 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.181325 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-host-var-lib-kubelet\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v" Apr 16 13:58:54.182676 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.181349 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-etc-sysconfig\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4" Apr 16 13:58:54.182676 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.181373 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f6c2db70-a008-4b8a-b25c-881c2d8e9809-os-release\") pod \"multus-additional-cni-plugins-g8chm\" (UID: \"f6c2db70-a008-4b8a-b25c-881c2d8e9809\") " pod="openshift-multus/multus-additional-cni-plugins-g8chm" Apr 16 13:58:54.182676 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.181377 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f6c2db70-a008-4b8a-b25c-881c2d8e9809-cni-binary-copy\") pod \"multus-additional-cni-plugins-g8chm\" (UID: \"f6c2db70-a008-4b8a-b25c-881c2d8e9809\") " pod="openshift-multus/multus-additional-cni-plugins-g8chm" Apr 16 13:58:54.182676 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.181401 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f6c2db70-a008-4b8a-b25c-881c2d8e9809-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g8chm\" (UID: \"f6c2db70-a008-4b8a-b25c-881c2d8e9809\") " 
pod="openshift-multus/multus-additional-cni-plugins-g8chm" Apr 16 13:58:54.182676 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.181429 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/50856c3f-d1b0-4ecc-9979-f3d585acf87b-konnectivity-ca\") pod \"konnectivity-agent-mt2qp\" (UID: \"50856c3f-d1b0-4ecc-9979-f3d585acf87b\") " pod="kube-system/konnectivity-agent-mt2qp" Apr 16 13:58:54.183261 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.181456 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a2aa2b87-716c-4b0a-abde-a69d0c373e83-env-overrides\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.183261 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.181460 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-host-var-lib-kubelet\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v" Apr 16 13:58:54.183261 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.181513 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-host-run-multus-certs\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v" Apr 16 13:58:54.183261 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.181496 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/60a323f7-d2f9-401b-bb9d-cde57adebc7d-host-slash\") pod \"iptables-alerter-r8xw2\" (UID: \"60a323f7-d2f9-401b-bb9d-cde57adebc7d\") " pod="openshift-network-operator/iptables-alerter-r8xw2" Apr 16 13:58:54.183261 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.181550 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f6c2db70-a008-4b8a-b25c-881c2d8e9809-os-release\") pod \"multus-additional-cni-plugins-g8chm\" (UID: \"f6c2db70-a008-4b8a-b25c-881c2d8e9809\") " pod="openshift-multus/multus-additional-cni-plugins-g8chm" Apr 16 13:58:54.183261 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.181603 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a9cea359-15c5-46e5-af4f-1f2198fc5b08-etc-sysconfig\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4" Apr 16 13:58:54.183261 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.181729 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-cni-binary-copy\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v" Apr 16 13:58:54.183261 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.181776 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-host-run-multus-certs\") pod \"multus-rts4v\" (UID: 
\"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v" Apr 16 13:58:54.183261 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.182110 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/50856c3f-d1b0-4ecc-9979-f3d585acf87b-konnectivity-ca\") pod \"konnectivity-agent-mt2qp\" (UID: \"50856c3f-d1b0-4ecc-9979-f3d585acf87b\") " pod="kube-system/konnectivity-agent-mt2qp" Apr 16 13:58:54.183261 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.182149 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/60a323f7-d2f9-401b-bb9d-cde57adebc7d-iptables-alerter-script\") pod \"iptables-alerter-r8xw2\" (UID: \"60a323f7-d2f9-401b-bb9d-cde57adebc7d\") " pod="openshift-network-operator/iptables-alerter-r8xw2" Apr 16 13:58:54.183261 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.182180 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f6c2db70-a008-4b8a-b25c-881c2d8e9809-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g8chm\" (UID: \"f6c2db70-a008-4b8a-b25c-881c2d8e9809\") " pod="openshift-multus/multus-additional-cni-plugins-g8chm" Apr 16 13:58:54.183261 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.182637 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a9cea359-15c5-46e5-af4f-1f2198fc5b08-tmp\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4" Apr 16 13:58:54.183261 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.182762 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/50856c3f-d1b0-4ecc-9979-f3d585acf87b-agent-certs\") pod \"konnectivity-agent-mt2qp\" (UID: \"50856c3f-d1b0-4ecc-9979-f3d585acf87b\") " pod="kube-system/konnectivity-agent-mt2qp" Apr 16 13:58:54.183261 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.182780 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a9cea359-15c5-46e5-af4f-1f2198fc5b08-etc-tuned\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4" Apr 16 13:58:54.185392 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:54.185373 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:58:54.185498 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:54.185396 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:58:54.185498 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:54.185414 2575 projected.go:194] Error preparing data for projected volume kube-api-access-4szfz for pod openshift-network-diagnostics/network-check-target-xtl4r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:58:54.185498 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:54.185492 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/008eaa56-e238-402c-a6f7-ded4fd1a1572-kube-api-access-4szfz podName:008eaa56-e238-402c-a6f7-ded4fd1a1572 nodeName:}" failed. No retries permitted until 2026-04-16 13:58:54.685474453 +0000 UTC m=+2.021574267 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-4szfz" (UniqueName: "kubernetes.io/projected/008eaa56-e238-402c-a6f7-ded4fd1a1572-kube-api-access-4szfz") pod "network-check-target-xtl4r" (UID: "008eaa56-e238-402c-a6f7-ded4fd1a1572") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:58:54.186818 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.186794 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z42w\" (UniqueName: \"kubernetes.io/projected/553f6f6d-061c-4a9d-9ef4-1bbfedfede51-kube-api-access-6z42w\") pod \"multus-rts4v\" (UID: \"553f6f6d-061c-4a9d-9ef4-1bbfedfede51\") " pod="openshift-multus/multus-rts4v" Apr 16 13:58:54.187184 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.187164 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjsqm\" (UniqueName: \"kubernetes.io/projected/e9cd8c72-8367-4db4-9cb0-b52863ecee83-kube-api-access-tjsqm\") pod \"node-resolver-9gkrd\" (UID: \"e9cd8c72-8367-4db4-9cb0-b52863ecee83\") " pod="openshift-dns/node-resolver-9gkrd" Apr 16 13:58:54.188114 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.188073 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6blns\" (UniqueName: \"kubernetes.io/projected/60a323f7-d2f9-401b-bb9d-cde57adebc7d-kube-api-access-6blns\") pod \"iptables-alerter-r8xw2\" (UID: \"60a323f7-d2f9-401b-bb9d-cde57adebc7d\") " pod="openshift-network-operator/iptables-alerter-r8xw2" Apr 16 13:58:54.188285 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.188270 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8gmv\" (UniqueName: \"kubernetes.io/projected/4c1f6aa1-4340-4463-8bc6-2ce795e54be0-kube-api-access-b8gmv\") pod \"node-ca-54qjm\" (UID: \"4c1f6aa1-4340-4463-8bc6-2ce795e54be0\") " pod="openshift-image-registry/node-ca-54qjm" Apr 16 13:58:54.188348 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.188322 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7szc\" (UniqueName: \"kubernetes.io/projected/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-kube-api-access-r7szc\") pod \"network-metrics-daemon-ckxrz\" (UID: \"0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e\") " pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 13:58:54.188465 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.188445 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n696l\" (UniqueName: \"kubernetes.io/projected/a9cea359-15c5-46e5-af4f-1f2198fc5b08-kube-api-access-n696l\") pod \"tuned-8wwv4\" (UID: \"a9cea359-15c5-46e5-af4f-1f2198fc5b08\") " pod="openshift-cluster-node-tuning-operator/tuned-8wwv4" Apr 16 13:58:54.188740 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.188725 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5jcq\" (UniqueName: \"kubernetes.io/projected/f6c2db70-a008-4b8a-b25c-881c2d8e9809-kube-api-access-x5jcq\") pod \"multus-additional-cni-plugins-g8chm\" (UID: \"f6c2db70-a008-4b8a-b25c-881c2d8e9809\") " pod="openshift-multus/multus-additional-cni-plugins-g8chm" Apr 16 
13:58:54.281924 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.281887 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a2aa2b87-716c-4b0a-abde-a69d0c373e83-ovnkube-config\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.281924 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.281931 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b8beb800-9073-4fd9-81db-5015390a964e-socket-dir\") pod \"aws-ebs-csi-driver-node-m77hq\" (UID: \"b8beb800-9073-4fd9-81db-5015390a964e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m77hq" Apr 16 13:58:54.282188 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.281960 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-var-lib-openvswitch\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.282188 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.281982 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-run-ovn\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.282188 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282003 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-host-run-ovn-kubernetes\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.282188 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282024 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a2aa2b87-716c-4b0a-abde-a69d0c373e83-ovnkube-script-lib\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.282188 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282049 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b8beb800-9073-4fd9-81db-5015390a964e-sys-fs\") pod \"aws-ebs-csi-driver-node-m77hq\" (UID: \"b8beb800-9073-4fd9-81db-5015390a964e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m77hq" Apr 16 13:58:54.282188 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282081 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b8beb800-9073-4fd9-81db-5015390a964e-socket-dir\") pod \"aws-ebs-csi-driver-node-m77hq\" (UID: \"b8beb800-9073-4fd9-81db-5015390a964e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m77hq" Apr 16 13:58:54.282188 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282086 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-run-ovn\") pod \"ovnkube-node-cdqzf\" 
(UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.282188 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282120 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-host-slash\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.282188 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282123 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-var-lib-openvswitch\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.282188 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282123 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-host-run-ovn-kubernetes\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.282188 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282149 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-run-systemd\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.282188 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282166 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-host-slash\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.282188 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282161 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b8beb800-9073-4fd9-81db-5015390a964e-sys-fs\") pod \"aws-ebs-csi-driver-node-m77hq\" (UID: \"b8beb800-9073-4fd9-81db-5015390a964e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m77hq" Apr 16 13:58:54.282188 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282189 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b8beb800-9073-4fd9-81db-5015390a964e-device-dir\") pod \"aws-ebs-csi-driver-node-m77hq\" (UID: \"b8beb800-9073-4fd9-81db-5015390a964e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m77hq" Apr 16 13:58:54.282188 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282187 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-run-systemd\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.282871 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282217 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpfk8\" (UniqueName: 
\"kubernetes.io/projected/a2aa2b87-716c-4b0a-abde-a69d0c373e83-kube-api-access-vpfk8\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.282871 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282243 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a2aa2b87-716c-4b0a-abde-a69d0c373e83-ovn-node-metrics-cert\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.282871 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282296 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a2aa2b87-716c-4b0a-abde-a69d0c373e83-env-overrides\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.282871 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282333 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b8beb800-9073-4fd9-81db-5015390a964e-registration-dir\") pod \"aws-ebs-csi-driver-node-m77hq\" (UID: \"b8beb800-9073-4fd9-81db-5015390a964e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m77hq" Apr 16 13:58:54.282871 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282385 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b8beb800-9073-4fd9-81db-5015390a964e-registration-dir\") pod \"aws-ebs-csi-driver-node-m77hq\" (UID: \"b8beb800-9073-4fd9-81db-5015390a964e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m77hq" Apr 16 13:58:54.282871 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282531 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a2aa2b87-716c-4b0a-abde-a69d0c373e83-ovnkube-config\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.282871 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282547 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b8beb800-9073-4fd9-81db-5015390a964e-device-dir\") pod \"aws-ebs-csi-driver-node-m77hq\" (UID: \"b8beb800-9073-4fd9-81db-5015390a964e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m77hq" Apr 16 13:58:54.282871 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282585 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-systemd-units\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.282871 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282628 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a2aa2b87-716c-4b0a-abde-a69d0c373e83-ovnkube-script-lib\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.282871 ip-10-0-130-98 kubenswrapper[2575]: 
I0416 13:58:54.282634 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-host-run-netns\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.282871 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282660 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-run-openvswitch\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.282871 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282679 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-systemd-units\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.282871 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282687 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-node-log\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.282871 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282690 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-host-run-netns\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.282871 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282704 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-run-openvswitch\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.282871 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282712 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-host-cni-bin\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.282871 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282740 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-node-log\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.283469 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282749 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-host-cni-bin\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.283469 ip-10-0-130-98 kubenswrapper[2575]: I0416 
13:58:54.282754 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-host-cni-netd\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.283469 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282768 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a2aa2b87-716c-4b0a-abde-a69d0c373e83-env-overrides\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.283469 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282786 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-host-cni-netd\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.283469 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282806 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-etc-openvswitch\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.283469 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282840 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.283469 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282850 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-etc-openvswitch\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.283469 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282870 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b8beb800-9073-4fd9-81db-5015390a964e-etc-selinux\") pod \"aws-ebs-csi-driver-node-m77hq\" (UID: \"b8beb800-9073-4fd9-81db-5015390a964e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m77hq" Apr 16 13:58:54.283469 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282892 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.283469 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282895 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b8beb800-9073-4fd9-81db-5015390a964e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-m77hq\" (UID: 
\"b8beb800-9073-4fd9-81db-5015390a964e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m77hq" Apr 16 13:58:54.283469 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282939 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9z94r\" (UniqueName: \"kubernetes.io/projected/b8beb800-9073-4fd9-81db-5015390a964e-kube-api-access-9z94r\") pod \"aws-ebs-csi-driver-node-m77hq\" (UID: \"b8beb800-9073-4fd9-81db-5015390a964e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m77hq" Apr 16 13:58:54.283469 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282942 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b8beb800-9073-4fd9-81db-5015390a964e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-m77hq\" (UID: \"b8beb800-9073-4fd9-81db-5015390a964e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m77hq" Apr 16 13:58:54.283469 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282945 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b8beb800-9073-4fd9-81db-5015390a964e-etc-selinux\") pod \"aws-ebs-csi-driver-node-m77hq\" (UID: \"b8beb800-9073-4fd9-81db-5015390a964e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m77hq" Apr 16 13:58:54.283469 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282965 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-host-kubelet\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.283469 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.282989 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-log-socket\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.283469 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.283021 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-host-kubelet\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.283469 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.283041 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a2aa2b87-716c-4b0a-abde-a69d0c373e83-log-socket\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.284380 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.284365 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a2aa2b87-716c-4b0a-abde-a69d0c373e83-ovn-node-metrics-cert\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.290147 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.290131 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z94r\" (UniqueName: 
\"kubernetes.io/projected/b8beb800-9073-4fd9-81db-5015390a964e-kube-api-access-9z94r\") pod \"aws-ebs-csi-driver-node-m77hq\" (UID: \"b8beb800-9073-4fd9-81db-5015390a964e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m77hq" Apr 16 13:58:54.290281 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.290230 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpfk8\" (UniqueName: \"kubernetes.io/projected/a2aa2b87-716c-4b0a-abde-a69d0c373e83-kube-api-access-vpfk8\") pod \"ovnkube-node-cdqzf\" (UID: \"a2aa2b87-716c-4b0a-abde-a69d0c373e83\") " pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.378190 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.378168 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8wwv4" Apr 16 13:58:54.390077 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.390055 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9gkrd" Apr 16 13:58:54.397708 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.397689 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g8chm" Apr 16 13:58:54.430235 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.430213 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rts4v" Apr 16 13:58:54.445766 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.445735 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-mt2qp" Apr 16 13:58:54.460628 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.460614 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-54qjm" Apr 16 13:58:54.475143 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.475129 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-r8xw2" Apr 16 13:58:54.499916 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.499898 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:58:54.505260 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.505242 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m77hq" Apr 16 13:58:54.662631 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:54.662596 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04f07c727e9c02d9599f1c80512fcf38.slice/crio-2d938e7136c558b5d3b48e74662a49c88eddf9367cc6dba04c64b5e2287757c5 WatchSource:0}: Error finding container 2d938e7136c558b5d3b48e74662a49c88eddf9367cc6dba04c64b5e2287757c5: Status 404 returned error can't find the container with id 2d938e7136c558b5d3b48e74662a49c88eddf9367cc6dba04c64b5e2287757c5 Apr 16 13:58:54.663127 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:54.663089 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod067b94d6835fdbad399c48af24ac5253.slice/crio-904fe019c29a1881442fa98422d273b81fee5cccba3e937604868ddef7b24694 WatchSource:0}: Error finding container 904fe019c29a1881442fa98422d273b81fee5cccba3e937604868ddef7b24694: Status 404 returned error can't find the container with id 904fe019c29a1881442fa98422d273b81fee5cccba3e937604868ddef7b24694 Apr 16 13:58:54.667486 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.667472 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 13:58:54.685835 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.685810 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4szfz\" (UniqueName: \"kubernetes.io/projected/008eaa56-e238-402c-a6f7-ded4fd1a1572-kube-api-access-4szfz\") pod \"network-check-target-xtl4r\" (UID: \"008eaa56-e238-402c-a6f7-ded4fd1a1572\") " pod="openshift-network-diagnostics/network-check-target-xtl4r" Apr 16 13:58:54.685917 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.685847 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs\") pod \"network-metrics-daemon-ckxrz\" (UID: \"0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e\") " pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 13:58:54.685985 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:54.685967 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:58:54.686027 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:54.685993 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:58:54.686027 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:54.686005 2575 projected.go:194] Error preparing data for projected volume kube-api-access-4szfz for pod openshift-network-diagnostics/network-check-target-xtl4r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:58:54.686027 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:54.686006 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:58:54.686142 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:54.686059 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs podName:0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e nodeName:}" failed. No retries permitted until 2026-04-16 13:58:55.686043443 +0000 UTC m=+3.022143246 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs") pod "network-metrics-daemon-ckxrz" (UID: "0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:58:54.686142 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:54.686072 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/008eaa56-e238-402c-a6f7-ded4fd1a1572-kube-api-access-4szfz podName:008eaa56-e238-402c-a6f7-ded4fd1a1572 nodeName:}" failed. No retries permitted until 2026-04-16 13:58:55.686066905 +0000 UTC m=+3.022166711 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-4szfz" (UniqueName: "kubernetes.io/projected/008eaa56-e238-402c-a6f7-ded4fd1a1572-kube-api-access-4szfz") pod "network-check-target-xtl4r" (UID: "008eaa56-e238-402c-a6f7-ded4fd1a1572") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:58:54.811133 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:54.811104 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60a323f7_d2f9_401b_bb9d_cde57adebc7d.slice/crio-9fc5441e5a978ae0cca34951ededce360a36867d8410d65779038d989db1c2fb WatchSource:0}: Error finding container 9fc5441e5a978ae0cca34951ededce360a36867d8410d65779038d989db1c2fb: Status 404 returned error can't find the container with id 9fc5441e5a978ae0cca34951ededce360a36867d8410d65779038d989db1c2fb Apr 16 13:58:54.819617 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:54.819595 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:58:54.852177 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:54.852148 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9cd8c72_8367_4db4_9cb0_b52863ecee83.slice/crio-6539f3a67765dd0fb43da179186b7a060dfd3114a20035bf1c13dce79b494de7 WatchSource:0}: Error finding container 6539f3a67765dd0fb43da179186b7a060dfd3114a20035bf1c13dce79b494de7: Status 404 returned error can't find the container with id 6539f3a67765dd0fb43da179186b7a060dfd3114a20035bf1c13dce79b494de7 Apr 16 13:58:54.870464 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:54.870440 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod553f6f6d_061c_4a9d_9ef4_1bbfedfede51.slice/crio-b6c34d09ddb0a995e09c1c4286402feb4a3f51e5135464dc678d1a6e4ea720c7 WatchSource:0}: Error finding container b6c34d09ddb0a995e09c1c4286402feb4a3f51e5135464dc678d1a6e4ea720c7: Status 404 returned error can't find the container with id b6c34d09ddb0a995e09c1c4286402feb4a3f51e5135464dc678d1a6e4ea720c7 Apr 16 13:58:54.875041 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:54.875023 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8beb800_9073_4fd9_81db_5015390a964e.slice/crio-5aeb82c34c06cd8da54608948edcb9e3d51df7b2b44b371ac64d87010e14cbc8 
WatchSource:0}: Error finding container 5aeb82c34c06cd8da54608948edcb9e3d51df7b2b44b371ac64d87010e14cbc8: Status 404 returned error can't find the container with id 5aeb82c34c06cd8da54608948edcb9e3d51df7b2b44b371ac64d87010e14cbc8 Apr 16 13:58:54.875566 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:54.875542 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50856c3f_d1b0_4ecc_9979_f3d585acf87b.slice/crio-aa9b262831dc39279254133f0192faa43024b0fc17d5997c6b7dd52497a355fc WatchSource:0}: Error finding container aa9b262831dc39279254133f0192faa43024b0fc17d5997c6b7dd52497a355fc: Status 404 returned error can't find the container with id aa9b262831dc39279254133f0192faa43024b0fc17d5997c6b7dd52497a355fc Apr 16 13:58:54.897197 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:54.897179 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6c2db70_a008_4b8a_b25c_881c2d8e9809.slice/crio-108654a2e544da2e1aa040ac22aa0da19306cea39a06b9386d01fbcb5b0f80f9 WatchSource:0}: Error finding container 108654a2e544da2e1aa040ac22aa0da19306cea39a06b9386d01fbcb5b0f80f9: Status 404 returned error can't find the container with id 108654a2e544da2e1aa040ac22aa0da19306cea39a06b9386d01fbcb5b0f80f9 Apr 16 13:58:54.922784 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:54.922764 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2aa2b87_716c_4b0a_abde_a69d0c373e83.slice/crio-256dd18e0983430427cd9ca30f93a5acebfa44187510a24ce8d117078d194727 WatchSource:0}: Error finding container 256dd18e0983430427cd9ca30f93a5acebfa44187510a24ce8d117078d194727: Status 404 returned error can't find the container with id 256dd18e0983430427cd9ca30f93a5acebfa44187510a24ce8d117078d194727 Apr 16 13:58:55.112923 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:55.112887 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:53:54 +0000 UTC" deadline="2028-01-08 07:19:33.411816828 +0000 UTC" Apr 16 13:58:55.112923 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:55.112913 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15161h20m38.298906304s" Apr 16 13:58:55.206816 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:55.206713 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rts4v" event={"ID":"553f6f6d-061c-4a9d-9ef4-1bbfedfede51","Type":"ContainerStarted","Data":"b6c34d09ddb0a995e09c1c4286402feb4a3f51e5135464dc678d1a6e4ea720c7"} Apr 16 13:58:55.208387 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:55.208348 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9gkrd" event={"ID":"e9cd8c72-8367-4db4-9cb0-b52863ecee83","Type":"ContainerStarted","Data":"6539f3a67765dd0fb43da179186b7a060dfd3114a20035bf1c13dce79b494de7"} Apr 16 13:58:55.210015 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:55.209976 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" event={"ID":"a2aa2b87-716c-4b0a-abde-a69d0c373e83","Type":"ContainerStarted","Data":"256dd18e0983430427cd9ca30f93a5acebfa44187510a24ce8d117078d194727"} Apr 16 13:58:55.212652 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:55.212628 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-r8xw2" event={"ID":"60a323f7-d2f9-401b-bb9d-cde57adebc7d","Type":"ContainerStarted","Data":"9fc5441e5a978ae0cca34951ededce360a36867d8410d65779038d989db1c2fb"} Apr 16 13:58:55.214063 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:55.214038 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal" event={"ID":"04f07c727e9c02d9599f1c80512fcf38","Type":"ContainerStarted","Data":"2d938e7136c558b5d3b48e74662a49c88eddf9367cc6dba04c64b5e2287757c5"} Apr 16 13:58:55.215687 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:55.215540 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-98.ec2.internal" event={"ID":"067b94d6835fdbad399c48af24ac5253","Type":"ContainerStarted","Data":"904fe019c29a1881442fa98422d273b81fee5cccba3e937604868ddef7b24694"} Apr 16 13:58:55.216562 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:55.216535 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g8chm" event={"ID":"f6c2db70-a008-4b8a-b25c-881c2d8e9809","Type":"ContainerStarted","Data":"108654a2e544da2e1aa040ac22aa0da19306cea39a06b9386d01fbcb5b0f80f9"} Apr 16 13:58:55.217943 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:55.217920 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mt2qp" event={"ID":"50856c3f-d1b0-4ecc-9979-f3d585acf87b","Type":"ContainerStarted","Data":"aa9b262831dc39279254133f0192faa43024b0fc17d5997c6b7dd52497a355fc"} Apr 16 13:58:55.219119 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:55.219085 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m77hq" event={"ID":"b8beb800-9073-4fd9-81db-5015390a964e","Type":"ContainerStarted","Data":"5aeb82c34c06cd8da54608948edcb9e3d51df7b2b44b371ac64d87010e14cbc8"} Apr 16 13:58:55.277353 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:55.277324 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9cea359_15c5_46e5_af4f_1f2198fc5b08.slice/crio-22a653ff26869c85b27350e0881024594148f601fc5ee052195a12d02ec17d41 WatchSource:0}: Error finding container 22a653ff26869c85b27350e0881024594148f601fc5ee052195a12d02ec17d41: Status 404 returned error can't find the container with id 22a653ff26869c85b27350e0881024594148f601fc5ee052195a12d02ec17d41 Apr 16 13:58:55.358978 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:58:55.358733 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c1f6aa1_4340_4463_8bc6_2ce795e54be0.slice/crio-243be0235dc358b212fffbe9fa743d319fc9c4bb6493f809628f1d0dcb916ff4 WatchSource:0}: Error finding container 243be0235dc358b212fffbe9fa743d319fc9c4bb6493f809628f1d0dcb916ff4: Status 404 returned error can't find the container with id 243be0235dc358b212fffbe9fa743d319fc9c4bb6493f809628f1d0dcb916ff4 Apr 16 13:58:55.405382 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:55.405357 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:58:55.650541 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:55.650508 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-frzm2"] Apr 16 13:58:55.650968 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:55.650943 2575 
predicate.go:212] "Predicate failed on Pod" pod="kube-system/global-pull-secret-syncer-frzm2" err="Predicate NodeAffinity failed: node(s) didn't match Pod's node affinity/selector" Apr 16 13:58:55.651079 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:55.650966 2575 kubelet.go:2420] "Pod admission denied" podUID="a2407fe4-0a3f-4c30-a00b-5916f1e0b938" pod="kube-system/global-pull-secret-syncer-frzm2" reason="NodeAffinity" message="Predicate NodeAffinity failed: node(s) didn't match Pod's node affinity/selector" Apr 16 13:58:55.685380 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:55.685317 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kube-system/global-pull-secret-syncer-frzm2"] Apr 16 13:58:55.685529 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:55.685426 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frzm2" Apr 16 13:58:55.693019 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:55.692993 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4szfz\" (UniqueName: \"kubernetes.io/projected/008eaa56-e238-402c-a6f7-ded4fd1a1572-kube-api-access-4szfz\") pod \"network-check-target-xtl4r\" (UID: \"008eaa56-e238-402c-a6f7-ded4fd1a1572\") " pod="openshift-network-diagnostics/network-check-target-xtl4r" Apr 16 13:58:55.693161 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:55.693053 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs\") pod \"network-metrics-daemon-ckxrz\" (UID: \"0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e\") " pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 13:58:55.693233 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:55.693204 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:58:55.693282 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:55.693257 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs podName:0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e nodeName:}" failed. No retries permitted until 2026-04-16 13:58:57.693238877 +0000 UTC m=+5.029338682 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs") pod "network-metrics-daemon-ckxrz" (UID: "0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:58:55.693612 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:55.693589 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kube-system/global-pull-secret-syncer-frzm2"] Apr 16 13:58:55.693679 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:55.693650 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:58:55.693679 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:55.693668 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:58:55.693781 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:55.693680 2575 projected.go:194] Error preparing data for projected volume kube-api-access-4szfz for pod openshift-network-diagnostics/network-check-target-xtl4r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:58:55.693781 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:55.693729 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/008eaa56-e238-402c-a6f7-ded4fd1a1572-kube-api-access-4szfz podName:008eaa56-e238-402c-a6f7-ded4fd1a1572 nodeName:}" failed. No retries permitted until 2026-04-16 13:58:57.693714503 +0000 UTC m=+5.029814309 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4szfz" (UniqueName: "kubernetes.io/projected/008eaa56-e238-402c-a6f7-ded4fd1a1572-kube-api-access-4szfz") pod "network-check-target-xtl4r" (UID: "008eaa56-e238-402c-a6f7-ded4fd1a1572") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:58:55.694410 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:55.694377 2575 status_manager.go:895] "Failed to get status for pod" podUID="a2407fe4-0a3f-4c30-a00b-5916f1e0b938" pod="kube-system/global-pull-secret-syncer-frzm2" err="pods \"global-pull-secret-syncer-frzm2\" is forbidden: User \"system:node:ip-10-0-130-98.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ip-10-0-130-98.ec2.internal' and this object" Apr 16 13:58:55.721635 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:55.721608 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-qqgqk"] Apr 16 13:58:55.722141 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:55.722122 2575 predicate.go:212] "Predicate failed on Pod" pod="kube-system/global-pull-secret-syncer-qqgqk" err="Predicate NodeAffinity failed: node(s) didn't match Pod's node affinity/selector" Apr 16 13:58:55.722245 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:55.722143 2575 kubelet.go:2420] "Pod admission denied" podUID="46a88e90-ae0c-43f3-b01f-d5960a090f90" pod="kube-system/global-pull-secret-syncer-qqgqk" reason="NodeAffinity" message="Predicate NodeAffinity failed: node(s) didn't match Pod's node affinity/selector" Apr 16 13:58:55.734407 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:55.734380 2575 status_manager.go:895] "Failed to get status for pod" podUID="a2407fe4-0a3f-4c30-a00b-5916f1e0b938" pod="kube-system/global-pull-secret-syncer-frzm2" err="pods \"global-pull-secret-syncer-frzm2\" is forbidden: User \"system:node:ip-10-0-130-98.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ip-10-0-130-98.ec2.internal' and this object" Apr 16 13:58:56.113514 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:56.113414 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:53:54 +0000 UTC" deadline="2028-01-13 07:31:04.329418956 +0000 UTC" Apr 16 13:58:56.113514 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:56.113452 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15281h32m8.215970933s" Apr 16 13:58:56.124326 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:56.124134 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:58:56.202504 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:56.202414 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtl4r" Apr 16 13:58:56.202674 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:56.202537 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xtl4r" podUID="008eaa56-e238-402c-a6f7-ded4fd1a1572" Apr 16 13:58:56.202674 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:56.202626 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 13:58:56.202782 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:56.202732 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckxrz" podUID="0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e" Apr 16 13:58:56.223107 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:56.223064 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-54qjm" event={"ID":"4c1f6aa1-4340-4463-8bc6-2ce795e54be0","Type":"ContainerStarted","Data":"243be0235dc358b212fffbe9fa743d319fc9c4bb6493f809628f1d0dcb916ff4"} Apr 16 13:58:56.225412 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:56.225385 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8wwv4" event={"ID":"a9cea359-15c5-46e5-af4f-1f2198fc5b08","Type":"ContainerStarted","Data":"22a653ff26869c85b27350e0881024594148f601fc5ee052195a12d02ec17d41"} Apr 16 13:58:56.745161 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:56.745126 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kube-system/global-pull-secret-syncer-qqgqk"] Apr 16 13:58:56.745345 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:56.745277 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qqgqk" Apr 16 13:58:56.749877 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:56.749853 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kube-system/global-pull-secret-syncer-qqgqk"] Apr 16 13:58:56.751747 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:56.751714 2575 status_manager.go:895] "Failed to get status for pod" podUID="a2407fe4-0a3f-4c30-a00b-5916f1e0b938" pod="kube-system/global-pull-secret-syncer-frzm2" err="pods \"global-pull-secret-syncer-frzm2\" is forbidden: User \"system:node:ip-10-0-130-98.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ip-10-0-130-98.ec2.internal' and this object" Apr 16 13:58:56.753468 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:56.753444 2575 status_manager.go:895] "Failed to get status for pod" podUID="46a88e90-ae0c-43f3-b01f-d5960a090f90" pod="kube-system/global-pull-secret-syncer-qqgqk" err="pods \"global-pull-secret-syncer-qqgqk\" is forbidden: User \"system:node:ip-10-0-130-98.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ip-10-0-130-98.ec2.internal' and this object" Apr 16 13:58:56.755228 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:56.755206 2575 status_manager.go:895] "Failed to get status for pod" podUID="a2407fe4-0a3f-4c30-a00b-5916f1e0b938" pod="kube-system/global-pull-secret-syncer-frzm2" err="pods \"global-pull-secret-syncer-frzm2\" is forbidden: User \"system:node:ip-10-0-130-98.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ip-10-0-130-98.ec2.internal' and this object" Apr 16 13:58:56.776579 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:56.776554 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-227h4"] Apr 16 13:58:56.779522 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:56.779503 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:58:56.779609 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:56.779585 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-227h4" podUID="e3464c02-724a-403f-a6b4-6482e6283147" Apr 16 13:58:56.782290 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:56.782265 2575 status_manager.go:895] "Failed to get status for pod" podUID="46a88e90-ae0c-43f3-b01f-d5960a090f90" pod="kube-system/global-pull-secret-syncer-qqgqk" err="pods \"global-pull-secret-syncer-qqgqk\" is forbidden: User \"system:node:ip-10-0-130-98.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ip-10-0-130-98.ec2.internal' and this object" Apr 16 13:58:56.795607 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:56.795577 2575 status_manager.go:895] "Failed to get status for pod" podUID="a2407fe4-0a3f-4c30-a00b-5916f1e0b938" pod="kube-system/global-pull-secret-syncer-frzm2" err="pods \"global-pull-secret-syncer-frzm2\" is forbidden: User \"system:node:ip-10-0-130-98.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ip-10-0-130-98.ec2.internal' and this object" Apr 16 13:58:56.802943 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:56.802890 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e3464c02-724a-403f-a6b4-6482e6283147-dbus\") pod \"global-pull-secret-syncer-227h4\" (UID: \"e3464c02-724a-403f-a6b4-6482e6283147\") " pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:58:56.802943 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:56.802930 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e3464c02-724a-403f-a6b4-6482e6283147-original-pull-secret\") pod \"global-pull-secret-syncer-227h4\" (UID: \"e3464c02-724a-403f-a6b4-6482e6283147\") " pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:58:56.803115 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:56.802964 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e3464c02-724a-403f-a6b4-6482e6283147-kubelet-config\") pod \"global-pull-secret-syncer-227h4\" (UID: \"e3464c02-724a-403f-a6b4-6482e6283147\") " pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:58:56.903774 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:56.903735 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e3464c02-724a-403f-a6b4-6482e6283147-dbus\") pod \"global-pull-secret-syncer-227h4\" (UID: \"e3464c02-724a-403f-a6b4-6482e6283147\") " pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:58:56.903971 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:56.903787 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e3464c02-724a-403f-a6b4-6482e6283147-original-pull-secret\") pod \"global-pull-secret-syncer-227h4\" (UID: \"e3464c02-724a-403f-a6b4-6482e6283147\") " pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:58:56.903971 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:56.903820 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e3464c02-724a-403f-a6b4-6482e6283147-kubelet-config\") pod \"global-pull-secret-syncer-227h4\" (UID: 
\"e3464c02-724a-403f-a6b4-6482e6283147\") " pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:58:56.903971 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:56.903916 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e3464c02-724a-403f-a6b4-6482e6283147-kubelet-config\") pod \"global-pull-secret-syncer-227h4\" (UID: \"e3464c02-724a-403f-a6b4-6482e6283147\") " pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:58:56.904211 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:56.904062 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e3464c02-724a-403f-a6b4-6482e6283147-dbus\") pod \"global-pull-secret-syncer-227h4\" (UID: \"e3464c02-724a-403f-a6b4-6482e6283147\") " pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:58:56.904211 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:56.904175 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:58:56.904313 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:56.904237 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3464c02-724a-403f-a6b4-6482e6283147-original-pull-secret podName:e3464c02-724a-403f-a6b4-6482e6283147 nodeName:}" failed. No retries permitted until 2026-04-16 13:58:57.404216578 +0000 UTC m=+4.740316383 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e3464c02-724a-403f-a6b4-6482e6283147-original-pull-secret") pod "global-pull-secret-syncer-227h4" (UID: "e3464c02-724a-403f-a6b4-6482e6283147") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:58:57.143031 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:57.142998 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:58:57.407612 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:57.407527 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e3464c02-724a-403f-a6b4-6482e6283147-original-pull-secret\") pod \"global-pull-secret-syncer-227h4\" (UID: \"e3464c02-724a-403f-a6b4-6482e6283147\") " pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:58:57.407810 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:57.407709 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:58:57.407810 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:57.407772 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3464c02-724a-403f-a6b4-6482e6283147-original-pull-secret podName:e3464c02-724a-403f-a6b4-6482e6283147 nodeName:}" failed. No retries permitted until 2026-04-16 13:58:58.407754899 +0000 UTC m=+5.743854701 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e3464c02-724a-403f-a6b4-6482e6283147-original-pull-secret") pod "global-pull-secret-syncer-227h4" (UID: "e3464c02-724a-403f-a6b4-6482e6283147") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:58:57.711150 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:57.710453 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4szfz\" (UniqueName: \"kubernetes.io/projected/008eaa56-e238-402c-a6f7-ded4fd1a1572-kube-api-access-4szfz\") pod \"network-check-target-xtl4r\" (UID: \"008eaa56-e238-402c-a6f7-ded4fd1a1572\") " pod="openshift-network-diagnostics/network-check-target-xtl4r" Apr 16 13:58:57.711150 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:57.710521 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs\") pod \"network-metrics-daemon-ckxrz\" (UID: \"0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e\") " pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 13:58:57.711150 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:57.710677 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:58:57.711150 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:57.710746 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs podName:0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e nodeName:}" failed. No retries permitted until 2026-04-16 13:59:01.71072795 +0000 UTC m=+9.046827756 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs") pod "network-metrics-daemon-ckxrz" (UID: "0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:58:57.711150 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:57.711068 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:58:57.711150 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:57.711088 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:58:57.711150 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:57.711113 2575 projected.go:194] Error preparing data for projected volume kube-api-access-4szfz for pod openshift-network-diagnostics/network-check-target-xtl4r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:58:57.711629 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:57.711160 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/008eaa56-e238-402c-a6f7-ded4fd1a1572-kube-api-access-4szfz podName:008eaa56-e238-402c-a6f7-ded4fd1a1572 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:01.711144343 +0000 UTC m=+9.047244150 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4szfz" (UniqueName: "kubernetes.io/projected/008eaa56-e238-402c-a6f7-ded4fd1a1572-kube-api-access-4szfz") pod "network-check-target-xtl4r" (UID: "008eaa56-e238-402c-a6f7-ded4fd1a1572") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:58:58.201856 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:58.201816 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtl4r" Apr 16 13:58:58.202343 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:58.201951 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xtl4r" podUID="008eaa56-e238-402c-a6f7-ded4fd1a1572" Apr 16 13:58:58.202343 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:58.201977 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:58:58.202343 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:58.201997 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 13:58:58.202343 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:58.202083 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-227h4" podUID="e3464c02-724a-403f-a6b4-6482e6283147" Apr 16 13:58:58.202343 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:58.202206 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckxrz" podUID="0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e" Apr 16 13:58:58.415748 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:58:58.415719 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e3464c02-724a-403f-a6b4-6482e6283147-original-pull-secret\") pod \"global-pull-secret-syncer-227h4\" (UID: \"e3464c02-724a-403f-a6b4-6482e6283147\") " pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:58:58.415927 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:58.415869 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:58:58.415996 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:58:58.415934 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3464c02-724a-403f-a6b4-6482e6283147-original-pull-secret podName:e3464c02-724a-403f-a6b4-6482e6283147 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:00.41591587 +0000 UTC m=+7.752015673 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e3464c02-724a-403f-a6b4-6482e6283147-original-pull-secret") pod "global-pull-secret-syncer-227h4" (UID: "e3464c02-724a-403f-a6b4-6482e6283147") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:00.202027 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:00.201991 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:59:00.202515 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:00.202143 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-227h4" podUID="e3464c02-724a-403f-a6b4-6482e6283147" Apr 16 13:59:00.202934 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:00.202614 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 13:59:00.202934 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:00.202713 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckxrz" podUID="0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e" Apr 16 13:59:00.202934 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:00.202786 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtl4r" Apr 16 13:59:00.202934 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:00.202852 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xtl4r" podUID="008eaa56-e238-402c-a6f7-ded4fd1a1572" Apr 16 13:59:00.432447 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:00.432409 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e3464c02-724a-403f-a6b4-6482e6283147-original-pull-secret\") pod \"global-pull-secret-syncer-227h4\" (UID: \"e3464c02-724a-403f-a6b4-6482e6283147\") " pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:59:00.432633 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:00.432556 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:00.432633 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:00.432620 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3464c02-724a-403f-a6b4-6482e6283147-original-pull-secret podName:e3464c02-724a-403f-a6b4-6482e6283147 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:04.432600715 +0000 UTC m=+11.768700517 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e3464c02-724a-403f-a6b4-6482e6283147-original-pull-secret") pod "global-pull-secret-syncer-227h4" (UID: "e3464c02-724a-403f-a6b4-6482e6283147") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:01.743669 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:01.743629 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4szfz\" (UniqueName: \"kubernetes.io/projected/008eaa56-e238-402c-a6f7-ded4fd1a1572-kube-api-access-4szfz\") pod \"network-check-target-xtl4r\" (UID: \"008eaa56-e238-402c-a6f7-ded4fd1a1572\") " pod="openshift-network-diagnostics/network-check-target-xtl4r" Apr 16 13:59:01.744154 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:01.743699 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs\") pod \"network-metrics-daemon-ckxrz\" (UID: \"0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e\") " pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 13:59:01.744154 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:01.743787 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:01.744154 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:01.743813 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:01.744154 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:01.743826 2575 projected.go:194] Error preparing data for projected volume kube-api-access-4szfz for pod openshift-network-diagnostics/network-check-target-xtl4r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:01.744154 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:01.743831 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:01.744154 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:01.743886 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs podName:0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e nodeName:}" failed. No retries permitted until 2026-04-16 13:59:09.743867558 +0000 UTC m=+17.079967358 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs") pod "network-metrics-daemon-ckxrz" (UID: "0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:01.744154 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:01.743904 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/008eaa56-e238-402c-a6f7-ded4fd1a1572-kube-api-access-4szfz podName:008eaa56-e238-402c-a6f7-ded4fd1a1572 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:09.743894564 +0000 UTC m=+17.079994370 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4szfz" (UniqueName: "kubernetes.io/projected/008eaa56-e238-402c-a6f7-ded4fd1a1572-kube-api-access-4szfz") pod "network-check-target-xtl4r" (UID: "008eaa56-e238-402c-a6f7-ded4fd1a1572") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:02.202397 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:02.202365 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 13:59:02.202575 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:02.202501 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckxrz" podUID="0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e" Apr 16 13:59:02.203175 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:02.202905 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtl4r" Apr 16 13:59:02.203175 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:02.203002 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xtl4r" podUID="008eaa56-e238-402c-a6f7-ded4fd1a1572" Apr 16 13:59:02.203175 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:02.203052 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:59:02.203175 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:02.203136 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-227h4" podUID="e3464c02-724a-403f-a6b4-6482e6283147" Apr 16 13:59:04.202801 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:04.202751 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:59:04.203276 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:04.202814 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtl4r" Apr 16 13:59:04.203276 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:04.202911 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-227h4" podUID="e3464c02-724a-403f-a6b4-6482e6283147" Apr 16 13:59:04.203276 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:04.202969 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xtl4r" podUID="008eaa56-e238-402c-a6f7-ded4fd1a1572" Apr 16 13:59:04.203276 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:04.203005 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 13:59:04.203276 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:04.203130 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckxrz" podUID="0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e" Apr 16 13:59:04.465079 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:04.465000 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e3464c02-724a-403f-a6b4-6482e6283147-original-pull-secret\") pod \"global-pull-secret-syncer-227h4\" (UID: \"e3464c02-724a-403f-a6b4-6482e6283147\") " pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:59:04.465259 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:04.465140 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:04.465259 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:04.465205 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3464c02-724a-403f-a6b4-6482e6283147-original-pull-secret podName:e3464c02-724a-403f-a6b4-6482e6283147 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:12.465187174 +0000 UTC m=+19.801286974 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e3464c02-724a-403f-a6b4-6482e6283147-original-pull-secret") pod "global-pull-secret-syncer-227h4" (UID: "e3464c02-724a-403f-a6b4-6482e6283147") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:06.201916 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:06.201884 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:59:06.202373 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:06.202108 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 13:59:06.202373 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:06.202119 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-227h4" podUID="e3464c02-724a-403f-a6b4-6482e6283147" Apr 16 13:59:06.202373 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:06.202140 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtl4r" Apr 16 13:59:06.202373 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:06.202221 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckxrz" podUID="0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e" Apr 16 13:59:06.202373 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:06.202293 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xtl4r" podUID="008eaa56-e238-402c-a6f7-ded4fd1a1572" Apr 16 13:59:08.202526 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:08.202494 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtl4r" Apr 16 13:59:08.202970 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:08.202494 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 13:59:08.202970 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:08.202603 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xtl4r" podUID="008eaa56-e238-402c-a6f7-ded4fd1a1572" Apr 16 13:59:08.202970 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:08.202711 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckxrz" podUID="0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e" Apr 16 13:59:08.202970 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:08.202494 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:59:08.202970 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:08.202800 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-227h4" podUID="e3464c02-724a-403f-a6b4-6482e6283147" Apr 16 13:59:09.803493 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:09.803457 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs\") pod \"network-metrics-daemon-ckxrz\" (UID: \"0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e\") " pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 13:59:09.803877 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:09.803530 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4szfz\" (UniqueName: \"kubernetes.io/projected/008eaa56-e238-402c-a6f7-ded4fd1a1572-kube-api-access-4szfz\") pod \"network-check-target-xtl4r\" (UID: \"008eaa56-e238-402c-a6f7-ded4fd1a1572\") " pod="openshift-network-diagnostics/network-check-target-xtl4r" Apr 16 13:59:09.803877 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:09.803615 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:09.803877 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:09.803632 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:09.803877 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:09.803646 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:09.803877 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:09.803656 2575 projected.go:194] Error preparing data for projected volume kube-api-access-4szfz for pod openshift-network-diagnostics/network-check-target-xtl4r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:09.803877 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:09.803687 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs podName:0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e nodeName:}" failed. No retries permitted until 2026-04-16 13:59:25.803669127 +0000 UTC m=+33.139768932 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs") pod "network-metrics-daemon-ckxrz" (UID: "0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:09.803877 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:09.803708 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/008eaa56-e238-402c-a6f7-ded4fd1a1572-kube-api-access-4szfz podName:008eaa56-e238-402c-a6f7-ded4fd1a1572 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:25.803700162 +0000 UTC m=+33.139799970 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4szfz" (UniqueName: "kubernetes.io/projected/008eaa56-e238-402c-a6f7-ded4fd1a1572-kube-api-access-4szfz") pod "network-check-target-xtl4r" (UID: "008eaa56-e238-402c-a6f7-ded4fd1a1572") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:10.202523 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:10.202492 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtl4r" Apr 16 13:59:10.202717 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:10.202601 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xtl4r" podUID="008eaa56-e238-402c-a6f7-ded4fd1a1572" Apr 16 13:59:10.202717 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:10.202611 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 13:59:10.202717 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:10.202633 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:59:10.202717 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:10.202710 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckxrz" podUID="0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e" Apr 16 13:59:10.202925 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:10.202789 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-227h4" podUID="e3464c02-724a-403f-a6b4-6482e6283147" Apr 16 13:59:12.202657 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:12.202631 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 13:59:12.202657 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:12.202648 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtl4r" Apr 16 13:59:12.203064 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:12.202668 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:59:12.203064 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:12.202762 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ckxrz" podUID="0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e" Apr 16 13:59:12.203064 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:12.202808 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xtl4r" podUID="008eaa56-e238-402c-a6f7-ded4fd1a1572" Apr 16 13:59:12.203064 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:12.202898 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-227h4" podUID="e3464c02-724a-403f-a6b4-6482e6283147" Apr 16 13:59:12.523182 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:12.522953 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e3464c02-724a-403f-a6b4-6482e6283147-original-pull-secret\") pod \"global-pull-secret-syncer-227h4\" (UID: \"e3464c02-724a-403f-a6b4-6482e6283147\") " pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:59:12.523287 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:12.523110 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:12.523350 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:12.523297 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3464c02-724a-403f-a6b4-6482e6283147-original-pull-secret podName:e3464c02-724a-403f-a6b4-6482e6283147 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:28.523277079 +0000 UTC m=+35.859376892 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e3464c02-724a-403f-a6b4-6482e6283147-original-pull-secret") pod "global-pull-secret-syncer-227h4" (UID: "e3464c02-724a-403f-a6b4-6482e6283147") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:13.254182 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:13.254011 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cdqzf_a2aa2b87-716c-4b0a-abde-a69d0c373e83/ovn-acl-logging/0.log" Apr 16 13:59:13.254714 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:13.254469 2575 generic.go:358] "Generic (PLEG): container finished" podID="a2aa2b87-716c-4b0a-abde-a69d0c373e83" containerID="a666a01335d750c5590601daaeecefc58269ff46de0367ac58b4863236628dab" exitCode=1 Apr 16 13:59:13.254714 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:13.254532 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" event={"ID":"a2aa2b87-716c-4b0a-abde-a69d0c373e83","Type":"ContainerStarted","Data":"3246914aa4d0758cd09dbb36d8e12610a59bddcad248c2ebe1c92063d3f8c1c1"} Apr 16 13:59:13.254714 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:13.254561 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" event={"ID":"a2aa2b87-716c-4b0a-abde-a69d0c373e83","Type":"ContainerStarted","Data":"97ad6b6a1d1545196a871cf7bdcf378ee6f1314dcfb4c51554bb28e4f3e7b633"} Apr 16 13:59:13.254714 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:13.254574 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" event={"ID":"a2aa2b87-716c-4b0a-abde-a69d0c373e83","Type":"ContainerStarted","Data":"dc53c3c69f08e1697ac9081e417ce34830ab7efb2a809f37844760d63c4eb740"} Apr 16 13:59:13.254714 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:13.254587 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" event={"ID":"a2aa2b87-716c-4b0a-abde-a69d0c373e83","Type":"ContainerStarted","Data":"5453256cc147598ebef85f54020971cfcfb30f44cfdbe91a17685ce9e7ea8b48"} Apr 16 13:59:13.254714 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:13.254609 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" event={"ID":"a2aa2b87-716c-4b0a-abde-a69d0c373e83","Type":"ContainerDied","Data":"a666a01335d750c5590601daaeecefc58269ff46de0367ac58b4863236628dab"} Apr 16 13:59:13.254714 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:13.254620 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" event={"ID":"a2aa2b87-716c-4b0a-abde-a69d0c373e83","Type":"ContainerStarted","Data":"9a64cf594e8ca14ae0a7983ed4a7645c957729e191b8d0207019310e9cda9234"} Apr 16 13:59:13.255768 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:13.255752 2575 generic.go:358] "Generic (PLEG): container finished" podID="04f07c727e9c02d9599f1c80512fcf38" containerID="ce101c964e9a03ea8b70ccd25dd3504979dbcfe60315ac98aea3d7d2ea61ad3f" exitCode=0 Apr 16 13:59:13.255843 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:13.255804 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal" event={"ID":"04f07c727e9c02d9599f1c80512fcf38","Type":"ContainerDied","Data":"ce101c964e9a03ea8b70ccd25dd3504979dbcfe60315ac98aea3d7d2ea61ad3f"} Apr 16 13:59:13.257078 ip-10-0-130-98 kubenswrapper[2575]: 
I0416 13:59:13.257057 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-98.ec2.internal" event={"ID":"067b94d6835fdbad399c48af24ac5253","Type":"ContainerStarted","Data":"c1620fb0d6c2e4ec0b38b32b9e82d616619725038ee55663ea897baf38d51154"} Apr 16 13:59:13.259473 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:13.259443 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-54qjm" event={"ID":"4c1f6aa1-4340-4463-8bc6-2ce795e54be0","Type":"ContainerStarted","Data":"eeddbf0ea8dcc4d4a59cf430208ddd168a2fd6fd43bce957b98eb6a269afd093"} Apr 16 13:59:13.260897 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:13.260878 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g8chm" event={"ID":"f6c2db70-a008-4b8a-b25c-881c2d8e9809","Type":"ContainerStarted","Data":"acee440ca2f8f16d79981917cf3ba19ecc1e1342ee85181010b98b9097da7fea"} Apr 16 13:59:13.262117 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:13.261986 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mt2qp" event={"ID":"50856c3f-d1b0-4ecc-9979-f3d585acf87b","Type":"ContainerStarted","Data":"3d94610d4fe401583286be4aa7a0cbfee9d147dd964ad11d366dd6bacab6c55f"} Apr 16 13:59:13.263402 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:13.263376 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m77hq" event={"ID":"b8beb800-9073-4fd9-81db-5015390a964e","Type":"ContainerStarted","Data":"a8cffde95875ecc750d1b6640956be5bfef716d1492981e437a0e2d3aebb29e4"} Apr 16 13:59:13.264721 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:13.264702 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rts4v" event={"ID":"553f6f6d-061c-4a9d-9ef4-1bbfedfede51","Type":"ContainerStarted","Data":"86a2c0a58d52bba57c3413747f03a964a32797033747d4ff03dbb32a7e7bac54"} Apr 16 13:59:13.265915 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:13.265897 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9gkrd" event={"ID":"e9cd8c72-8367-4db4-9cb0-b52863ecee83","Type":"ContainerStarted","Data":"16e7982ac1b658b47da895f3b7e16e41d6b3be747f0333cc1f3aa4e1c2b24fb9"} Apr 16 13:59:13.267056 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:13.267040 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8wwv4" event={"ID":"a9cea359-15c5-46e5-af4f-1f2198fc5b08","Type":"ContainerStarted","Data":"1c2cbae66a80f9b9e0958de3b67984719c5be955fb83878ed269c2a5896d04f7"} Apr 16 13:59:13.296314 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:13.296274 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-mt2qp" podStartSLOduration=3.031532762 podStartE2EDuration="20.296262745s" podCreationTimestamp="2026-04-16 13:58:53 +0000 UTC" firstStartedPulling="2026-04-16 13:58:54.877359076 +0000 UTC m=+2.213458875" lastFinishedPulling="2026-04-16 13:59:12.142089055 +0000 UTC m=+19.478188858" observedRunningTime="2026-04-16 13:59:13.296078379 +0000 UTC m=+20.632178200" watchObservedRunningTime="2026-04-16 13:59:13.296262745 +0000 UTC m=+20.632362567" Apr 16 13:59:13.296659 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:13.296636 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-98.ec2.internal" podStartSLOduration=20.29662946 
podStartE2EDuration="20.29662946s" podCreationTimestamp="2026-04-16 13:58:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:59:13.282517747 +0000 UTC m=+20.618617570" watchObservedRunningTime="2026-04-16 13:59:13.29662946 +0000 UTC m=+20.632729281" Apr 16 13:59:13.308813 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:13.308777 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9gkrd" podStartSLOduration=7.563456142 podStartE2EDuration="20.308766887s" podCreationTimestamp="2026-04-16 13:58:53 +0000 UTC" firstStartedPulling="2026-04-16 13:58:54.859903291 +0000 UTC m=+2.196003091" lastFinishedPulling="2026-04-16 13:59:07.605214025 +0000 UTC m=+14.941313836" observedRunningTime="2026-04-16 13:59:13.308616489 +0000 UTC m=+20.644716313" watchObservedRunningTime="2026-04-16 13:59:13.308766887 +0000 UTC m=+20.644866708" Apr 16 13:59:13.341123 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:13.341059 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-54qjm" podStartSLOduration=3.539446446 podStartE2EDuration="20.341047951s" podCreationTimestamp="2026-04-16 13:58:53 +0000 UTC" firstStartedPulling="2026-04-16 13:58:55.360931241 +0000 UTC m=+2.697031045" lastFinishedPulling="2026-04-16 13:59:12.162532738 +0000 UTC m=+19.498632550" observedRunningTime="2026-04-16 13:59:13.321676264 +0000 UTC m=+20.657776088" watchObservedRunningTime="2026-04-16 13:59:13.341047951 +0000 UTC m=+20.677147751" Apr 16 13:59:13.359438 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:13.359405 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-rts4v" podStartSLOduration=3.0113921 podStartE2EDuration="20.359396404s" podCreationTimestamp="2026-04-16 13:58:53 +0000 UTC" firstStartedPulling="2026-04-16 13:58:54.87236125 +0000 UTC m=+2.208461050" lastFinishedPulling="2026-04-16 13:59:12.220365538 +0000 UTC m=+19.556465354" observedRunningTime="2026-04-16 13:59:13.359139437 +0000 UTC m=+20.695239259" watchObservedRunningTime="2026-04-16 13:59:13.359396404 +0000 UTC m=+20.695496223" Apr 16 13:59:13.377393 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:13.377358 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-8wwv4" podStartSLOduration=8.051468918 podStartE2EDuration="20.377348323s" podCreationTimestamp="2026-04-16 13:58:53 +0000 UTC" firstStartedPulling="2026-04-16 13:58:55.27934268 +0000 UTC m=+2.615442503" lastFinishedPulling="2026-04-16 13:59:07.605222105 +0000 UTC m=+14.941321908" observedRunningTime="2026-04-16 13:59:13.376982308 +0000 UTC m=+20.713082129" watchObservedRunningTime="2026-04-16 13:59:13.377348323 +0000 UTC m=+20.713448145" Apr 16 13:59:14.092880 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:14.092853 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 13:59:14.141874 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:14.141791 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T13:59:14.092878578Z","UUID":"7649e95a-ec44-41e1-9ac9-4925113e9e6c","Handler":null,"Name":"","Endpoint":""} Apr 16 13:59:14.143423 ip-10-0-130-98 kubenswrapper[2575]: 
I0416 13:59:14.143404 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 13:59:14.143520 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:14.143433 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 13:59:14.201836 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:14.201703 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:59:14.201836 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:14.201722 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtl4r" Apr 16 13:59:14.201836 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:14.201707 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 13:59:14.201836 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:14.201827 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-227h4" podUID="e3464c02-724a-403f-a6b4-6482e6283147" Apr 16 13:59:14.202141 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:14.201906 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckxrz" podUID="0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e" Apr 16 13:59:14.202141 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:14.201966 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xtl4r" podUID="008eaa56-e238-402c-a6f7-ded4fd1a1572" Apr 16 13:59:14.270720 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:14.270688 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-r8xw2" event={"ID":"60a323f7-d2f9-401b-bb9d-cde57adebc7d","Type":"ContainerStarted","Data":"fea7f9be281f83a62fde01621b7119ac9e2180d599a8945e841bde65795cd65b"} Apr 16 13:59:14.272922 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:14.272646 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal" event={"ID":"04f07c727e9c02d9599f1c80512fcf38","Type":"ContainerStarted","Data":"674be7a3c52594f0750aa6ff6ef868471c6fc030a58d8e067894219765ce4e02"} Apr 16 13:59:14.274103 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:14.274072 2575 generic.go:358] "Generic (PLEG): container finished" podID="f6c2db70-a008-4b8a-b25c-881c2d8e9809" containerID="acee440ca2f8f16d79981917cf3ba19ecc1e1342ee85181010b98b9097da7fea" exitCode=0 Apr 16 13:59:14.274209 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:14.274125 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g8chm" event={"ID":"f6c2db70-a008-4b8a-b25c-881c2d8e9809","Type":"ContainerDied","Data":"acee440ca2f8f16d79981917cf3ba19ecc1e1342ee85181010b98b9097da7fea"} Apr 16 13:59:14.275790 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:14.275712 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m77hq" event={"ID":"b8beb800-9073-4fd9-81db-5015390a964e","Type":"ContainerStarted","Data":"d2beac825d54c1ecc6781e5235911cce48e1f2642b016181399f1724ff89ca08"} Apr 16 13:59:14.286276 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:14.286234 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-r8xw2" podStartSLOduration=13.296597549 podStartE2EDuration="21.286223249s" podCreationTimestamp="2026-04-16 13:58:53 +0000 UTC" firstStartedPulling="2026-04-16 13:58:54.817376508 +0000 UTC m=+2.153476311" lastFinishedPulling="2026-04-16 13:59:02.807002199 +0000 UTC m=+10.143102011" observedRunningTime="2026-04-16 13:59:14.28567576 +0000 UTC m=+21.621775582" watchObservedRunningTime="2026-04-16 13:59:14.286223249 +0000 UTC m=+21.622323070" Apr 16 13:59:14.299815 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:14.299775 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal" podStartSLOduration=21.299760873 podStartE2EDuration="21.299760873s" podCreationTimestamp="2026-04-16 13:58:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:59:14.299139066 +0000 UTC m=+21.635238887" watchObservedRunningTime="2026-04-16 13:59:14.299760873 +0000 UTC m=+21.635860697" Apr 16 13:59:15.280761 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:15.280733 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cdqzf_a2aa2b87-716c-4b0a-abde-a69d0c373e83/ovn-acl-logging/0.log" Apr 16 13:59:15.281405 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:15.281133 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" 
event={"ID":"a2aa2b87-716c-4b0a-abde-a69d0c373e83","Type":"ContainerStarted","Data":"3ad1a5a8c32a3571526c93ce453bc949b04d70c892fda77509f6c769f23442a1"} Apr 16 13:59:15.281618 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:15.281592 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-mt2qp" Apr 16 13:59:15.282444 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:15.282424 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-mt2qp" Apr 16 13:59:16.202282 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:16.202253 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:59:16.202452 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:16.202368 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtl4r" Apr 16 13:59:16.202452 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:16.202373 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-227h4" podUID="e3464c02-724a-403f-a6b4-6482e6283147" Apr 16 13:59:16.202554 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:16.202475 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xtl4r" podUID="008eaa56-e238-402c-a6f7-ded4fd1a1572" Apr 16 13:59:16.202554 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:16.202518 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 13:59:16.202659 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:16.202611 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ckxrz" podUID="0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e" Apr 16 13:59:16.284845 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:16.284813 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m77hq" event={"ID":"b8beb800-9073-4fd9-81db-5015390a964e","Type":"ContainerStarted","Data":"a72966aac4d78929f051d49207b1795da73d510e0418cdd05dfe3f44e26ef21b"} Apr 16 13:59:16.285386 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:16.285112 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-mt2qp" Apr 16 13:59:16.285667 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:16.285647 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-mt2qp" Apr 16 13:59:16.305235 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:16.305192 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m77hq" podStartSLOduration=2.775946062 podStartE2EDuration="23.305178209s" podCreationTimestamp="2026-04-16 13:58:53 +0000 UTC" firstStartedPulling="2026-04-16 13:58:54.876543312 +0000 UTC m=+2.212643113" lastFinishedPulling="2026-04-16 13:59:15.405775446 +0000 UTC m=+22.741875260" observedRunningTime="2026-04-16 13:59:16.305173465 +0000 UTC m=+23.641273286" watchObservedRunningTime="2026-04-16 13:59:16.305178209 +0000 UTC m=+23.641278031" Apr 16 13:59:17.289136 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:17.288984 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cdqzf_a2aa2b87-716c-4b0a-abde-a69d0c373e83/ovn-acl-logging/0.log" Apr 16 13:59:17.289560 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:17.289447 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" event={"ID":"a2aa2b87-716c-4b0a-abde-a69d0c373e83","Type":"ContainerStarted","Data":"3de1c1083a532ae6fa3d4a9328523cb1a4be25507ca2996494e7b96e3cc090df"} Apr 16 13:59:17.290109 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:17.290081 2575 scope.go:117] "RemoveContainer" containerID="a666a01335d750c5590601daaeecefc58269ff46de0367ac58b4863236628dab" Apr 16 13:59:18.202597 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:18.202572 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtl4r" Apr 16 13:59:18.202726 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:18.202573 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 13:59:18.202726 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:18.202662 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xtl4r" podUID="008eaa56-e238-402c-a6f7-ded4fd1a1572" Apr 16 13:59:18.202845 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:18.202762 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckxrz" podUID="0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e" Apr 16 13:59:18.202845 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:18.202586 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:59:18.202943 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:18.202870 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-227h4" podUID="e3464c02-724a-403f-a6b4-6482e6283147" Apr 16 13:59:18.293282 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:18.293253 2575 generic.go:358] "Generic (PLEG): container finished" podID="f6c2db70-a008-4b8a-b25c-881c2d8e9809" containerID="959546a28adced3c618d1c5f4ed4e7a927a4e4585f927fa7963864fee81a1047" exitCode=0 Apr 16 13:59:18.293742 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:18.293343 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g8chm" event={"ID":"f6c2db70-a008-4b8a-b25c-881c2d8e9809","Type":"ContainerDied","Data":"959546a28adced3c618d1c5f4ed4e7a927a4e4585f927fa7963864fee81a1047"} Apr 16 13:59:18.296561 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:18.296544 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cdqzf_a2aa2b87-716c-4b0a-abde-a69d0c373e83/ovn-acl-logging/0.log" Apr 16 13:59:18.296910 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:18.296893 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" event={"ID":"a2aa2b87-716c-4b0a-abde-a69d0c373e83","Type":"ContainerStarted","Data":"fc41bfa0f6ca0acf148ce5bfa66057625ab69b8efe8e82e7ffb426faa8549abf"} Apr 16 13:59:18.297201 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:18.297184 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:59:18.297370 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:18.297355 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:59:18.297472 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:18.297383 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:59:18.320435 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:18.320410 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:59:18.320681 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:18.320658 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:59:18.346709 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:18.346661 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" podStartSLOduration=8.019786785 podStartE2EDuration="25.346650561s" podCreationTimestamp="2026-04-16 13:58:53 +0000 UTC" firstStartedPulling="2026-04-16 13:58:54.924188164 +0000 UTC m=+2.260287969" lastFinishedPulling="2026-04-16 13:59:12.251051945 +0000 UTC m=+19.587151745" 
observedRunningTime="2026-04-16 13:59:18.346336089 +0000 UTC m=+25.682435910" watchObservedRunningTime="2026-04-16 13:59:18.346650561 +0000 UTC m=+25.682750415" Apr 16 13:59:19.190393 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:19.190215 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ckxrz"] Apr 16 13:59:19.190705 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:19.190511 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 13:59:19.190705 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:19.190632 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckxrz" podUID="0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e" Apr 16 13:59:19.193283 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:19.193261 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xtl4r"] Apr 16 13:59:19.193423 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:19.193367 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtl4r" Apr 16 13:59:19.193494 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:19.193460 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xtl4r" podUID="008eaa56-e238-402c-a6f7-ded4fd1a1572" Apr 16 13:59:19.194201 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:19.194160 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-227h4"] Apr 16 13:59:19.194308 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:19.194250 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:59:19.194379 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:19.194349 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-227h4" podUID="e3464c02-724a-403f-a6b4-6482e6283147" Apr 16 13:59:20.301691 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:20.301655 2575 generic.go:358] "Generic (PLEG): container finished" podID="f6c2db70-a008-4b8a-b25c-881c2d8e9809" containerID="a52a82fa55b913143506d60d98ade70934a3566db679cfea054df6ab5f23fef1" exitCode=0 Apr 16 13:59:20.302384 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:20.301707 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g8chm" event={"ID":"f6c2db70-a008-4b8a-b25c-881c2d8e9809","Type":"ContainerDied","Data":"a52a82fa55b913143506d60d98ade70934a3566db679cfea054df6ab5f23fef1"} Apr 16 13:59:21.204257 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:21.204231 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 13:59:21.204386 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:21.204267 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtl4r" Apr 16 13:59:21.204386 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:21.204268 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:59:21.204386 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:21.204353 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckxrz" podUID="0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e" Apr 16 13:59:21.204483 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:21.204450 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xtl4r" podUID="008eaa56-e238-402c-a6f7-ded4fd1a1572" Apr 16 13:59:21.204518 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:21.204507 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-227h4" podUID="e3464c02-724a-403f-a6b4-6482e6283147" Apr 16 13:59:22.307877 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:22.307673 2575 generic.go:358] "Generic (PLEG): container finished" podID="f6c2db70-a008-4b8a-b25c-881c2d8e9809" containerID="4c3e7df7b22edcc6c4594c61e39c5d8e25cac1f9f055726173758d069e7e100f" exitCode=0 Apr 16 13:59:22.308349 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:22.307752 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g8chm" event={"ID":"f6c2db70-a008-4b8a-b25c-881c2d8e9809","Type":"ContainerDied","Data":"4c3e7df7b22edcc6c4594c61e39c5d8e25cac1f9f055726173758d069e7e100f"} Apr 16 13:59:23.203051 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:23.203018 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:59:23.203250 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:23.203155 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtl4r" Apr 16 13:59:23.203250 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:23.203187 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-227h4" podUID="e3464c02-724a-403f-a6b4-6482e6283147" Apr 16 13:59:23.203250 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:23.203235 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 13:59:23.203250 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:23.203230 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xtl4r" podUID="008eaa56-e238-402c-a6f7-ded4fd1a1572" Apr 16 13:59:23.203446 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:23.203349 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckxrz" podUID="0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e" Apr 16 13:59:25.201834 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.201800 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 13:59:25.202328 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.201800 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtl4r" Apr 16 13:59:25.202328 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:25.201940 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ckxrz" podUID="0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e" Apr 16 13:59:25.202328 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.201800 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:59:25.202328 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:25.202013 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xtl4r" podUID="008eaa56-e238-402c-a6f7-ded4fd1a1572" Apr 16 13:59:25.202328 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:25.202078 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-227h4" podUID="e3464c02-724a-403f-a6b4-6482e6283147" Apr 16 13:59:25.467967 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.467940 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeReady" Apr 16 13:59:25.468169 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.468087 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 13:59:25.508395 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.508368 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-zhlzs"] Apr 16 13:59:25.511593 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.511574 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-q9qxf"] Apr 16 13:59:25.511752 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.511730 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zhlzs" Apr 16 13:59:25.514112 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.514063 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 13:59:25.514112 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.514109 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2c9fp\"" Apr 16 13:59:25.514286 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.514131 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 13:59:25.514478 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.514462 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-q9qxf" Apr 16 13:59:25.516738 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.516517 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 13:59:25.516738 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.516561 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 13:59:25.516738 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.516650 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wz4rp\"" Apr 16 13:59:25.516738 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.516650 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 13:59:25.521383 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.521361 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zhlzs"] Apr 16 13:59:25.534996 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.534972 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-q9qxf"] Apr 16 13:59:25.625990 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.625949 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fbe5971a-6df2-42bd-b7eb-09f552154f0d-config-volume\") pod \"dns-default-zhlzs\" (UID: \"fbe5971a-6df2-42bd-b7eb-09f552154f0d\") " pod="openshift-dns/dns-default-zhlzs" Apr 16 13:59:25.626165 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.625995 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6smhq\" (UniqueName: \"kubernetes.io/projected/fbe5971a-6df2-42bd-b7eb-09f552154f0d-kube-api-access-6smhq\") pod \"dns-default-zhlzs\" (UID: \"fbe5971a-6df2-42bd-b7eb-09f552154f0d\") " pod="openshift-dns/dns-default-zhlzs" Apr 16 13:59:25.626165 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.626071 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-cert\") pod \"ingress-canary-q9qxf\" (UID: \"2fbaae4a-0d84-4121-bda1-d36fab54ac7d\") " pod="openshift-ingress-canary/ingress-canary-q9qxf" Apr 16 13:59:25.626165 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.626154 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fbe5971a-6df2-42bd-b7eb-09f552154f0d-metrics-tls\") pod \"dns-default-zhlzs\" (UID: \"fbe5971a-6df2-42bd-b7eb-09f552154f0d\") " pod="openshift-dns/dns-default-zhlzs" Apr 16 13:59:25.626285 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.626182 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fbe5971a-6df2-42bd-b7eb-09f552154f0d-tmp-dir\") pod \"dns-default-zhlzs\" (UID: \"fbe5971a-6df2-42bd-b7eb-09f552154f0d\") " pod="openshift-dns/dns-default-zhlzs" Apr 16 13:59:25.626285 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.626233 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75srt\" (UniqueName: 
\"kubernetes.io/projected/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-kube-api-access-75srt\") pod \"ingress-canary-q9qxf\" (UID: \"2fbaae4a-0d84-4121-bda1-d36fab54ac7d\") " pod="openshift-ingress-canary/ingress-canary-q9qxf" Apr 16 13:59:25.726787 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.726701 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fbe5971a-6df2-42bd-b7eb-09f552154f0d-metrics-tls\") pod \"dns-default-zhlzs\" (UID: \"fbe5971a-6df2-42bd-b7eb-09f552154f0d\") " pod="openshift-dns/dns-default-zhlzs" Apr 16 13:59:25.726787 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.726744 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fbe5971a-6df2-42bd-b7eb-09f552154f0d-tmp-dir\") pod \"dns-default-zhlzs\" (UID: \"fbe5971a-6df2-42bd-b7eb-09f552154f0d\") " pod="openshift-dns/dns-default-zhlzs" Apr 16 13:59:25.726787 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.726772 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75srt\" (UniqueName: \"kubernetes.io/projected/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-kube-api-access-75srt\") pod \"ingress-canary-q9qxf\" (UID: \"2fbaae4a-0d84-4121-bda1-d36fab54ac7d\") " pod="openshift-ingress-canary/ingress-canary-q9qxf" Apr 16 13:59:25.727022 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.726822 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fbe5971a-6df2-42bd-b7eb-09f552154f0d-config-volume\") pod \"dns-default-zhlzs\" (UID: \"fbe5971a-6df2-42bd-b7eb-09f552154f0d\") " pod="openshift-dns/dns-default-zhlzs" Apr 16 13:59:25.727022 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.726844 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6smhq\" (UniqueName: \"kubernetes.io/projected/fbe5971a-6df2-42bd-b7eb-09f552154f0d-kube-api-access-6smhq\") pod \"dns-default-zhlzs\" (UID: \"fbe5971a-6df2-42bd-b7eb-09f552154f0d\") " pod="openshift-dns/dns-default-zhlzs" Apr 16 13:59:25.727022 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:25.726859 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:25.727022 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.726898 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-cert\") pod \"ingress-canary-q9qxf\" (UID: \"2fbaae4a-0d84-4121-bda1-d36fab54ac7d\") " pod="openshift-ingress-canary/ingress-canary-q9qxf" Apr 16 13:59:25.727022 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:25.726933 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbe5971a-6df2-42bd-b7eb-09f552154f0d-metrics-tls podName:fbe5971a-6df2-42bd-b7eb-09f552154f0d nodeName:}" failed. No retries permitted until 2026-04-16 13:59:26.22691227 +0000 UTC m=+33.563012071 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fbe5971a-6df2-42bd-b7eb-09f552154f0d-metrics-tls") pod "dns-default-zhlzs" (UID: "fbe5971a-6df2-42bd-b7eb-09f552154f0d") : secret "dns-default-metrics-tls" not found Apr 16 13:59:25.727022 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:25.726987 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:25.727308 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:25.727034 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-cert podName:2fbaae4a-0d84-4121-bda1-d36fab54ac7d nodeName:}" failed. No retries permitted until 2026-04-16 13:59:26.227019368 +0000 UTC m=+33.563119169 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-cert") pod "ingress-canary-q9qxf" (UID: "2fbaae4a-0d84-4121-bda1-d36fab54ac7d") : secret "canary-serving-cert" not found Apr 16 13:59:25.727308 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.727157 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fbe5971a-6df2-42bd-b7eb-09f552154f0d-tmp-dir\") pod \"dns-default-zhlzs\" (UID: \"fbe5971a-6df2-42bd-b7eb-09f552154f0d\") " pod="openshift-dns/dns-default-zhlzs" Apr 16 13:59:25.727595 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.727575 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fbe5971a-6df2-42bd-b7eb-09f552154f0d-config-volume\") pod \"dns-default-zhlzs\" (UID: \"fbe5971a-6df2-42bd-b7eb-09f552154f0d\") " pod="openshift-dns/dns-default-zhlzs" Apr 16 13:59:25.740475 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.740432 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6smhq\" (UniqueName: \"kubernetes.io/projected/fbe5971a-6df2-42bd-b7eb-09f552154f0d-kube-api-access-6smhq\") pod \"dns-default-zhlzs\" (UID: \"fbe5971a-6df2-42bd-b7eb-09f552154f0d\") " pod="openshift-dns/dns-default-zhlzs" Apr 16 13:59:25.740582 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.740559 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75srt\" (UniqueName: \"kubernetes.io/projected/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-kube-api-access-75srt\") pod \"ingress-canary-q9qxf\" (UID: \"2fbaae4a-0d84-4121-bda1-d36fab54ac7d\") " pod="openshift-ingress-canary/ingress-canary-q9qxf" Apr 16 13:59:25.828226 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.828190 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4szfz\" (UniqueName: \"kubernetes.io/projected/008eaa56-e238-402c-a6f7-ded4fd1a1572-kube-api-access-4szfz\") pod \"network-check-target-xtl4r\" (UID: \"008eaa56-e238-402c-a6f7-ded4fd1a1572\") " pod="openshift-network-diagnostics/network-check-target-xtl4r" Apr 16 13:59:25.828393 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:25.828246 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs\") pod \"network-metrics-daemon-ckxrz\" (UID: \"0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e\") " pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 13:59:25.828393 ip-10-0-130-98 kubenswrapper[2575]: 
E0416 13:59:25.828371 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:25.828504 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:25.828397 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:25.828504 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:25.828465 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs podName:0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e nodeName:}" failed. No retries permitted until 2026-04-16 13:59:57.828447489 +0000 UTC m=+65.164547291 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs") pod "network-metrics-daemon-ckxrz" (UID: "0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:25.828504 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:25.828400 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:25.828656 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:25.828509 2575 projected.go:194] Error preparing data for projected volume kube-api-access-4szfz for pod openshift-network-diagnostics/network-check-target-xtl4r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:25.828656 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:25.828579 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/008eaa56-e238-402c-a6f7-ded4fd1a1572-kube-api-access-4szfz podName:008eaa56-e238-402c-a6f7-ded4fd1a1572 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:57.828561862 +0000 UTC m=+65.164661679 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4szfz" (UniqueName: "kubernetes.io/projected/008eaa56-e238-402c-a6f7-ded4fd1a1572-kube-api-access-4szfz") pod "network-check-target-xtl4r" (UID: "008eaa56-e238-402c-a6f7-ded4fd1a1572") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:26.230932 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:26.230897 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fbe5971a-6df2-42bd-b7eb-09f552154f0d-metrics-tls\") pod \"dns-default-zhlzs\" (UID: \"fbe5971a-6df2-42bd-b7eb-09f552154f0d\") " pod="openshift-dns/dns-default-zhlzs" Apr 16 13:59:26.231449 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:26.231045 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:26.231449 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:26.231109 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-cert\") pod \"ingress-canary-q9qxf\" (UID: \"2fbaae4a-0d84-4121-bda1-d36fab54ac7d\") " pod="openshift-ingress-canary/ingress-canary-q9qxf" Apr 16 13:59:26.231449 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:26.231131 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbe5971a-6df2-42bd-b7eb-09f552154f0d-metrics-tls podName:fbe5971a-6df2-42bd-b7eb-09f552154f0d nodeName:}" failed. No retries permitted until 2026-04-16 13:59:27.23111092 +0000 UTC m=+34.567210739 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fbe5971a-6df2-42bd-b7eb-09f552154f0d-metrics-tls") pod "dns-default-zhlzs" (UID: "fbe5971a-6df2-42bd-b7eb-09f552154f0d") : secret "dns-default-metrics-tls" not found Apr 16 13:59:26.231449 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:26.231187 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:26.231449 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:26.231247 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-cert podName:2fbaae4a-0d84-4121-bda1-d36fab54ac7d nodeName:}" failed. No retries permitted until 2026-04-16 13:59:27.231228655 +0000 UTC m=+34.567328469 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-cert") pod "ingress-canary-q9qxf" (UID: "2fbaae4a-0d84-4121-bda1-d36fab54ac7d") : secret "canary-serving-cert" not found Apr 16 13:59:27.202107 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:27.202063 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtl4r" Apr 16 13:59:27.202282 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:27.202115 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:59:27.202282 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:27.202142 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 13:59:27.206221 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:27.206196 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 13:59:27.206375 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:27.206356 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-b98gv\"" Apr 16 13:59:27.206443 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:27.206367 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 13:59:27.206443 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:27.206430 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 13:59:27.206545 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:27.206448 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 13:59:27.206545 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:27.206381 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-j4r87\"" Apr 16 13:59:27.238566 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:27.238544 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-cert\") pod \"ingress-canary-q9qxf\" (UID: \"2fbaae4a-0d84-4121-bda1-d36fab54ac7d\") " pod="openshift-ingress-canary/ingress-canary-q9qxf" Apr 16 13:59:27.238912 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:27.238581 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fbe5971a-6df2-42bd-b7eb-09f552154f0d-metrics-tls\") pod \"dns-default-zhlzs\" (UID: \"fbe5971a-6df2-42bd-b7eb-09f552154f0d\") " pod="openshift-dns/dns-default-zhlzs" Apr 16 13:59:27.238912 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:27.238665 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:27.238912 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:27.238667 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:27.238912 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:27.238713 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbe5971a-6df2-42bd-b7eb-09f552154f0d-metrics-tls podName:fbe5971a-6df2-42bd-b7eb-09f552154f0d nodeName:}" failed. No retries permitted until 2026-04-16 13:59:29.238700114 +0000 UTC m=+36.574799913 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fbe5971a-6df2-42bd-b7eb-09f552154f0d-metrics-tls") pod "dns-default-zhlzs" (UID: "fbe5971a-6df2-42bd-b7eb-09f552154f0d") : secret "dns-default-metrics-tls" not found Apr 16 13:59:27.238912 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:27.238725 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-cert podName:2fbaae4a-0d84-4121-bda1-d36fab54ac7d nodeName:}" failed. 
No retries permitted until 2026-04-16 13:59:29.238719939 +0000 UTC m=+36.574819739 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-cert") pod "ingress-canary-q9qxf" (UID: "2fbaae4a-0d84-4121-bda1-d36fab54ac7d") : secret "canary-serving-cert" not found Apr 16 13:59:28.548508 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:28.548472 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e3464c02-724a-403f-a6b4-6482e6283147-original-pull-secret\") pod \"global-pull-secret-syncer-227h4\" (UID: \"e3464c02-724a-403f-a6b4-6482e6283147\") " pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:59:28.551669 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:28.551639 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e3464c02-724a-403f-a6b4-6482e6283147-original-pull-secret\") pod \"global-pull-secret-syncer-227h4\" (UID: \"e3464c02-724a-403f-a6b4-6482e6283147\") " pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:59:28.713670 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:28.713637 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-227h4" Apr 16 13:59:28.912696 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:28.912502 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-227h4"] Apr 16 13:59:29.038489 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:59:29.038453 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3464c02_724a_403f_a6b4_6482e6283147.slice/crio-cdf379572426db2700c2e9c45a20f9a5931569bd002a78d63e7c5ae327f2c7a2 WatchSource:0}: Error finding container cdf379572426db2700c2e9c45a20f9a5931569bd002a78d63e7c5ae327f2c7a2: Status 404 returned error can't find the container with id cdf379572426db2700c2e9c45a20f9a5931569bd002a78d63e7c5ae327f2c7a2 Apr 16 13:59:29.255105 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:29.255064 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fbe5971a-6df2-42bd-b7eb-09f552154f0d-metrics-tls\") pod \"dns-default-zhlzs\" (UID: \"fbe5971a-6df2-42bd-b7eb-09f552154f0d\") " pod="openshift-dns/dns-default-zhlzs" Apr 16 13:59:29.255210 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:29.255182 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-cert\") pod \"ingress-canary-q9qxf\" (UID: \"2fbaae4a-0d84-4121-bda1-d36fab54ac7d\") " pod="openshift-ingress-canary/ingress-canary-q9qxf" Apr 16 13:59:29.255210 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:29.255200 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:29.255294 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:29.255261 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:29.255294 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:29.255276 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbe5971a-6df2-42bd-b7eb-09f552154f0d-metrics-tls 
podName:fbe5971a-6df2-42bd-b7eb-09f552154f0d nodeName:}" failed. No retries permitted until 2026-04-16 13:59:33.255256041 +0000 UTC m=+40.591355858 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fbe5971a-6df2-42bd-b7eb-09f552154f0d-metrics-tls") pod "dns-default-zhlzs" (UID: "fbe5971a-6df2-42bd-b7eb-09f552154f0d") : secret "dns-default-metrics-tls" not found Apr 16 13:59:29.255294 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:29.255293 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-cert podName:2fbaae4a-0d84-4121-bda1-d36fab54ac7d nodeName:}" failed. No retries permitted until 2026-04-16 13:59:33.2552831 +0000 UTC m=+40.591382919 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-cert") pod "ingress-canary-q9qxf" (UID: "2fbaae4a-0d84-4121-bda1-d36fab54ac7d") : secret "canary-serving-cert" not found Apr 16 13:59:29.324188 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:29.324151 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-227h4" event={"ID":"e3464c02-724a-403f-a6b4-6482e6283147","Type":"ContainerStarted","Data":"cdf379572426db2700c2e9c45a20f9a5931569bd002a78d63e7c5ae327f2c7a2"} Apr 16 13:59:29.326629 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:29.326603 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g8chm" event={"ID":"f6c2db70-a008-4b8a-b25c-881c2d8e9809","Type":"ContainerStarted","Data":"f9481b2f87b21a8dcfa24556c8f06a59efd3028192bfa00fa3d1b1a61ad708e4"} Apr 16 13:59:30.330893 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:30.330859 2575 generic.go:358] "Generic (PLEG): container finished" podID="f6c2db70-a008-4b8a-b25c-881c2d8e9809" containerID="f9481b2f87b21a8dcfa24556c8f06a59efd3028192bfa00fa3d1b1a61ad708e4" exitCode=0 Apr 16 13:59:30.331511 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:30.330943 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g8chm" event={"ID":"f6c2db70-a008-4b8a-b25c-881c2d8e9809","Type":"ContainerDied","Data":"f9481b2f87b21a8dcfa24556c8f06a59efd3028192bfa00fa3d1b1a61ad708e4"} Apr 16 13:59:31.336219 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:31.336180 2575 generic.go:358] "Generic (PLEG): container finished" podID="f6c2db70-a008-4b8a-b25c-881c2d8e9809" containerID="794ea714eeff665514fa8101b9ca469a52a9a9e67b928c0f50f116f04d5f9bc0" exitCode=0 Apr 16 13:59:31.336751 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:31.336251 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g8chm" event={"ID":"f6c2db70-a008-4b8a-b25c-881c2d8e9809","Type":"ContainerDied","Data":"794ea714eeff665514fa8101b9ca469a52a9a9e67b928c0f50f116f04d5f9bc0"} Apr 16 13:59:33.288343 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:33.288306 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-cert\") pod \"ingress-canary-q9qxf\" (UID: \"2fbaae4a-0d84-4121-bda1-d36fab54ac7d\") " pod="openshift-ingress-canary/ingress-canary-q9qxf" Apr 16 13:59:33.288764 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:33.288351 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/fbe5971a-6df2-42bd-b7eb-09f552154f0d-metrics-tls\") pod \"dns-default-zhlzs\" (UID: \"fbe5971a-6df2-42bd-b7eb-09f552154f0d\") " pod="openshift-dns/dns-default-zhlzs" Apr 16 13:59:33.288764 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:33.288442 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:33.288764 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:33.288458 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:33.288764 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:33.288490 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbe5971a-6df2-42bd-b7eb-09f552154f0d-metrics-tls podName:fbe5971a-6df2-42bd-b7eb-09f552154f0d nodeName:}" failed. No retries permitted until 2026-04-16 13:59:41.288477523 +0000 UTC m=+48.624577322 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fbe5971a-6df2-42bd-b7eb-09f552154f0d-metrics-tls") pod "dns-default-zhlzs" (UID: "fbe5971a-6df2-42bd-b7eb-09f552154f0d") : secret "dns-default-metrics-tls" not found Apr 16 13:59:33.288764 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:33.288518 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-cert podName:2fbaae4a-0d84-4121-bda1-d36fab54ac7d nodeName:}" failed. No retries permitted until 2026-04-16 13:59:41.288500722 +0000 UTC m=+48.624600543 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-cert") pod "ingress-canary-q9qxf" (UID: "2fbaae4a-0d84-4121-bda1-d36fab54ac7d") : secret "canary-serving-cert" not found Apr 16 13:59:33.341358 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:33.341326 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-227h4" event={"ID":"e3464c02-724a-403f-a6b4-6482e6283147","Type":"ContainerStarted","Data":"ca7c6fb9a4db9db5ccc34d822be68744910bde499a6c72fdcce37ddc92b158c7"} Apr 16 13:59:33.343963 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:33.343939 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g8chm" event={"ID":"f6c2db70-a008-4b8a-b25c-881c2d8e9809","Type":"ContainerStarted","Data":"3710b73e4661e744e87536113778fe1df0aec27efe3c849c0e78b3dab0d78833"} Apr 16 13:59:33.356317 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:33.356276 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-227h4" podStartSLOduration=33.840417304 podStartE2EDuration="37.356263967s" podCreationTimestamp="2026-04-16 13:58:56 +0000 UTC" firstStartedPulling="2026-04-16 13:59:29.064935144 +0000 UTC m=+36.401034957" lastFinishedPulling="2026-04-16 13:59:32.58078182 +0000 UTC m=+39.916881620" observedRunningTime="2026-04-16 13:59:33.355783982 +0000 UTC m=+40.691883804" watchObservedRunningTime="2026-04-16 13:59:33.356263967 +0000 UTC m=+40.692363767" Apr 16 13:59:33.378238 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:33.378198 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-g8chm" podStartSLOduration=6.18913051 podStartE2EDuration="40.378186059s" podCreationTimestamp="2026-04-16 13:58:53 +0000 UTC" 
firstStartedPulling="2026-04-16 13:58:54.898544767 +0000 UTC m=+2.234644567" lastFinishedPulling="2026-04-16 13:59:29.087600313 +0000 UTC m=+36.423700116" observedRunningTime="2026-04-16 13:59:33.376134203 +0000 UTC m=+40.712234025" watchObservedRunningTime="2026-04-16 13:59:33.378186059 +0000 UTC m=+40.714285882" Apr 16 13:59:41.342579 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:41.342536 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-cert\") pod \"ingress-canary-q9qxf\" (UID: \"2fbaae4a-0d84-4121-bda1-d36fab54ac7d\") " pod="openshift-ingress-canary/ingress-canary-q9qxf" Apr 16 13:59:41.342579 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:41.342584 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fbe5971a-6df2-42bd-b7eb-09f552154f0d-metrics-tls\") pod \"dns-default-zhlzs\" (UID: \"fbe5971a-6df2-42bd-b7eb-09f552154f0d\") " pod="openshift-dns/dns-default-zhlzs" Apr 16 13:59:41.343079 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:41.342684 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:41.343079 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:41.342686 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:41.343079 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:41.342735 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbe5971a-6df2-42bd-b7eb-09f552154f0d-metrics-tls podName:fbe5971a-6df2-42bd-b7eb-09f552154f0d nodeName:}" failed. No retries permitted until 2026-04-16 13:59:57.342721336 +0000 UTC m=+64.678821136 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fbe5971a-6df2-42bd-b7eb-09f552154f0d-metrics-tls") pod "dns-default-zhlzs" (UID: "fbe5971a-6df2-42bd-b7eb-09f552154f0d") : secret "dns-default-metrics-tls" not found Apr 16 13:59:41.343079 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:41.342750 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-cert podName:2fbaae4a-0d84-4121-bda1-d36fab54ac7d nodeName:}" failed. No retries permitted until 2026-04-16 13:59:57.342743908 +0000 UTC m=+64.678843707 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-cert") pod "ingress-canary-q9qxf" (UID: "2fbaae4a-0d84-4121-bda1-d36fab54ac7d") : secret "canary-serving-cert" not found Apr 16 13:59:50.315332 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:50.315303 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cdqzf" Apr 16 13:59:57.352463 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:57.352411 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-cert\") pod \"ingress-canary-q9qxf\" (UID: \"2fbaae4a-0d84-4121-bda1-d36fab54ac7d\") " pod="openshift-ingress-canary/ingress-canary-q9qxf" Apr 16 13:59:57.352463 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:57.352471 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fbe5971a-6df2-42bd-b7eb-09f552154f0d-metrics-tls\") pod \"dns-default-zhlzs\" (UID: \"fbe5971a-6df2-42bd-b7eb-09f552154f0d\") " pod="openshift-dns/dns-default-zhlzs" Apr 16 13:59:57.353004 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:57.352567 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:57.353004 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:57.352578 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:57.353004 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:57.352618 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbe5971a-6df2-42bd-b7eb-09f552154f0d-metrics-tls podName:fbe5971a-6df2-42bd-b7eb-09f552154f0d nodeName:}" failed. No retries permitted until 2026-04-16 14:00:29.352604085 +0000 UTC m=+96.688703884 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fbe5971a-6df2-42bd-b7eb-09f552154f0d-metrics-tls") pod "dns-default-zhlzs" (UID: "fbe5971a-6df2-42bd-b7eb-09f552154f0d") : secret "dns-default-metrics-tls" not found Apr 16 13:59:57.353004 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:57.352653 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-cert podName:2fbaae4a-0d84-4121-bda1-d36fab54ac7d nodeName:}" failed. No retries permitted until 2026-04-16 14:00:29.352633263 +0000 UTC m=+96.688733074 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-cert") pod "ingress-canary-q9qxf" (UID: "2fbaae4a-0d84-4121-bda1-d36fab54ac7d") : secret "canary-serving-cert" not found Apr 16 13:59:57.856755 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:57.856723 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs\") pod \"network-metrics-daemon-ckxrz\" (UID: \"0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e\") " pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 13:59:57.856913 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:57.856786 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4szfz\" (UniqueName: \"kubernetes.io/projected/008eaa56-e238-402c-a6f7-ded4fd1a1572-kube-api-access-4szfz\") pod \"network-check-target-xtl4r\" (UID: \"008eaa56-e238-402c-a6f7-ded4fd1a1572\") " pod="openshift-network-diagnostics/network-check-target-xtl4r" Apr 16 13:59:57.859280 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:57.859251 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 13:59:57.859441 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:57.859429 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 13:59:57.867307 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:57.867289 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 13:59:57.867398 ip-10-0-130-98 kubenswrapper[2575]: E0416 13:59:57.867358 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs podName:0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e nodeName:}" failed. No retries permitted until 2026-04-16 14:01:01.867334275 +0000 UTC m=+129.203434082 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs") pod "network-metrics-daemon-ckxrz" (UID: "0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e") : secret "metrics-daemon-secret" not found Apr 16 13:59:57.869853 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:57.869835 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 13:59:57.880389 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:57.880371 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4szfz\" (UniqueName: \"kubernetes.io/projected/008eaa56-e238-402c-a6f7-ded4fd1a1572-kube-api-access-4szfz\") pod \"network-check-target-xtl4r\" (UID: \"008eaa56-e238-402c-a6f7-ded4fd1a1572\") " pod="openshift-network-diagnostics/network-check-target-xtl4r" Apr 16 13:59:58.123925 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:58.123845 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-j4r87\"" Apr 16 13:59:58.131748 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:58.131725 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtl4r" Apr 16 13:59:58.260854 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:58.260821 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xtl4r"] Apr 16 13:59:58.264156 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:59:58.264120 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod008eaa56_e238_402c_a6f7_ded4fd1a1572.slice/crio-02bc445e442a9e4f0283d6a6b0416b2ddc89b1f1ea7cb55962c3c3776a1af6b3 WatchSource:0}: Error finding container 02bc445e442a9e4f0283d6a6b0416b2ddc89b1f1ea7cb55962c3c3776a1af6b3: Status 404 returned error can't find the container with id 02bc445e442a9e4f0283d6a6b0416b2ddc89b1f1ea7cb55962c3c3776a1af6b3 Apr 16 13:59:58.389507 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:58.389426 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xtl4r" event={"ID":"008eaa56-e238-402c-a6f7-ded4fd1a1572","Type":"ContainerStarted","Data":"02bc445e442a9e4f0283d6a6b0416b2ddc89b1f1ea7cb55962c3c3776a1af6b3"} Apr 16 13:59:58.517443 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:58.517410 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-69c7b4bf89-qqlfr"] Apr 16 13:59:58.523344 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:58.523317 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-69c7b4bf89-qqlfr" Apr 16 13:59:58.525792 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:58.525771 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 13:59:58.526948 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:58.526922 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-6vwm6\"" Apr 16 13:59:58.527037 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:58.526963 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 13:59:58.527037 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:58.526990 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 13:59:58.527037 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:58.526963 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 13:59:58.529384 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:58.529363 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-69c7b4bf89-qqlfr"] Apr 16 13:59:58.663812 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:58.663711 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbp87\" (UniqueName: \"kubernetes.io/projected/bdf2d029-e4d6-48db-9fd8-6139318e9a8f-kube-api-access-cbp87\") pod \"managed-serviceaccount-addon-agent-69c7b4bf89-qqlfr\" (UID: \"bdf2d029-e4d6-48db-9fd8-6139318e9a8f\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-69c7b4bf89-qqlfr" Apr 16 13:59:58.663812 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:58.663770 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/bdf2d029-e4d6-48db-9fd8-6139318e9a8f-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-69c7b4bf89-qqlfr\" (UID: \"bdf2d029-e4d6-48db-9fd8-6139318e9a8f\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-69c7b4bf89-qqlfr" Apr 16 13:59:58.765140 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:58.765113 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbp87\" (UniqueName: \"kubernetes.io/projected/bdf2d029-e4d6-48db-9fd8-6139318e9a8f-kube-api-access-cbp87\") pod \"managed-serviceaccount-addon-agent-69c7b4bf89-qqlfr\" (UID: \"bdf2d029-e4d6-48db-9fd8-6139318e9a8f\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-69c7b4bf89-qqlfr" Apr 16 13:59:58.765253 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:58.765162 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/bdf2d029-e4d6-48db-9fd8-6139318e9a8f-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-69c7b4bf89-qqlfr\" (UID: \"bdf2d029-e4d6-48db-9fd8-6139318e9a8f\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-69c7b4bf89-qqlfr" Apr 16 13:59:58.767554 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:58.767529 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/bdf2d029-e4d6-48db-9fd8-6139318e9a8f-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-69c7b4bf89-qqlfr\" (UID: \"bdf2d029-e4d6-48db-9fd8-6139318e9a8f\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-69c7b4bf89-qqlfr" Apr 16 13:59:58.773578 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:58.773553 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbp87\" (UniqueName: \"kubernetes.io/projected/bdf2d029-e4d6-48db-9fd8-6139318e9a8f-kube-api-access-cbp87\") pod \"managed-serviceaccount-addon-agent-69c7b4bf89-qqlfr\" (UID: \"bdf2d029-e4d6-48db-9fd8-6139318e9a8f\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-69c7b4bf89-qqlfr" Apr 16 13:59:58.847125 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:58.847089 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-69c7b4bf89-qqlfr" Apr 16 13:59:58.957821 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:58.957795 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-69c7b4bf89-qqlfr"] Apr 16 13:59:58.960921 ip-10-0-130-98 kubenswrapper[2575]: W0416 13:59:58.960897 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdf2d029_e4d6_48db_9fd8_6139318e9a8f.slice/crio-8bdd66cc64b743d8fcf87aa3a2081fe01dbc13f9260c619d1a4ae4b3948d2180 WatchSource:0}: Error finding container 8bdd66cc64b743d8fcf87aa3a2081fe01dbc13f9260c619d1a4ae4b3948d2180: Status 404 returned error can't find the container with id 8bdd66cc64b743d8fcf87aa3a2081fe01dbc13f9260c619d1a4ae4b3948d2180 Apr 16 13:59:59.392444 ip-10-0-130-98 kubenswrapper[2575]: I0416 13:59:59.392409 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-69c7b4bf89-qqlfr" event={"ID":"bdf2d029-e4d6-48db-9fd8-6139318e9a8f","Type":"ContainerStarted","Data":"8bdd66cc64b743d8fcf87aa3a2081fe01dbc13f9260c619d1a4ae4b3948d2180"} Apr 16 14:00:01.397622 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:01.397593 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xtl4r" event={"ID":"008eaa56-e238-402c-a6f7-ded4fd1a1572","Type":"ContainerStarted","Data":"26e1eb75b481b0d3b342b1c52e000f7456f557ce11037952f5269895a4cf9bd2"} Apr 16 14:00:01.397925 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:01.397784 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-xtl4r" Apr 16 14:00:01.412849 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:01.412813 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-xtl4r" podStartSLOduration=65.396918647 podStartE2EDuration="1m8.412800888s" podCreationTimestamp="2026-04-16 13:58:53 +0000 UTC" firstStartedPulling="2026-04-16 13:59:58.265888092 +0000 UTC m=+65.601987892" lastFinishedPulling="2026-04-16 14:00:01.281770331 +0000 UTC m=+68.617870133" observedRunningTime="2026-04-16 14:00:01.412183945 +0000 UTC m=+68.748283763" watchObservedRunningTime="2026-04-16 14:00:01.412800888 +0000 UTC m=+68.748900711" Apr 16 14:00:04.404033 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:04.403996 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-69c7b4bf89-qqlfr" event={"ID":"bdf2d029-e4d6-48db-9fd8-6139318e9a8f","Type":"ContainerStarted","Data":"d0c3fb62d9c1df3a2b6c7a10a656d47416a13a26c6e252c97c3871373945bd9d"} Apr 16 14:00:04.419625 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:04.419578 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-69c7b4bf89-qqlfr" podStartSLOduration=1.4844359200000001 podStartE2EDuration="6.419565396s" podCreationTimestamp="2026-04-16 13:59:58 +0000 UTC" firstStartedPulling="2026-04-16 13:59:58.963031332 +0000 UTC m=+66.299131132" lastFinishedPulling="2026-04-16 14:00:03.898160803 +0000 UTC m=+71.234260608" observedRunningTime="2026-04-16 14:00:04.418457809 +0000 UTC m=+71.754557625" watchObservedRunningTime="2026-04-16 
14:00:04.419565396 +0000 UTC m=+71.755665218" Apr 16 14:00:29.371657 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:29.371615 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-cert\") pod \"ingress-canary-q9qxf\" (UID: \"2fbaae4a-0d84-4121-bda1-d36fab54ac7d\") " pod="openshift-ingress-canary/ingress-canary-q9qxf" Apr 16 14:00:29.371657 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:29.371654 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fbe5971a-6df2-42bd-b7eb-09f552154f0d-metrics-tls\") pod \"dns-default-zhlzs\" (UID: \"fbe5971a-6df2-42bd-b7eb-09f552154f0d\") " pod="openshift-dns/dns-default-zhlzs" Apr 16 14:00:29.372151 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:00:29.371757 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:00:29.372151 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:00:29.371800 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:00:29.372151 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:00:29.371819 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-cert podName:2fbaae4a-0d84-4121-bda1-d36fab54ac7d nodeName:}" failed. No retries permitted until 2026-04-16 14:01:33.3718038 +0000 UTC m=+160.707903600 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-cert") pod "ingress-canary-q9qxf" (UID: "2fbaae4a-0d84-4121-bda1-d36fab54ac7d") : secret "canary-serving-cert" not found Apr 16 14:00:29.372151 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:00:29.371838 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbe5971a-6df2-42bd-b7eb-09f552154f0d-metrics-tls podName:fbe5971a-6df2-42bd-b7eb-09f552154f0d nodeName:}" failed. No retries permitted until 2026-04-16 14:01:33.371826834 +0000 UTC m=+160.707926634 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fbe5971a-6df2-42bd-b7eb-09f552154f0d-metrics-tls") pod "dns-default-zhlzs" (UID: "fbe5971a-6df2-42bd-b7eb-09f552154f0d") : secret "dns-default-metrics-tls" not found Apr 16 14:00:32.401970 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:32.401941 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xtl4r" Apr 16 14:00:51.359897 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.359865 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-b645448c4-hb2lw"] Apr 16 14:00:51.361559 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.361544 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:00:51.365079 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.365047 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 14:00:51.365223 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.365173 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 14:00:51.365333 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.365284 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 14:00:51.366191 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.366161 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 14:00:51.366191 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.366170 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 14:00:51.366354 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.366218 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-vptwp\"" Apr 16 14:00:51.366354 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.366236 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 14:00:51.384345 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.384317 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-b645448c4-hb2lw"] Apr 16 14:00:51.419163 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.419141 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-service-ca-bundle\") pod \"router-default-b645448c4-hb2lw\" (UID: \"6892d97c-f802-4e48-b3ea-ce73b8dbbafa\") " pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:00:51.419276 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.419171 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-stats-auth\") pod \"router-default-b645448c4-hb2lw\" (UID: \"6892d97c-f802-4e48-b3ea-ce73b8dbbafa\") " pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:00:51.419276 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.419190 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-metrics-certs\") pod \"router-default-b645448c4-hb2lw\" (UID: \"6892d97c-f802-4e48-b3ea-ce73b8dbbafa\") " pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:00:51.419276 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.419207 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bhhr\" (UniqueName: \"kubernetes.io/projected/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-kube-api-access-2bhhr\") pod \"router-default-b645448c4-hb2lw\" (UID: \"6892d97c-f802-4e48-b3ea-ce73b8dbbafa\") " pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:00:51.419388 ip-10-0-130-98 kubenswrapper[2575]: 
I0416 14:00:51.419294 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-default-certificate\") pod \"router-default-b645448c4-hb2lw\" (UID: \"6892d97c-f802-4e48-b3ea-ce73b8dbbafa\") " pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:00:51.455347 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.455322 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-t8smf"] Apr 16 14:00:51.457783 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.457760 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-t8smf" Apr 16 14:00:51.460667 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.460647 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-bpcdc\"" Apr 16 14:00:51.460939 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.460924 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 14:00:51.461041 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.461025 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 14:00:51.461127 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.461044 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 14:00:51.461127 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.461121 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 14:00:51.466421 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.466404 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 14:00:51.476695 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.476675 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-t8smf"] Apr 16 14:00:51.519703 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.519678 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6daf2345-5eea-4faa-9720-9390a947e6ce-tmp\") pod \"insights-operator-5785d4fcdd-t8smf\" (UID: \"6daf2345-5eea-4faa-9720-9390a947e6ce\") " pod="openshift-insights/insights-operator-5785d4fcdd-t8smf" Apr 16 14:00:51.519815 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.519709 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6daf2345-5eea-4faa-9720-9390a947e6ce-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-t8smf\" (UID: \"6daf2345-5eea-4faa-9720-9390a947e6ce\") " pod="openshift-insights/insights-operator-5785d4fcdd-t8smf" Apr 16 14:00:51.519815 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.519730 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77g49\" (UniqueName: \"kubernetes.io/projected/6daf2345-5eea-4faa-9720-9390a947e6ce-kube-api-access-77g49\") pod \"insights-operator-5785d4fcdd-t8smf\" (UID: \"6daf2345-5eea-4faa-9720-9390a947e6ce\") " 
pod="openshift-insights/insights-operator-5785d4fcdd-t8smf" Apr 16 14:00:51.519815 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.519751 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-default-certificate\") pod \"router-default-b645448c4-hb2lw\" (UID: \"6892d97c-f802-4e48-b3ea-ce73b8dbbafa\") " pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:00:51.519815 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.519770 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/6daf2345-5eea-4faa-9720-9390a947e6ce-snapshots\") pod \"insights-operator-5785d4fcdd-t8smf\" (UID: \"6daf2345-5eea-4faa-9720-9390a947e6ce\") " pod="openshift-insights/insights-operator-5785d4fcdd-t8smf" Apr 16 14:00:51.519815 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.519790 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6daf2345-5eea-4faa-9720-9390a947e6ce-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-t8smf\" (UID: \"6daf2345-5eea-4faa-9720-9390a947e6ce\") " pod="openshift-insights/insights-operator-5785d4fcdd-t8smf" Apr 16 14:00:51.519991 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.519872 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-service-ca-bundle\") pod \"router-default-b645448c4-hb2lw\" (UID: \"6892d97c-f802-4e48-b3ea-ce73b8dbbafa\") " pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:00:51.519991 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.519902 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-stats-auth\") pod \"router-default-b645448c4-hb2lw\" (UID: \"6892d97c-f802-4e48-b3ea-ce73b8dbbafa\") " pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:00:51.519991 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.519918 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-metrics-certs\") pod \"router-default-b645448c4-hb2lw\" (UID: \"6892d97c-f802-4e48-b3ea-ce73b8dbbafa\") " pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:00:51.519991 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.519933 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bhhr\" (UniqueName: \"kubernetes.io/projected/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-kube-api-access-2bhhr\") pod \"router-default-b645448c4-hb2lw\" (UID: \"6892d97c-f802-4e48-b3ea-ce73b8dbbafa\") " pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:00:51.519991 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.519951 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6daf2345-5eea-4faa-9720-9390a947e6ce-serving-cert\") pod \"insights-operator-5785d4fcdd-t8smf\" (UID: \"6daf2345-5eea-4faa-9720-9390a947e6ce\") " pod="openshift-insights/insights-operator-5785d4fcdd-t8smf" Apr 16 14:00:51.520224 ip-10-0-130-98 kubenswrapper[2575]: 
E0416 14:00:51.520040 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 14:00:51.520224 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:00:51.520069 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-service-ca-bundle podName:6892d97c-f802-4e48-b3ea-ce73b8dbbafa nodeName:}" failed. No retries permitted until 2026-04-16 14:00:52.020049682 +0000 UTC m=+119.356149495 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-service-ca-bundle") pod "router-default-b645448c4-hb2lw" (UID: "6892d97c-f802-4e48-b3ea-ce73b8dbbafa") : configmap references non-existent config key: service-ca.crt Apr 16 14:00:51.520224 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:00:51.520125 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-metrics-certs podName:6892d97c-f802-4e48-b3ea-ce73b8dbbafa nodeName:}" failed. No retries permitted until 2026-04-16 14:00:52.020086563 +0000 UTC m=+119.356186376 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-metrics-certs") pod "router-default-b645448c4-hb2lw" (UID: "6892d97c-f802-4e48-b3ea-ce73b8dbbafa") : secret "router-metrics-certs-default" not found Apr 16 14:00:51.522124 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.522083 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-stats-auth\") pod \"router-default-b645448c4-hb2lw\" (UID: \"6892d97c-f802-4e48-b3ea-ce73b8dbbafa\") " pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:00:51.522204 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.522139 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-default-certificate\") pod \"router-default-b645448c4-hb2lw\" (UID: \"6892d97c-f802-4e48-b3ea-ce73b8dbbafa\") " pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:00:51.529024 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.529002 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bhhr\" (UniqueName: \"kubernetes.io/projected/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-kube-api-access-2bhhr\") pod \"router-default-b645448c4-hb2lw\" (UID: \"6892d97c-f802-4e48-b3ea-ce73b8dbbafa\") " pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:00:51.621318 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.621230 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6daf2345-5eea-4faa-9720-9390a947e6ce-tmp\") pod \"insights-operator-5785d4fcdd-t8smf\" (UID: \"6daf2345-5eea-4faa-9720-9390a947e6ce\") " pod="openshift-insights/insights-operator-5785d4fcdd-t8smf" Apr 16 14:00:51.621318 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.621270 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6daf2345-5eea-4faa-9720-9390a947e6ce-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-t8smf\" (UID: 
\"6daf2345-5eea-4faa-9720-9390a947e6ce\") " pod="openshift-insights/insights-operator-5785d4fcdd-t8smf" Apr 16 14:00:51.621318 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.621291 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77g49\" (UniqueName: \"kubernetes.io/projected/6daf2345-5eea-4faa-9720-9390a947e6ce-kube-api-access-77g49\") pod \"insights-operator-5785d4fcdd-t8smf\" (UID: \"6daf2345-5eea-4faa-9720-9390a947e6ce\") " pod="openshift-insights/insights-operator-5785d4fcdd-t8smf" Apr 16 14:00:51.621573 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.621410 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/6daf2345-5eea-4faa-9720-9390a947e6ce-snapshots\") pod \"insights-operator-5785d4fcdd-t8smf\" (UID: \"6daf2345-5eea-4faa-9720-9390a947e6ce\") " pod="openshift-insights/insights-operator-5785d4fcdd-t8smf" Apr 16 14:00:51.621573 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.621453 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6daf2345-5eea-4faa-9720-9390a947e6ce-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-t8smf\" (UID: \"6daf2345-5eea-4faa-9720-9390a947e6ce\") " pod="openshift-insights/insights-operator-5785d4fcdd-t8smf" Apr 16 14:00:51.621573 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.621524 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6daf2345-5eea-4faa-9720-9390a947e6ce-serving-cert\") pod \"insights-operator-5785d4fcdd-t8smf\" (UID: \"6daf2345-5eea-4faa-9720-9390a947e6ce\") " pod="openshift-insights/insights-operator-5785d4fcdd-t8smf" Apr 16 14:00:51.621723 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.621660 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6daf2345-5eea-4faa-9720-9390a947e6ce-tmp\") pod \"insights-operator-5785d4fcdd-t8smf\" (UID: \"6daf2345-5eea-4faa-9720-9390a947e6ce\") " pod="openshift-insights/insights-operator-5785d4fcdd-t8smf" Apr 16 14:00:51.622136 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.622086 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/6daf2345-5eea-4faa-9720-9390a947e6ce-snapshots\") pod \"insights-operator-5785d4fcdd-t8smf\" (UID: \"6daf2345-5eea-4faa-9720-9390a947e6ce\") " pod="openshift-insights/insights-operator-5785d4fcdd-t8smf" Apr 16 14:00:51.622491 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.622473 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6daf2345-5eea-4faa-9720-9390a947e6ce-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-t8smf\" (UID: \"6daf2345-5eea-4faa-9720-9390a947e6ce\") " pod="openshift-insights/insights-operator-5785d4fcdd-t8smf" Apr 16 14:00:51.622624 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.622607 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6daf2345-5eea-4faa-9720-9390a947e6ce-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-t8smf\" (UID: \"6daf2345-5eea-4faa-9720-9390a947e6ce\") " pod="openshift-insights/insights-operator-5785d4fcdd-t8smf" Apr 16 14:00:51.623784 ip-10-0-130-98 kubenswrapper[2575]: I0416 
14:00:51.623764 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6daf2345-5eea-4faa-9720-9390a947e6ce-serving-cert\") pod \"insights-operator-5785d4fcdd-t8smf\" (UID: \"6daf2345-5eea-4faa-9720-9390a947e6ce\") " pod="openshift-insights/insights-operator-5785d4fcdd-t8smf" Apr 16 14:00:51.629063 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.629040 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77g49\" (UniqueName: \"kubernetes.io/projected/6daf2345-5eea-4faa-9720-9390a947e6ce-kube-api-access-77g49\") pod \"insights-operator-5785d4fcdd-t8smf\" (UID: \"6daf2345-5eea-4faa-9720-9390a947e6ce\") " pod="openshift-insights/insights-operator-5785d4fcdd-t8smf" Apr 16 14:00:51.766901 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.766874 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-t8smf" Apr 16 14:00:51.879774 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:51.879590 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-t8smf"] Apr 16 14:00:51.881974 ip-10-0-130-98 kubenswrapper[2575]: W0416 14:00:51.881948 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6daf2345_5eea_4faa_9720_9390a947e6ce.slice/crio-dec779d4cecfee6a187b89554ad3ec7cf3480ffc68b61fb60e60aaf0cf03cb1b WatchSource:0}: Error finding container dec779d4cecfee6a187b89554ad3ec7cf3480ffc68b61fb60e60aaf0cf03cb1b: Status 404 returned error can't find the container with id dec779d4cecfee6a187b89554ad3ec7cf3480ffc68b61fb60e60aaf0cf03cb1b Apr 16 14:00:52.026053 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:52.026017 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-service-ca-bundle\") pod \"router-default-b645448c4-hb2lw\" (UID: \"6892d97c-f802-4e48-b3ea-ce73b8dbbafa\") " pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:00:52.026053 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:52.026055 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-metrics-certs\") pod \"router-default-b645448c4-hb2lw\" (UID: \"6892d97c-f802-4e48-b3ea-ce73b8dbbafa\") " pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:00:52.026264 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:00:52.026186 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 14:00:52.026264 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:00:52.026209 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-service-ca-bundle podName:6892d97c-f802-4e48-b3ea-ce73b8dbbafa nodeName:}" failed. No retries permitted until 2026-04-16 14:00:53.026187897 +0000 UTC m=+120.362287708 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-service-ca-bundle") pod "router-default-b645448c4-hb2lw" (UID: "6892d97c-f802-4e48-b3ea-ce73b8dbbafa") : configmap references non-existent config key: service-ca.crt Apr 16 14:00:52.026264 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:00:52.026230 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-metrics-certs podName:6892d97c-f802-4e48-b3ea-ce73b8dbbafa nodeName:}" failed. No retries permitted until 2026-04-16 14:00:53.026221277 +0000 UTC m=+120.362321085 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-metrics-certs") pod "router-default-b645448c4-hb2lw" (UID: "6892d97c-f802-4e48-b3ea-ce73b8dbbafa") : secret "router-metrics-certs-default" not found Apr 16 14:00:52.492280 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:52.492232 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-t8smf" event={"ID":"6daf2345-5eea-4faa-9720-9390a947e6ce","Type":"ContainerStarted","Data":"dec779d4cecfee6a187b89554ad3ec7cf3480ffc68b61fb60e60aaf0cf03cb1b"} Apr 16 14:00:53.032781 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:53.032746 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-service-ca-bundle\") pod \"router-default-b645448c4-hb2lw\" (UID: \"6892d97c-f802-4e48-b3ea-ce73b8dbbafa\") " pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:00:53.032781 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:53.032786 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-metrics-certs\") pod \"router-default-b645448c4-hb2lw\" (UID: \"6892d97c-f802-4e48-b3ea-ce73b8dbbafa\") " pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:00:53.032984 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:00:53.032903 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 14:00:53.032984 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:00:53.032922 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-service-ca-bundle podName:6892d97c-f802-4e48-b3ea-ce73b8dbbafa nodeName:}" failed. No retries permitted until 2026-04-16 14:00:55.032901239 +0000 UTC m=+122.369001055 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-service-ca-bundle") pod "router-default-b645448c4-hb2lw" (UID: "6892d97c-f802-4e48-b3ea-ce73b8dbbafa") : configmap references non-existent config key: service-ca.crt Apr 16 14:00:53.032984 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:00:53.032943 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-metrics-certs podName:6892d97c-f802-4e48-b3ea-ce73b8dbbafa nodeName:}" failed. No retries permitted until 2026-04-16 14:00:55.032931576 +0000 UTC m=+122.369031376 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-metrics-certs") pod "router-default-b645448c4-hb2lw" (UID: "6892d97c-f802-4e48-b3ea-ce73b8dbbafa") : secret "router-metrics-certs-default" not found Apr 16 14:00:54.497441 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:54.497402 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-t8smf" event={"ID":"6daf2345-5eea-4faa-9720-9390a947e6ce","Type":"ContainerStarted","Data":"f0978bb32a827a1a7aff8d1dd0c2f7e7ed7863ddb9e6b67c1a9ca43c32fff103"} Apr 16 14:00:54.514770 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:54.514723 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-t8smf" podStartSLOduration=1.7380601260000001 podStartE2EDuration="3.514710332s" podCreationTimestamp="2026-04-16 14:00:51 +0000 UTC" firstStartedPulling="2026-04-16 14:00:51.883606337 +0000 UTC m=+119.219706141" lastFinishedPulling="2026-04-16 14:00:53.660256536 +0000 UTC m=+120.996356347" observedRunningTime="2026-04-16 14:00:54.512795064 +0000 UTC m=+121.848894887" watchObservedRunningTime="2026-04-16 14:00:54.514710332 +0000 UTC m=+121.850810155" Apr 16 14:00:55.046379 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:55.046345 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-service-ca-bundle\") pod \"router-default-b645448c4-hb2lw\" (UID: \"6892d97c-f802-4e48-b3ea-ce73b8dbbafa\") " pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:00:55.046379 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:55.046384 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-metrics-certs\") pod \"router-default-b645448c4-hb2lw\" (UID: \"6892d97c-f802-4e48-b3ea-ce73b8dbbafa\") " pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:00:55.046599 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:00:55.046474 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 14:00:55.046599 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:00:55.046517 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-service-ca-bundle podName:6892d97c-f802-4e48-b3ea-ce73b8dbbafa nodeName:}" failed. No retries permitted until 2026-04-16 14:00:59.046497487 +0000 UTC m=+126.382597287 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-service-ca-bundle") pod "router-default-b645448c4-hb2lw" (UID: "6892d97c-f802-4e48-b3ea-ce73b8dbbafa") : configmap references non-existent config key: service-ca.crt Apr 16 14:00:55.046599 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:00:55.046542 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-metrics-certs podName:6892d97c-f802-4e48-b3ea-ce73b8dbbafa nodeName:}" failed. No retries permitted until 2026-04-16 14:00:59.046534681 +0000 UTC m=+126.382634481 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-metrics-certs") pod "router-default-b645448c4-hb2lw" (UID: "6892d97c-f802-4e48-b3ea-ce73b8dbbafa") : secret "router-metrics-certs-default" not found Apr 16 14:00:56.456290 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:56.456261 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9gkrd_e9cd8c72-8367-4db4-9cb0-b52863ecee83/dns-node-resolver/0.log" Apr 16 14:00:57.256534 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:57.256507 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-54qjm_4c1f6aa1-4340-4463-8bc6-2ce795e54be0/node-ca/0.log" Apr 16 14:00:59.075559 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:59.075529 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-service-ca-bundle\") pod \"router-default-b645448c4-hb2lw\" (UID: \"6892d97c-f802-4e48-b3ea-ce73b8dbbafa\") " pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:00:59.075559 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:00:59.075562 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-metrics-certs\") pod \"router-default-b645448c4-hb2lw\" (UID: \"6892d97c-f802-4e48-b3ea-ce73b8dbbafa\") " pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:00:59.075977 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:00:59.075669 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 14:00:59.075977 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:00:59.075694 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-service-ca-bundle podName:6892d97c-f802-4e48-b3ea-ce73b8dbbafa nodeName:}" failed. No retries permitted until 2026-04-16 14:01:07.075676868 +0000 UTC m=+134.411776672 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-service-ca-bundle") pod "router-default-b645448c4-hb2lw" (UID: "6892d97c-f802-4e48-b3ea-ce73b8dbbafa") : configmap references non-existent config key: service-ca.crt Apr 16 14:00:59.075977 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:00:59.075715 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-metrics-certs podName:6892d97c-f802-4e48-b3ea-ce73b8dbbafa nodeName:}" failed. No retries permitted until 2026-04-16 14:01:07.075708646 +0000 UTC m=+134.411808446 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-metrics-certs") pod "router-default-b645448c4-hb2lw" (UID: "6892d97c-f802-4e48-b3ea-ce73b8dbbafa") : secret "router-metrics-certs-default" not found Apr 16 14:01:01.419128 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.419079 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-tjxm7"] Apr 16 14:01:01.421985 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.421965 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-tjxm7" Apr 16 14:01:01.422259 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.422237 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-497ks"] Apr 16 14:01:01.424454 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.424432 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 14:01:01.424557 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.424456 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 14:01:01.424557 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.424469 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-p78sh\"" Apr 16 14:01:01.424557 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.424458 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:01:01.424682 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.424662 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-497ks" Apr 16 14:01:01.425353 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.425337 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 14:01:01.426868 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.426848 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 14:01:01.426962 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.426854 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 14:01:01.426962 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.426931 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:01:01.426962 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.426957 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 14:01:01.427128 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.427066 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-jm92s\"" Apr 16 14:01:01.431085 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.431067 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-tjxm7"] Apr 16 14:01:01.434272 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.434253 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-497ks"] Apr 16 14:01:01.491171 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.491137 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl77g\" (UniqueName: \"kubernetes.io/projected/3994a6a1-6bea-406a-a079-922dfadd77da-kube-api-access-pl77g\") pod \"kube-storage-version-migrator-operator-756bb7d76f-497ks\" (UID: \"3994a6a1-6bea-406a-a079-922dfadd77da\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-497ks" Apr 16 14:01:01.491305 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.491229 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3994a6a1-6bea-406a-a079-922dfadd77da-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-497ks\" (UID: \"3994a6a1-6bea-406a-a079-922dfadd77da\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-497ks" Apr 16 14:01:01.491305 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.491295 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8f6678c-3de8-4562-b077-c6a1db9f26a6-config\") pod \"service-ca-operator-69965bb79d-tjxm7\" (UID: \"e8f6678c-3de8-4562-b077-c6a1db9f26a6\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-tjxm7" Apr 16 14:01:01.491399 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.491335 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8f6678c-3de8-4562-b077-c6a1db9f26a6-serving-cert\") pod \"service-ca-operator-69965bb79d-tjxm7\" (UID: \"e8f6678c-3de8-4562-b077-c6a1db9f26a6\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-tjxm7" Apr 16 14:01:01.491399 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.491359 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s89c\" (UniqueName: \"kubernetes.io/projected/e8f6678c-3de8-4562-b077-c6a1db9f26a6-kube-api-access-4s89c\") pod \"service-ca-operator-69965bb79d-tjxm7\" (UID: \"e8f6678c-3de8-4562-b077-c6a1db9f26a6\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-tjxm7" Apr 16 14:01:01.491399 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.491393 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3994a6a1-6bea-406a-a079-922dfadd77da-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-497ks\" (UID: \"3994a6a1-6bea-406a-a079-922dfadd77da\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-497ks" Apr 16 14:01:01.591676 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.591642 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3994a6a1-6bea-406a-a079-922dfadd77da-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-497ks\" (UID: \"3994a6a1-6bea-406a-a079-922dfadd77da\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-497ks" Apr 16 14:01:01.591857 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.591704 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e8f6678c-3de8-4562-b077-c6a1db9f26a6-config\") pod \"service-ca-operator-69965bb79d-tjxm7\" (UID: \"e8f6678c-3de8-4562-b077-c6a1db9f26a6\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-tjxm7" Apr 16 14:01:01.591857 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.591723 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8f6678c-3de8-4562-b077-c6a1db9f26a6-serving-cert\") pod \"service-ca-operator-69965bb79d-tjxm7\" (UID: \"e8f6678c-3de8-4562-b077-c6a1db9f26a6\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-tjxm7" Apr 16 14:01:01.591857 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.591739 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4s89c\" (UniqueName: \"kubernetes.io/projected/e8f6678c-3de8-4562-b077-c6a1db9f26a6-kube-api-access-4s89c\") pod \"service-ca-operator-69965bb79d-tjxm7\" (UID: \"e8f6678c-3de8-4562-b077-c6a1db9f26a6\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-tjxm7" Apr 16 14:01:01.592015 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.591846 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3994a6a1-6bea-406a-a079-922dfadd77da-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-497ks\" (UID: \"3994a6a1-6bea-406a-a079-922dfadd77da\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-497ks" Apr 16 14:01:01.592015 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.591906 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pl77g\" (UniqueName: \"kubernetes.io/projected/3994a6a1-6bea-406a-a079-922dfadd77da-kube-api-access-pl77g\") pod \"kube-storage-version-migrator-operator-756bb7d76f-497ks\" (UID: \"3994a6a1-6bea-406a-a079-922dfadd77da\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-497ks" Apr 16 14:01:01.592403 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.592379 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3994a6a1-6bea-406a-a079-922dfadd77da-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-497ks\" (UID: \"3994a6a1-6bea-406a-a079-922dfadd77da\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-497ks" Apr 16 14:01:01.592497 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.592380 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8f6678c-3de8-4562-b077-c6a1db9f26a6-config\") pod \"service-ca-operator-69965bb79d-tjxm7\" (UID: \"e8f6678c-3de8-4562-b077-c6a1db9f26a6\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-tjxm7" Apr 16 14:01:01.593937 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.593910 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3994a6a1-6bea-406a-a079-922dfadd77da-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-497ks\" (UID: \"3994a6a1-6bea-406a-a079-922dfadd77da\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-497ks" Apr 16 14:01:01.594041 ip-10-0-130-98 
kubenswrapper[2575]: I0416 14:01:01.593995 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8f6678c-3de8-4562-b077-c6a1db9f26a6-serving-cert\") pod \"service-ca-operator-69965bb79d-tjxm7\" (UID: \"e8f6678c-3de8-4562-b077-c6a1db9f26a6\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-tjxm7" Apr 16 14:01:01.599179 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.599157 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s89c\" (UniqueName: \"kubernetes.io/projected/e8f6678c-3de8-4562-b077-c6a1db9f26a6-kube-api-access-4s89c\") pod \"service-ca-operator-69965bb79d-tjxm7\" (UID: \"e8f6678c-3de8-4562-b077-c6a1db9f26a6\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-tjxm7" Apr 16 14:01:01.599915 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.599898 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl77g\" (UniqueName: \"kubernetes.io/projected/3994a6a1-6bea-406a-a079-922dfadd77da-kube-api-access-pl77g\") pod \"kube-storage-version-migrator-operator-756bb7d76f-497ks\" (UID: \"3994a6a1-6bea-406a-a079-922dfadd77da\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-497ks" Apr 16 14:01:01.732072 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.732010 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-tjxm7" Apr 16 14:01:01.737668 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.737643 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-497ks" Apr 16 14:01:01.852058 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.852033 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-tjxm7"] Apr 16 14:01:01.854948 ip-10-0-130-98 kubenswrapper[2575]: W0416 14:01:01.854921 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8f6678c_3de8_4562_b077_c6a1db9f26a6.slice/crio-3f881dd6d3d0f80f74fcc8358ca8d659f6e903ef5f5386fc6e6d73cd0d9dc51f WatchSource:0}: Error finding container 3f881dd6d3d0f80f74fcc8358ca8d659f6e903ef5f5386fc6e6d73cd0d9dc51f: Status 404 returned error can't find the container with id 3f881dd6d3d0f80f74fcc8358ca8d659f6e903ef5f5386fc6e6d73cd0d9dc51f Apr 16 14:01:01.869411 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.869389 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-497ks"] Apr 16 14:01:01.873839 ip-10-0-130-98 kubenswrapper[2575]: W0416 14:01:01.873807 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3994a6a1_6bea_406a_a079_922dfadd77da.slice/crio-a856fb75fc1f2d88413b2bb7f2de7d79d8c5292735c5996a84be5ba7eebb5d2f WatchSource:0}: Error finding container a856fb75fc1f2d88413b2bb7f2de7d79d8c5292735c5996a84be5ba7eebb5d2f: Status 404 returned error can't find the container with id a856fb75fc1f2d88413b2bb7f2de7d79d8c5292735c5996a84be5ba7eebb5d2f Apr 16 14:01:01.894759 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:01.894739 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs\") pod \"network-metrics-daemon-ckxrz\" (UID: \"0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e\") " pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 14:01:01.894901 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:01.894883 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:01:01.894948 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:01.894942 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs podName:0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e nodeName:}" failed. No retries permitted until 2026-04-16 14:03:03.894924239 +0000 UTC m=+251.231024040 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs") pod "network-metrics-daemon-ckxrz" (UID: "0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e") : secret "metrics-daemon-secret" not found Apr 16 14:01:02.513737 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:02.513693 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-497ks" event={"ID":"3994a6a1-6bea-406a-a079-922dfadd77da","Type":"ContainerStarted","Data":"a856fb75fc1f2d88413b2bb7f2de7d79d8c5292735c5996a84be5ba7eebb5d2f"} Apr 16 14:01:02.514909 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:02.514876 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-tjxm7" event={"ID":"e8f6678c-3de8-4562-b077-c6a1db9f26a6","Type":"ContainerStarted","Data":"3f881dd6d3d0f80f74fcc8358ca8d659f6e903ef5f5386fc6e6d73cd0d9dc51f"} Apr 16 14:01:02.914695 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:02.914657 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-bc58db64c-htv7v"] Apr 16 14:01:02.917650 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:02.917623 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:02.920353 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:02.920329 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 14:01:02.920459 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:02.920362 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 14:01:02.920682 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:02.920657 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-gdggq\"" Apr 16 14:01:02.920802 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:02.920665 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 14:01:02.931344 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:02.927897 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 14:01:02.931519 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:02.931497 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-bc58db64c-htv7v"] Apr 16 14:01:03.003269 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:03.003227 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/74588845-f50d-4443-a6d6-c3b0c18f9b81-ca-trust-extracted\") pod \"image-registry-bc58db64c-htv7v\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:03.003442 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:03.003294 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-registry-tls\") pod \"image-registry-bc58db64c-htv7v\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:03.003442 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:03.003367 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/74588845-f50d-4443-a6d6-c3b0c18f9b81-registry-certificates\") pod \"image-registry-bc58db64c-htv7v\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:03.003567 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:03.003472 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/74588845-f50d-4443-a6d6-c3b0c18f9b81-installation-pull-secrets\") pod \"image-registry-bc58db64c-htv7v\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:03.003567 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:03.003514 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-bound-sa-token\") pod \"image-registry-bc58db64c-htv7v\" (UID: 
\"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:03.003567 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:03.003546 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/74588845-f50d-4443-a6d6-c3b0c18f9b81-image-registry-private-configuration\") pod \"image-registry-bc58db64c-htv7v\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:03.003709 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:03.003575 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbp6j\" (UniqueName: \"kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-kube-api-access-wbp6j\") pod \"image-registry-bc58db64c-htv7v\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:03.003709 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:03.003609 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74588845-f50d-4443-a6d6-c3b0c18f9b81-trusted-ca\") pod \"image-registry-bc58db64c-htv7v\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:03.104253 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:03.104215 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/74588845-f50d-4443-a6d6-c3b0c18f9b81-ca-trust-extracted\") pod \"image-registry-bc58db64c-htv7v\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:03.104429 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:03.104270 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-registry-tls\") pod \"image-registry-bc58db64c-htv7v\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:03.104429 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:03.104327 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/74588845-f50d-4443-a6d6-c3b0c18f9b81-registry-certificates\") pod \"image-registry-bc58db64c-htv7v\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:03.104429 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:03.104368 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/74588845-f50d-4443-a6d6-c3b0c18f9b81-installation-pull-secrets\") pod \"image-registry-bc58db64c-htv7v\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:03.104429 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:03.104393 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-bound-sa-token\") pod \"image-registry-bc58db64c-htv7v\" (UID: 
\"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:03.104429 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:03.104422 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/74588845-f50d-4443-a6d6-c3b0c18f9b81-image-registry-private-configuration\") pod \"image-registry-bc58db64c-htv7v\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:03.104676 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:03.104447 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wbp6j\" (UniqueName: \"kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-kube-api-access-wbp6j\") pod \"image-registry-bc58db64c-htv7v\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:03.104676 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:03.104477 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74588845-f50d-4443-a6d6-c3b0c18f9b81-trusted-ca\") pod \"image-registry-bc58db64c-htv7v\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:03.104768 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:03.104734 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:01:03.104768 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:03.104759 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bc58db64c-htv7v: secret "image-registry-tls" not found Apr 16 14:01:03.104873 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:03.104822 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-registry-tls podName:74588845-f50d-4443-a6d6-c3b0c18f9b81 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:03.604800542 +0000 UTC m=+130.940900356 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-registry-tls") pod "image-registry-bc58db64c-htv7v" (UID: "74588845-f50d-4443-a6d6-c3b0c18f9b81") : secret "image-registry-tls" not found Apr 16 14:01:03.105560 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:03.105536 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74588845-f50d-4443-a6d6-c3b0c18f9b81-trusted-ca\") pod \"image-registry-bc58db64c-htv7v\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:03.105683 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:03.105544 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/74588845-f50d-4443-a6d6-c3b0c18f9b81-ca-trust-extracted\") pod \"image-registry-bc58db64c-htv7v\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:03.105744 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:03.105704 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/74588845-f50d-4443-a6d6-c3b0c18f9b81-registry-certificates\") pod \"image-registry-bc58db64c-htv7v\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:03.107959 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:03.107889 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/74588845-f50d-4443-a6d6-c3b0c18f9b81-installation-pull-secrets\") pod \"image-registry-bc58db64c-htv7v\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:03.108396 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:03.108360 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/74588845-f50d-4443-a6d6-c3b0c18f9b81-image-registry-private-configuration\") pod \"image-registry-bc58db64c-htv7v\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:03.113946 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:03.113900 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbp6j\" (UniqueName: \"kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-kube-api-access-wbp6j\") pod \"image-registry-bc58db64c-htv7v\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:03.114463 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:03.114443 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-bound-sa-token\") pod \"image-registry-bc58db64c-htv7v\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:03.608929 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:03.608894 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-registry-tls\") pod \"image-registry-bc58db64c-htv7v\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:03.609390 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:03.609078 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:01:03.609390 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:03.609120 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bc58db64c-htv7v: secret "image-registry-tls" not found Apr 16 14:01:03.609390 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:03.609194 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-registry-tls podName:74588845-f50d-4443-a6d6-c3b0c18f9b81 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:04.60917034 +0000 UTC m=+131.945270159 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-registry-tls") pod "image-registry-bc58db64c-htv7v" (UID: "74588845-f50d-4443-a6d6-c3b0c18f9b81") : secret "image-registry-tls" not found Apr 16 14:01:04.520834 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:04.520797 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-497ks" event={"ID":"3994a6a1-6bea-406a-a079-922dfadd77da","Type":"ContainerStarted","Data":"3abacd27a672719eaec93610b8cf2b6e3c4e129715209ddcfc63aa011b68ed65"} Apr 16 14:01:04.522140 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:04.522116 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-tjxm7" event={"ID":"e8f6678c-3de8-4562-b077-c6a1db9f26a6","Type":"ContainerStarted","Data":"0c5913743743dc25120a65989c8dcffeccdc616affe182162dc550726bef6eae"} Apr 16 14:01:04.536308 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:04.536256 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-497ks" podStartSLOduration=1.412568926 podStartE2EDuration="3.536241828s" podCreationTimestamp="2026-04-16 14:01:01 +0000 UTC" firstStartedPulling="2026-04-16 14:01:01.876204554 +0000 UTC m=+129.212304355" lastFinishedPulling="2026-04-16 14:01:03.999877448 +0000 UTC m=+131.335977257" observedRunningTime="2026-04-16 14:01:04.535665814 +0000 UTC m=+131.871765635" watchObservedRunningTime="2026-04-16 14:01:04.536241828 +0000 UTC m=+131.872341647" Apr 16 14:01:04.550554 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:04.550502 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-tjxm7" podStartSLOduration=1.408468059 podStartE2EDuration="3.550487377s" podCreationTimestamp="2026-04-16 14:01:01 +0000 UTC" firstStartedPulling="2026-04-16 14:01:01.856697929 +0000 UTC m=+129.192797729" lastFinishedPulling="2026-04-16 14:01:03.998717233 +0000 UTC m=+131.334817047" observedRunningTime="2026-04-16 14:01:04.550004384 +0000 UTC m=+131.886104206" watchObservedRunningTime="2026-04-16 14:01:04.550487377 +0000 UTC m=+131.886587199" Apr 16 14:01:04.618886 ip-10-0-130-98 
kubenswrapper[2575]: I0416 14:01:04.618842 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-registry-tls\") pod \"image-registry-bc58db64c-htv7v\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:04.619357 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:04.618991 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:01:04.619357 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:04.619016 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bc58db64c-htv7v: secret "image-registry-tls" not found Apr 16 14:01:04.619357 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:04.619089 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-registry-tls podName:74588845-f50d-4443-a6d6-c3b0c18f9b81 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:06.619065687 +0000 UTC m=+133.955165487 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-registry-tls") pod "image-registry-bc58db64c-htv7v" (UID: "74588845-f50d-4443-a6d6-c3b0c18f9b81") : secret "image-registry-tls" not found Apr 16 14:01:06.637669 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:06.637634 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-registry-tls\") pod \"image-registry-bc58db64c-htv7v\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:06.638043 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:06.637738 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:01:06.638043 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:06.637749 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bc58db64c-htv7v: secret "image-registry-tls" not found Apr 16 14:01:06.638043 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:06.637797 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-registry-tls podName:74588845-f50d-4443-a6d6-c3b0c18f9b81 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:10.63778276 +0000 UTC m=+137.973882560 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-registry-tls") pod "image-registry-bc58db64c-htv7v" (UID: "74588845-f50d-4443-a6d6-c3b0c18f9b81") : secret "image-registry-tls" not found Apr 16 14:01:07.141028 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:07.140987 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-service-ca-bundle\") pod \"router-default-b645448c4-hb2lw\" (UID: \"6892d97c-f802-4e48-b3ea-ce73b8dbbafa\") " pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:01:07.141220 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:07.141069 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-metrics-certs\") pod \"router-default-b645448c4-hb2lw\" (UID: \"6892d97c-f802-4e48-b3ea-ce73b8dbbafa\") " pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:01:07.141220 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:07.141183 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-service-ca-bundle podName:6892d97c-f802-4e48-b3ea-ce73b8dbbafa nodeName:}" failed. No retries permitted until 2026-04-16 14:01:23.141164479 +0000 UTC m=+150.477264279 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-service-ca-bundle") pod "router-default-b645448c4-hb2lw" (UID: "6892d97c-f802-4e48-b3ea-ce73b8dbbafa") : configmap references non-existent config key: service-ca.crt Apr 16 14:01:07.141220 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:07.141191 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 14:01:07.141324 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:07.141224 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-metrics-certs podName:6892d97c-f802-4e48-b3ea-ce73b8dbbafa nodeName:}" failed. No retries permitted until 2026-04-16 14:01:23.141217627 +0000 UTC m=+150.477317427 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-metrics-certs") pod "router-default-b645448c4-hb2lw" (UID: "6892d97c-f802-4e48-b3ea-ce73b8dbbafa") : secret "router-metrics-certs-default" not found Apr 16 14:01:07.731689 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:07.731654 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-p9n6q"] Apr 16 14:01:07.734502 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:07.734484 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-p9n6q" Apr 16 14:01:07.736969 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:07.736946 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 14:01:07.737085 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:07.736968 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 14:01:07.737172 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:07.737137 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 14:01:07.738182 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:07.738163 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-w6h6p\"" Apr 16 14:01:07.738277 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:07.738166 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 14:01:07.743274 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:07.743255 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-p9n6q"] Apr 16 14:01:07.847797 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:07.847755 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9dfcd035-e7d3-43a7-98b6-cc2be16e077b-signing-key\") pod \"service-ca-bfc587fb7-p9n6q\" (UID: \"9dfcd035-e7d3-43a7-98b6-cc2be16e077b\") " pod="openshift-service-ca/service-ca-bfc587fb7-p9n6q" Apr 16 14:01:07.847797 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:07.847796 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2vsl\" (UniqueName: \"kubernetes.io/projected/9dfcd035-e7d3-43a7-98b6-cc2be16e077b-kube-api-access-t2vsl\") pod \"service-ca-bfc587fb7-p9n6q\" (UID: \"9dfcd035-e7d3-43a7-98b6-cc2be16e077b\") " pod="openshift-service-ca/service-ca-bfc587fb7-p9n6q" Apr 16 14:01:07.848010 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:07.847817 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9dfcd035-e7d3-43a7-98b6-cc2be16e077b-signing-cabundle\") pod \"service-ca-bfc587fb7-p9n6q\" (UID: \"9dfcd035-e7d3-43a7-98b6-cc2be16e077b\") " pod="openshift-service-ca/service-ca-bfc587fb7-p9n6q" Apr 16 14:01:07.948802 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:07.948763 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9dfcd035-e7d3-43a7-98b6-cc2be16e077b-signing-key\") pod \"service-ca-bfc587fb7-p9n6q\" (UID: \"9dfcd035-e7d3-43a7-98b6-cc2be16e077b\") " pod="openshift-service-ca/service-ca-bfc587fb7-p9n6q" Apr 16 14:01:07.948802 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:07.948802 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t2vsl\" (UniqueName: \"kubernetes.io/projected/9dfcd035-e7d3-43a7-98b6-cc2be16e077b-kube-api-access-t2vsl\") pod \"service-ca-bfc587fb7-p9n6q\" (UID: \"9dfcd035-e7d3-43a7-98b6-cc2be16e077b\") " pod="openshift-service-ca/service-ca-bfc587fb7-p9n6q" Apr 16 14:01:07.949050 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:07.948825 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9dfcd035-e7d3-43a7-98b6-cc2be16e077b-signing-cabundle\") pod \"service-ca-bfc587fb7-p9n6q\" (UID: \"9dfcd035-e7d3-43a7-98b6-cc2be16e077b\") " pod="openshift-service-ca/service-ca-bfc587fb7-p9n6q" Apr 16 14:01:07.949455 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:07.949434 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9dfcd035-e7d3-43a7-98b6-cc2be16e077b-signing-cabundle\") pod \"service-ca-bfc587fb7-p9n6q\" (UID: \"9dfcd035-e7d3-43a7-98b6-cc2be16e077b\") " pod="openshift-service-ca/service-ca-bfc587fb7-p9n6q" Apr 16 14:01:07.951146 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:07.951130 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9dfcd035-e7d3-43a7-98b6-cc2be16e077b-signing-key\") pod \"service-ca-bfc587fb7-p9n6q\" (UID: \"9dfcd035-e7d3-43a7-98b6-cc2be16e077b\") " pod="openshift-service-ca/service-ca-bfc587fb7-p9n6q" Apr 16 14:01:07.958792 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:07.958770 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2vsl\" (UniqueName: \"kubernetes.io/projected/9dfcd035-e7d3-43a7-98b6-cc2be16e077b-kube-api-access-t2vsl\") pod \"service-ca-bfc587fb7-p9n6q\" (UID: \"9dfcd035-e7d3-43a7-98b6-cc2be16e077b\") " pod="openshift-service-ca/service-ca-bfc587fb7-p9n6q" Apr 16 14:01:08.043394 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:08.043307 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-p9n6q" Apr 16 14:01:08.114552 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:08.114524 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-pwsbt"] Apr 16 14:01:08.119582 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:08.119562 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-pwsbt" Apr 16 14:01:08.122228 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:08.122210 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 14:01:08.122387 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:08.122370 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-lstgd\"" Apr 16 14:01:08.123595 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:08.123577 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 14:01:08.129010 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:08.128768 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-pwsbt"] Apr 16 14:01:08.158445 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:08.158415 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-p9n6q"] Apr 16 14:01:08.161058 ip-10-0-130-98 kubenswrapper[2575]: W0416 14:01:08.161031 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9dfcd035_e7d3_43a7_98b6_cc2be16e077b.slice/crio-250dcd598c7959ba85477b61efa9ade69d5c7e3f2a3df3b46a43f79574572e7a WatchSource:0}: Error finding container 250dcd598c7959ba85477b61efa9ade69d5c7e3f2a3df3b46a43f79574572e7a: Status 404 returned error can't find the container with id 250dcd598c7959ba85477b61efa9ade69d5c7e3f2a3df3b46a43f79574572e7a Apr 16 14:01:08.250809 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:08.250774 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/de5bb2a7-cca1-47fc-87e5-761a06491018-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pwsbt\" (UID: \"de5bb2a7-cca1-47fc-87e5-761a06491018\") " pod="openshift-insights/insights-runtime-extractor-pwsbt" Apr 16 14:01:08.250952 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:08.250847 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/de5bb2a7-cca1-47fc-87e5-761a06491018-data-volume\") pod \"insights-runtime-extractor-pwsbt\" (UID: \"de5bb2a7-cca1-47fc-87e5-761a06491018\") " pod="openshift-insights/insights-runtime-extractor-pwsbt" Apr 16 14:01:08.250952 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:08.250930 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/de5bb2a7-cca1-47fc-87e5-761a06491018-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pwsbt\" (UID: \"de5bb2a7-cca1-47fc-87e5-761a06491018\") " pod="openshift-insights/insights-runtime-extractor-pwsbt" Apr 16 14:01:08.251031 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:08.250978 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/de5bb2a7-cca1-47fc-87e5-761a06491018-crio-socket\") pod \"insights-runtime-extractor-pwsbt\" (UID: \"de5bb2a7-cca1-47fc-87e5-761a06491018\") " pod="openshift-insights/insights-runtime-extractor-pwsbt" Apr 16 14:01:08.251031 ip-10-0-130-98 kubenswrapper[2575]: I0416 
14:01:08.251007 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq6lw\" (UniqueName: \"kubernetes.io/projected/de5bb2a7-cca1-47fc-87e5-761a06491018-kube-api-access-vq6lw\") pod \"insights-runtime-extractor-pwsbt\" (UID: \"de5bb2a7-cca1-47fc-87e5-761a06491018\") " pod="openshift-insights/insights-runtime-extractor-pwsbt" Apr 16 14:01:08.351803 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:08.351775 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/de5bb2a7-cca1-47fc-87e5-761a06491018-data-volume\") pod \"insights-runtime-extractor-pwsbt\" (UID: \"de5bb2a7-cca1-47fc-87e5-761a06491018\") " pod="openshift-insights/insights-runtime-extractor-pwsbt" Apr 16 14:01:08.351948 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:08.351812 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/de5bb2a7-cca1-47fc-87e5-761a06491018-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pwsbt\" (UID: \"de5bb2a7-cca1-47fc-87e5-761a06491018\") " pod="openshift-insights/insights-runtime-extractor-pwsbt" Apr 16 14:01:08.351948 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:08.351839 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/de5bb2a7-cca1-47fc-87e5-761a06491018-crio-socket\") pod \"insights-runtime-extractor-pwsbt\" (UID: \"de5bb2a7-cca1-47fc-87e5-761a06491018\") " pod="openshift-insights/insights-runtime-extractor-pwsbt" Apr 16 14:01:08.351948 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:08.351894 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/de5bb2a7-cca1-47fc-87e5-761a06491018-crio-socket\") pod \"insights-runtime-extractor-pwsbt\" (UID: \"de5bb2a7-cca1-47fc-87e5-761a06491018\") " pod="openshift-insights/insights-runtime-extractor-pwsbt" Apr 16 14:01:08.352087 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:08.351951 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 14:01:08.352087 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:08.351961 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vq6lw\" (UniqueName: \"kubernetes.io/projected/de5bb2a7-cca1-47fc-87e5-761a06491018-kube-api-access-vq6lw\") pod \"insights-runtime-extractor-pwsbt\" (UID: \"de5bb2a7-cca1-47fc-87e5-761a06491018\") " pod="openshift-insights/insights-runtime-extractor-pwsbt" Apr 16 14:01:08.352087 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:08.352047 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de5bb2a7-cca1-47fc-87e5-761a06491018-insights-runtime-extractor-tls podName:de5bb2a7-cca1-47fc-87e5-761a06491018 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:08.852025341 +0000 UTC m=+136.188125144 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/de5bb2a7-cca1-47fc-87e5-761a06491018-insights-runtime-extractor-tls") pod "insights-runtime-extractor-pwsbt" (UID: "de5bb2a7-cca1-47fc-87e5-761a06491018") : secret "insights-runtime-extractor-tls" not found Apr 16 14:01:08.352257 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:08.352180 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/de5bb2a7-cca1-47fc-87e5-761a06491018-data-volume\") pod \"insights-runtime-extractor-pwsbt\" (UID: \"de5bb2a7-cca1-47fc-87e5-761a06491018\") " pod="openshift-insights/insights-runtime-extractor-pwsbt" Apr 16 14:01:08.352257 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:08.352185 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/de5bb2a7-cca1-47fc-87e5-761a06491018-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pwsbt\" (UID: \"de5bb2a7-cca1-47fc-87e5-761a06491018\") " pod="openshift-insights/insights-runtime-extractor-pwsbt" Apr 16 14:01:08.353145 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:08.353129 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/de5bb2a7-cca1-47fc-87e5-761a06491018-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pwsbt\" (UID: \"de5bb2a7-cca1-47fc-87e5-761a06491018\") " pod="openshift-insights/insights-runtime-extractor-pwsbt" Apr 16 14:01:08.364845 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:08.364814 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq6lw\" (UniqueName: \"kubernetes.io/projected/de5bb2a7-cca1-47fc-87e5-761a06491018-kube-api-access-vq6lw\") pod \"insights-runtime-extractor-pwsbt\" (UID: \"de5bb2a7-cca1-47fc-87e5-761a06491018\") " pod="openshift-insights/insights-runtime-extractor-pwsbt" Apr 16 14:01:08.533408 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:08.533373 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-p9n6q" event={"ID":"9dfcd035-e7d3-43a7-98b6-cc2be16e077b","Type":"ContainerStarted","Data":"32c364e10ded89ae717afe31877495c7600042f163acce0de4a83d4972537c65"} Apr 16 14:01:08.533408 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:08.533410 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-p9n6q" event={"ID":"9dfcd035-e7d3-43a7-98b6-cc2be16e077b","Type":"ContainerStarted","Data":"250dcd598c7959ba85477b61efa9ade69d5c7e3f2a3df3b46a43f79574572e7a"} Apr 16 14:01:08.552582 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:08.552533 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-p9n6q" podStartSLOduration=1.552519121 podStartE2EDuration="1.552519121s" podCreationTimestamp="2026-04-16 14:01:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:01:08.551449916 +0000 UTC m=+135.887549742" watchObservedRunningTime="2026-04-16 14:01:08.552519121 +0000 UTC m=+135.888618983" Apr 16 14:01:08.856636 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:08.856602 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/de5bb2a7-cca1-47fc-87e5-761a06491018-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pwsbt\" (UID: \"de5bb2a7-cca1-47fc-87e5-761a06491018\") " pod="openshift-insights/insights-runtime-extractor-pwsbt" Apr 16 14:01:08.856986 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:08.856713 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 14:01:08.856986 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:08.856820 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de5bb2a7-cca1-47fc-87e5-761a06491018-insights-runtime-extractor-tls podName:de5bb2a7-cca1-47fc-87e5-761a06491018 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:09.85677363 +0000 UTC m=+137.192873430 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/de5bb2a7-cca1-47fc-87e5-761a06491018-insights-runtime-extractor-tls") pod "insights-runtime-extractor-pwsbt" (UID: "de5bb2a7-cca1-47fc-87e5-761a06491018") : secret "insights-runtime-extractor-tls" not found Apr 16 14:01:09.879854 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:09.879816 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/de5bb2a7-cca1-47fc-87e5-761a06491018-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pwsbt\" (UID: \"de5bb2a7-cca1-47fc-87e5-761a06491018\") " pod="openshift-insights/insights-runtime-extractor-pwsbt" Apr 16 14:01:09.880343 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:09.880059 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 14:01:09.880343 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:09.880176 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de5bb2a7-cca1-47fc-87e5-761a06491018-insights-runtime-extractor-tls podName:de5bb2a7-cca1-47fc-87e5-761a06491018 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:11.880153877 +0000 UTC m=+139.216253683 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/de5bb2a7-cca1-47fc-87e5-761a06491018-insights-runtime-extractor-tls") pod "insights-runtime-extractor-pwsbt" (UID: "de5bb2a7-cca1-47fc-87e5-761a06491018") : secret "insights-runtime-extractor-tls" not found Apr 16 14:01:10.687313 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:10.687279 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-registry-tls\") pod \"image-registry-bc58db64c-htv7v\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:10.687481 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:10.687417 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:01:10.687481 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:10.687437 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bc58db64c-htv7v: secret "image-registry-tls" not found Apr 16 14:01:10.687564 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:10.687502 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-registry-tls podName:74588845-f50d-4443-a6d6-c3b0c18f9b81 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:18.68748682 +0000 UTC m=+146.023586625 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-registry-tls") pod "image-registry-bc58db64c-htv7v" (UID: "74588845-f50d-4443-a6d6-c3b0c18f9b81") : secret "image-registry-tls" not found Apr 16 14:01:11.897370 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:11.897336 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/de5bb2a7-cca1-47fc-87e5-761a06491018-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pwsbt\" (UID: \"de5bb2a7-cca1-47fc-87e5-761a06491018\") " pod="openshift-insights/insights-runtime-extractor-pwsbt" Apr 16 14:01:11.897740 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:11.897506 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 14:01:11.897740 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:11.897598 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de5bb2a7-cca1-47fc-87e5-761a06491018-insights-runtime-extractor-tls podName:de5bb2a7-cca1-47fc-87e5-761a06491018 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:15.897578252 +0000 UTC m=+143.233678054 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/de5bb2a7-cca1-47fc-87e5-761a06491018-insights-runtime-extractor-tls") pod "insights-runtime-extractor-pwsbt" (UID: "de5bb2a7-cca1-47fc-87e5-761a06491018") : secret "insights-runtime-extractor-tls" not found Apr 16 14:01:15.933474 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:15.933435 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/de5bb2a7-cca1-47fc-87e5-761a06491018-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pwsbt\" (UID: \"de5bb2a7-cca1-47fc-87e5-761a06491018\") " pod="openshift-insights/insights-runtime-extractor-pwsbt" Apr 16 14:01:15.935862 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:15.935829 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/de5bb2a7-cca1-47fc-87e5-761a06491018-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pwsbt\" (UID: \"de5bb2a7-cca1-47fc-87e5-761a06491018\") " pod="openshift-insights/insights-runtime-extractor-pwsbt" Apr 16 14:01:16.231675 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:16.231577 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-pwsbt" Apr 16 14:01:16.352016 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:16.351985 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-pwsbt"] Apr 16 14:01:16.355067 ip-10-0-130-98 kubenswrapper[2575]: W0416 14:01:16.355034 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde5bb2a7_cca1_47fc_87e5_761a06491018.slice/crio-1224f9a03d12cbab7edd41203aacd357ebb30a18d74e6fefae4625c6de94b3ea WatchSource:0}: Error finding container 1224f9a03d12cbab7edd41203aacd357ebb30a18d74e6fefae4625c6de94b3ea: Status 404 returned error can't find the container with id 1224f9a03d12cbab7edd41203aacd357ebb30a18d74e6fefae4625c6de94b3ea Apr 16 14:01:16.556656 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:16.556569 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pwsbt" event={"ID":"de5bb2a7-cca1-47fc-87e5-761a06491018","Type":"ContainerStarted","Data":"f5679183cce744755409ac1049c9f78621a98c0a972d4a27a3dd2432def790d7"} Apr 16 14:01:16.556656 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:16.556603 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pwsbt" event={"ID":"de5bb2a7-cca1-47fc-87e5-761a06491018","Type":"ContainerStarted","Data":"1224f9a03d12cbab7edd41203aacd357ebb30a18d74e6fefae4625c6de94b3ea"} Apr 16 14:01:17.560668 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:17.560632 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pwsbt" event={"ID":"de5bb2a7-cca1-47fc-87e5-761a06491018","Type":"ContainerStarted","Data":"671ae19fc0ef4411cb60149c126dd8939b853d3a9f79f252e7a867982d959b9c"} Apr 16 14:01:18.755795 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:18.755747 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-registry-tls\") pod \"image-registry-bc58db64c-htv7v\" (UID: 
\"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:18.758575 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:18.758543 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-registry-tls\") pod \"image-registry-bc58db64c-htv7v\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:18.835083 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:18.835056 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:18.966061 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:18.966030 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-bc58db64c-htv7v"] Apr 16 14:01:18.969056 ip-10-0-130-98 kubenswrapper[2575]: W0416 14:01:18.969028 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74588845_f50d_4443_a6d6_c3b0c18f9b81.slice/crio-932eb04394fcbb67863373c02cb17d82ac668334f03fbb7b743adfc2cca0943f WatchSource:0}: Error finding container 932eb04394fcbb67863373c02cb17d82ac668334f03fbb7b743adfc2cca0943f: Status 404 returned error can't find the container with id 932eb04394fcbb67863373c02cb17d82ac668334f03fbb7b743adfc2cca0943f Apr 16 14:01:19.566870 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:19.566836 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pwsbt" event={"ID":"de5bb2a7-cca1-47fc-87e5-761a06491018","Type":"ContainerStarted","Data":"d2e1ec1cea4ba0573ebf4eb4d7a6bb56f5a405c24ff61dcbb33979dbbd6d88de"} Apr 16 14:01:19.568052 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:19.568025 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-bc58db64c-htv7v" event={"ID":"74588845-f50d-4443-a6d6-c3b0c18f9b81","Type":"ContainerStarted","Data":"09c2662239de3944ad21e615cc71c6428614cf36f6d2e6f0c542483ae9d7b33a"} Apr 16 14:01:19.568222 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:19.568058 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-bc58db64c-htv7v" event={"ID":"74588845-f50d-4443-a6d6-c3b0c18f9b81","Type":"ContainerStarted","Data":"932eb04394fcbb67863373c02cb17d82ac668334f03fbb7b743adfc2cca0943f"} Apr 16 14:01:19.568222 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:19.568144 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:19.588440 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:19.588389 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-pwsbt" podStartSLOduration=9.177803469 podStartE2EDuration="11.588375133s" podCreationTimestamp="2026-04-16 14:01:08 +0000 UTC" firstStartedPulling="2026-04-16 14:01:16.421427003 +0000 UTC m=+143.757526803" lastFinishedPulling="2026-04-16 14:01:18.831998644 +0000 UTC m=+146.168098467" observedRunningTime="2026-04-16 14:01:19.586827062 +0000 UTC m=+146.922926895" watchObservedRunningTime="2026-04-16 14:01:19.588375133 +0000 UTC m=+146.924474955" Apr 16 14:01:19.603898 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:19.603848 2575 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-image-registry/image-registry-bc58db64c-htv7v" podStartSLOduration=17.603824308 podStartE2EDuration="17.603824308s" podCreationTimestamp="2026-04-16 14:01:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:01:19.602771518 +0000 UTC m=+146.938871342" watchObservedRunningTime="2026-04-16 14:01:19.603824308 +0000 UTC m=+146.939924129" Apr 16 14:01:23.192819 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:23.192780 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-metrics-certs\") pod \"router-default-b645448c4-hb2lw\" (UID: \"6892d97c-f802-4e48-b3ea-ce73b8dbbafa\") " pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:01:23.193227 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:23.192861 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-service-ca-bundle\") pod \"router-default-b645448c4-hb2lw\" (UID: \"6892d97c-f802-4e48-b3ea-ce73b8dbbafa\") " pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:01:23.193918 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:23.193900 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-service-ca-bundle\") pod \"router-default-b645448c4-hb2lw\" (UID: \"6892d97c-f802-4e48-b3ea-ce73b8dbbafa\") " pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:01:23.195189 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:23.195166 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6892d97c-f802-4e48-b3ea-ce73b8dbbafa-metrics-certs\") pod \"router-default-b645448c4-hb2lw\" (UID: \"6892d97c-f802-4e48-b3ea-ce73b8dbbafa\") " pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:01:23.472607 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:23.472530 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-vptwp\"" Apr 16 14:01:23.480466 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:23.480446 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-b645448c4-hb2lw"
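
The two "Observed pod startup duration" records a few entries above report a pair of figures: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, while podStartSLOduration appears to be that same interval with the image-pull window (firstStartedPulling to lastFinishedPulling) excluded; when nothing was pulled, the sentinel 0001-01-01 timestamps are logged and the two durations coincide, as for image-registry-bc58db64c-htv7v. A worked check in Go against the insights-runtime-extractor-pwsbt record (an illustration of the arithmetic, not the tracker's code path; the final nanoseconds differ from the logged value because the tracker samples its clocks separately):

    package main

    import (
        "fmt"
        "time"
    )

    func mustParse(s string) time.Time {
        t, err := time.Parse(time.RFC3339Nano, s)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        // Timestamps copied from the pod_startup_latency_tracker record.
        created := mustParse("2026-04-16T14:01:08Z")           // podCreationTimestamp
        running := mustParse("2026-04-16T14:01:19.588375133Z") // watchObservedRunningTime
        pullStart := mustParse("2026-04-16T14:01:16.421427003Z")
        pullEnd := mustParse("2026-04-16T14:01:18.831998644Z")

        e2e := running.Sub(created)         // 11.588375133s = podStartE2EDuration
        slo := e2e - pullEnd.Sub(pullStart) // ~9.1778s = podStartSLOduration
        fmt.Println(e2e, slo)
    }
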
Apr 16 14:01:23.616467 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:23.616443 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-b645448c4-hb2lw"] Apr 16 14:01:23.618611 ip-10-0-130-98 kubenswrapper[2575]: W0416 14:01:23.618584 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6892d97c_f802_4e48_b3ea_ce73b8dbbafa.slice/crio-6a87e0f19c91664bd0b6c31f48bd42b2d6e5d93860cbe2e208a9571110e373f4 WatchSource:0}: Error finding container 6a87e0f19c91664bd0b6c31f48bd42b2d6e5d93860cbe2e208a9571110e373f4: Status 404 returned error can't find the container with id 6a87e0f19c91664bd0b6c31f48bd42b2d6e5d93860cbe2e208a9571110e373f4 Apr 16 14:01:24.581628 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:24.581594 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-b645448c4-hb2lw" event={"ID":"6892d97c-f802-4e48-b3ea-ce73b8dbbafa","Type":"ContainerStarted","Data":"0bb705d429b8e7b857c46d07f28ece8bf318ecd569b5dcaada741b06fc3737eb"} Apr 16 14:01:24.581628 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:24.581629 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-b645448c4-hb2lw" event={"ID":"6892d97c-f802-4e48-b3ea-ce73b8dbbafa","Type":"ContainerStarted","Data":"6a87e0f19c91664bd0b6c31f48bd42b2d6e5d93860cbe2e208a9571110e373f4"} Apr 16 14:01:24.599636 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:24.599591 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-b645448c4-hb2lw" podStartSLOduration=33.599577427 podStartE2EDuration="33.599577427s" podCreationTimestamp="2026-04-16 14:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:01:24.599065084 +0000 UTC m=+151.935164907" watchObservedRunningTime="2026-04-16 14:01:24.599577427 +0000 UTC m=+151.935677251" Apr 16 14:01:25.480847 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:25.480804 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:01:25.483566 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:25.483543 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:01:25.589246 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:25.589213 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:01:25.590513 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:25.590483 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-b645448c4-hb2lw" Apr 16 14:01:26.654911 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:26.654873 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-bc58db64c-htv7v"] Apr 16 14:01:28.526115 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:28.526041 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-zhlzs" podUID="fbe5971a-6df2-42bd-b7eb-09f552154f0d" Apr 16 14:01:28.533166 ip-10-0-130-98 
kubenswrapper[2575]: E0416 14:01:28.533135 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-q9qxf" podUID="2fbaae4a-0d84-4121-bda1-d36fab54ac7d" Apr 16 14:01:28.597322 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:28.597295 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-q9qxf" Apr 16 14:01:28.597461 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:28.597448 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zhlzs" Apr 16 14:01:30.225953 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:30.225915 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-ckxrz" podUID="0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e" Apr 16 14:01:33.376415 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:33.376362 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fbe5971a-6df2-42bd-b7eb-09f552154f0d-metrics-tls\") pod \"dns-default-zhlzs\" (UID: \"fbe5971a-6df2-42bd-b7eb-09f552154f0d\") " pod="openshift-dns/dns-default-zhlzs" Apr 16 14:01:33.376805 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:33.376540 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-cert\") pod \"ingress-canary-q9qxf\" (UID: \"2fbaae4a-0d84-4121-bda1-d36fab54ac7d\") " pod="openshift-ingress-canary/ingress-canary-q9qxf" Apr 16 14:01:33.378803 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:33.378777 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fbe5971a-6df2-42bd-b7eb-09f552154f0d-metrics-tls\") pod \"dns-default-zhlzs\" (UID: \"fbe5971a-6df2-42bd-b7eb-09f552154f0d\") " pod="openshift-dns/dns-default-zhlzs" Apr 16 14:01:33.378914 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:33.378888 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fbaae4a-0d84-4121-bda1-d36fab54ac7d-cert\") pod \"ingress-canary-q9qxf\" (UID: \"2fbaae4a-0d84-4121-bda1-d36fab54ac7d\") " pod="openshift-ingress-canary/ingress-canary-q9qxf" Apr 16 14:01:33.400294 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:33.400270 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2c9fp\"" Apr 16 14:01:33.401043 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:33.401028 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wz4rp\"" Apr 16 14:01:33.407905 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:33.407891 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-q9qxf" Apr 16 14:01:33.407981 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:33.407917 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-zhlzs" Apr 16 14:01:33.536896 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:33.536839 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zhlzs"] Apr 16 14:01:33.539873 ip-10-0-130-98 kubenswrapper[2575]: W0416 14:01:33.539844 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbe5971a_6df2_42bd_b7eb_09f552154f0d.slice/crio-1403645e33d9cdf1875ea4f934ab6bfc791369132bc9d63c5a2f9f28c249422e WatchSource:0}: Error finding container 1403645e33d9cdf1875ea4f934ab6bfc791369132bc9d63c5a2f9f28c249422e: Status 404 returned error can't find the container with id 1403645e33d9cdf1875ea4f934ab6bfc791369132bc9d63c5a2f9f28c249422e Apr 16 14:01:33.550484 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:33.550450 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-q9qxf"] Apr 16 14:01:33.552664 ip-10-0-130-98 kubenswrapper[2575]: W0416 14:01:33.552640 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fbaae4a_0d84_4121_bda1_d36fab54ac7d.slice/crio-dffc981155915cf23372198810cb6b654b345157416441fae0d2378782e3b882 WatchSource:0}: Error finding container dffc981155915cf23372198810cb6b654b345157416441fae0d2378782e3b882: Status 404 returned error can't find the container with id dffc981155915cf23372198810cb6b654b345157416441fae0d2378782e3b882 Apr 16 14:01:33.610907 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:33.610875 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-q9qxf" event={"ID":"2fbaae4a-0d84-4121-bda1-d36fab54ac7d","Type":"ContainerStarted","Data":"dffc981155915cf23372198810cb6b654b345157416441fae0d2378782e3b882"} Apr 16 14:01:33.611850 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:33.611830 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zhlzs" event={"ID":"fbe5971a-6df2-42bd-b7eb-09f552154f0d","Type":"ContainerStarted","Data":"1403645e33d9cdf1875ea4f934ab6bfc791369132bc9d63c5a2f9f28c249422e"} Apr 16 14:01:35.205857 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.205830 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-78656"] Apr 16 14:01:35.211378 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.211358 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-78656"
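
Every MountVolume.SetUp failure in this excerpt has the same shape: a pod spec references a secret or configmap that its controller has not published yet, and the mount succeeds on a later retry once the object exists (metrics-tls and cert just above at 14:01:33, registry-tls earlier at 14:01:18). A sketch of the lookup the kubelet is effectively repeating, written against client-go; the kubeconfig wiring is an assumption, and the namespace and secret name reuse the registry example from the log:

    package main

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        // Assumes a kubeconfig at the default location; in-cluster config
        // would serve the same purpose for a checker running as a pod.
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        client, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }

        // The projected volume in the log references this secret; a plain
        // GET reproduces the secret "image-registry-tls" not found condition.
        s, err := client.CoreV1().Secrets("openshift-image-registry").
            Get(context.TODO(), "image-registry-tls", metav1.GetOptions{})
        if err != nil {
            fmt.Println("mount would still fail:", err)
            return
        }
        fmt.Println("mount can proceed; secret has", len(s.Data), "keys")
    }

The router's service-ca-bundle failure at 14:01:07 is the one variant: there the configmap existed but did not yet contain the service-ca.crt key, hence "configmap references non-existent config key" rather than "not found".
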
Apr 16 14:01:35.213704 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.213679 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 14:01:35.214004 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.213989 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-8tfc5\"" Apr 16 14:01:35.214151 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.214132 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 14:01:35.214963 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.214938 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 14:01:35.215065 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.214994 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 14:01:35.215065 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.215003 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 14:01:35.216811 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.216788 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-78656"] Apr 16 14:01:35.225454 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.225433 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-64pd9"] Apr 16 14:01:35.228596 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.228578 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-64pd9" Apr 16 14:01:35.232449 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.232431 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 14:01:35.232611 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.232543 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-cwdvq\"" Apr 16 14:01:35.232703 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.232623 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 14:01:35.232782 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.232720 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 14:01:35.233595 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.233575 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-2869s"] Apr 16 14:01:35.236964 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.236945 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-2869s" Apr 16 14:01:35.239371 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.239352 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 14:01:35.239481 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.239351 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 14:01:35.239481 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.239352 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-nv4cs\"" Apr 16 14:01:35.239481 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.239351 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 14:01:35.243692 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.243672 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-64pd9"] Apr 16 14:01:35.292644 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.292616 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e6ba7a4f-4060-490f-9b01-36f9ae7b1d10-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-64pd9\" (UID: \"e6ba7a4f-4060-490f-9b01-36f9ae7b1d10\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-64pd9" Apr 16 14:01:35.292800 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.292662 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d6354208-1c66-449f-90c8-56fb4357138b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-78656\" (UID: \"d6354208-1c66-449f-90c8-56fb4357138b\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-78656" Apr 16 14:01:35.292800 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.292718 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445-root\") pod \"node-exporter-2869s\" (UID: \"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445\") " pod="openshift-monitoring/node-exporter-2869s" Apr 16 14:01:35.292800 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.292771 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445-node-exporter-textfile\") pod \"node-exporter-2869s\" (UID: \"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445\") " pod="openshift-monitoring/node-exporter-2869s" Apr 16 14:01:35.292972 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.292822 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csmv2\" (UniqueName: \"kubernetes.io/projected/d6354208-1c66-449f-90c8-56fb4357138b-kube-api-access-csmv2\") pod \"openshift-state-metrics-5669946b84-78656\" (UID: \"d6354208-1c66-449f-90c8-56fb4357138b\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-78656" Apr 16 14:01:35.292972 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.292870 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445-sys\") pod \"node-exporter-2869s\" (UID: \"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445\") " pod="openshift-monitoring/node-exporter-2869s" Apr 16 14:01:35.292972 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.292941 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445-node-exporter-wtmp\") pod \"node-exporter-2869s\" (UID: \"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445\") " pod="openshift-monitoring/node-exporter-2869s" Apr 16 14:01:35.293122 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.292970 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d6354208-1c66-449f-90c8-56fb4357138b-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-78656\" (UID: \"d6354208-1c66-449f-90c8-56fb4357138b\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-78656" Apr 16 14:01:35.293122 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.292993 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445-node-exporter-accelerators-collector-config\") pod \"node-exporter-2869s\" (UID: \"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445\") " pod="openshift-monitoring/node-exporter-2869s" Apr 16 14:01:35.293122 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.293030 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2869s\" (UID: \"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445\") " pod="openshift-monitoring/node-exporter-2869s" Apr 16 14:01:35.293122 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.293062 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e6ba7a4f-4060-490f-9b01-36f9ae7b1d10-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-64pd9\" (UID: \"e6ba7a4f-4060-490f-9b01-36f9ae7b1d10\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-64pd9" Apr 16 14:01:35.293122 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.293089 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d6354208-1c66-449f-90c8-56fb4357138b-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-78656\" (UID: \"d6354208-1c66-449f-90c8-56fb4357138b\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-78656" Apr 16 14:01:35.293334 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.293137 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e6ba7a4f-4060-490f-9b01-36f9ae7b1d10-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-64pd9\" (UID: \"e6ba7a4f-4060-490f-9b01-36f9ae7b1d10\") " 
pod="openshift-monitoring/kube-state-metrics-7479c89684-64pd9" Apr 16 14:01:35.293334 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.293171 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z22c\" (UniqueName: \"kubernetes.io/projected/e6ba7a4f-4060-490f-9b01-36f9ae7b1d10-kube-api-access-4z22c\") pod \"kube-state-metrics-7479c89684-64pd9\" (UID: \"e6ba7a4f-4060-490f-9b01-36f9ae7b1d10\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-64pd9" Apr 16 14:01:35.293334 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.293207 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/e6ba7a4f-4060-490f-9b01-36f9ae7b1d10-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-64pd9\" (UID: \"e6ba7a4f-4060-490f-9b01-36f9ae7b1d10\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-64pd9" Apr 16 14:01:35.293334 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.293230 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445-metrics-client-ca\") pod \"node-exporter-2869s\" (UID: \"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445\") " pod="openshift-monitoring/node-exporter-2869s" Apr 16 14:01:35.293334 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.293253 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh97t\" (UniqueName: \"kubernetes.io/projected/fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445-kube-api-access-zh97t\") pod \"node-exporter-2869s\" (UID: \"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445\") " pod="openshift-monitoring/node-exporter-2869s" Apr 16 14:01:35.293334 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.293292 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445-node-exporter-tls\") pod \"node-exporter-2869s\" (UID: \"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445\") " pod="openshift-monitoring/node-exporter-2869s" Apr 16 14:01:35.293334 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.293332 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/e6ba7a4f-4060-490f-9b01-36f9ae7b1d10-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-64pd9\" (UID: \"e6ba7a4f-4060-490f-9b01-36f9ae7b1d10\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-64pd9" Apr 16 14:01:35.394663 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.394619 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zh97t\" (UniqueName: \"kubernetes.io/projected/fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445-kube-api-access-zh97t\") pod \"node-exporter-2869s\" (UID: \"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445\") " pod="openshift-monitoring/node-exporter-2869s" Apr 16 14:01:35.394840 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.394688 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445-node-exporter-tls\") pod \"node-exporter-2869s\" (UID: 
\"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445\") " pod="openshift-monitoring/node-exporter-2869s" Apr 16 14:01:35.394840 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.394720 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/e6ba7a4f-4060-490f-9b01-36f9ae7b1d10-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-64pd9\" (UID: \"e6ba7a4f-4060-490f-9b01-36f9ae7b1d10\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-64pd9" Apr 16 14:01:35.394840 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.394752 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e6ba7a4f-4060-490f-9b01-36f9ae7b1d10-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-64pd9\" (UID: \"e6ba7a4f-4060-490f-9b01-36f9ae7b1d10\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-64pd9" Apr 16 14:01:35.394840 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.394783 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d6354208-1c66-449f-90c8-56fb4357138b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-78656\" (UID: \"d6354208-1c66-449f-90c8-56fb4357138b\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-78656" Apr 16 14:01:35.394840 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.394836 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445-root\") pod \"node-exporter-2869s\" (UID: \"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445\") " pod="openshift-monitoring/node-exporter-2869s" Apr 16 14:01:35.395088 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.394863 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445-node-exporter-textfile\") pod \"node-exporter-2869s\" (UID: \"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445\") " pod="openshift-monitoring/node-exporter-2869s" Apr 16 14:01:35.395088 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.394895 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csmv2\" (UniqueName: \"kubernetes.io/projected/d6354208-1c66-449f-90c8-56fb4357138b-kube-api-access-csmv2\") pod \"openshift-state-metrics-5669946b84-78656\" (UID: \"d6354208-1c66-449f-90c8-56fb4357138b\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-78656" Apr 16 14:01:35.395088 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.394921 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445-sys\") pod \"node-exporter-2869s\" (UID: \"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445\") " pod="openshift-monitoring/node-exporter-2869s" Apr 16 14:01:35.395088 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.394956 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445-node-exporter-wtmp\") pod \"node-exporter-2869s\" (UID: \"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445\") " pod="openshift-monitoring/node-exporter-2869s" Apr 16 
14:01:35.395088 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.394983 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d6354208-1c66-449f-90c8-56fb4357138b-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-78656\" (UID: \"d6354208-1c66-449f-90c8-56fb4357138b\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-78656" Apr 16 14:01:35.395088 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.395008 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445-node-exporter-accelerators-collector-config\") pod \"node-exporter-2869s\" (UID: \"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445\") " pod="openshift-monitoring/node-exporter-2869s" Apr 16 14:01:35.395088 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.395042 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2869s\" (UID: \"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445\") " pod="openshift-monitoring/node-exporter-2869s" Apr 16 14:01:35.395088 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.395073 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e6ba7a4f-4060-490f-9b01-36f9ae7b1d10-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-64pd9\" (UID: \"e6ba7a4f-4060-490f-9b01-36f9ae7b1d10\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-64pd9" Apr 16 14:01:35.395088 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.395074 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445-root\") pod \"node-exporter-2869s\" (UID: \"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445\") " pod="openshift-monitoring/node-exporter-2869s" Apr 16 14:01:35.395549 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.395118 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d6354208-1c66-449f-90c8-56fb4357138b-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-78656\" (UID: \"d6354208-1c66-449f-90c8-56fb4357138b\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-78656" Apr 16 14:01:35.395549 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.395176 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e6ba7a4f-4060-490f-9b01-36f9ae7b1d10-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-64pd9\" (UID: \"e6ba7a4f-4060-490f-9b01-36f9ae7b1d10\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-64pd9" Apr 16 14:01:35.395549 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.395218 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4z22c\" (UniqueName: \"kubernetes.io/projected/e6ba7a4f-4060-490f-9b01-36f9ae7b1d10-kube-api-access-4z22c\") pod \"kube-state-metrics-7479c89684-64pd9\" (UID: \"e6ba7a4f-4060-490f-9b01-36f9ae7b1d10\") " 
pod="openshift-monitoring/kube-state-metrics-7479c89684-64pd9" Apr 16 14:01:35.395549 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.395258 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/e6ba7a4f-4060-490f-9b01-36f9ae7b1d10-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-64pd9\" (UID: \"e6ba7a4f-4060-490f-9b01-36f9ae7b1d10\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-64pd9" Apr 16 14:01:35.395549 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.395286 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445-metrics-client-ca\") pod \"node-exporter-2869s\" (UID: \"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445\") " pod="openshift-monitoring/node-exporter-2869s" Apr 16 14:01:35.395805 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:35.395661 2575 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 16 14:01:35.395805 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:35.395723 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6354208-1c66-449f-90c8-56fb4357138b-openshift-state-metrics-tls podName:d6354208-1c66-449f-90c8-56fb4357138b nodeName:}" failed. No retries permitted until 2026-04-16 14:01:35.895704148 +0000 UTC m=+163.231803951 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/d6354208-1c66-449f-90c8-56fb4357138b-openshift-state-metrics-tls") pod "openshift-state-metrics-5669946b84-78656" (UID: "d6354208-1c66-449f-90c8-56fb4357138b") : secret "openshift-state-metrics-tls" not found Apr 16 14:01:35.395938 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.395917 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445-metrics-client-ca\") pod \"node-exporter-2869s\" (UID: \"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445\") " pod="openshift-monitoring/node-exporter-2869s" Apr 16 14:01:35.396045 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:35.396028 2575 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 16 14:01:35.396120 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:35.396112 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6ba7a4f-4060-490f-9b01-36f9ae7b1d10-kube-state-metrics-tls podName:e6ba7a4f-4060-490f-9b01-36f9ae7b1d10 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:35.896079355 +0000 UTC m=+163.232179160 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/e6ba7a4f-4060-490f-9b01-36f9ae7b1d10-kube-state-metrics-tls") pod "kube-state-metrics-7479c89684-64pd9" (UID: "e6ba7a4f-4060-490f-9b01-36f9ae7b1d10") : secret "kube-state-metrics-tls" not found Apr 16 14:01:35.396686 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.396658 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/e6ba7a4f-4060-490f-9b01-36f9ae7b1d10-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-64pd9\" (UID: \"e6ba7a4f-4060-490f-9b01-36f9ae7b1d10\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-64pd9" Apr 16 14:01:35.398215 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.397271 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e6ba7a4f-4060-490f-9b01-36f9ae7b1d10-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-64pd9\" (UID: \"e6ba7a4f-4060-490f-9b01-36f9ae7b1d10\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-64pd9" Apr 16 14:01:35.398215 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.397492 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445-node-exporter-accelerators-collector-config\") pod \"node-exporter-2869s\" (UID: \"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445\") " pod="openshift-monitoring/node-exporter-2869s" Apr 16 14:01:35.398215 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.397646 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/e6ba7a4f-4060-490f-9b01-36f9ae7b1d10-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-64pd9\" (UID: \"e6ba7a4f-4060-490f-9b01-36f9ae7b1d10\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-64pd9" Apr 16 14:01:35.398215 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.397896 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445-node-exporter-textfile\") pod \"node-exporter-2869s\" (UID: \"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445\") " pod="openshift-monitoring/node-exporter-2869s" Apr 16 14:01:35.398215 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.397958 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445-sys\") pod \"node-exporter-2869s\" (UID: \"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445\") " pod="openshift-monitoring/node-exporter-2869s" Apr 16 14:01:35.398215 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.398171 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d6354208-1c66-449f-90c8-56fb4357138b-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-78656\" (UID: \"d6354208-1c66-449f-90c8-56fb4357138b\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-78656" Apr 16 14:01:35.398741 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.398719 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445-node-exporter-wtmp\") pod \"node-exporter-2869s\" (UID: \"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445\") " pod="openshift-monitoring/node-exporter-2869s" Apr 16 14:01:35.401696 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.401671 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2869s\" (UID: \"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445\") " pod="openshift-monitoring/node-exporter-2869s" Apr 16 14:01:35.402087 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.402061 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d6354208-1c66-449f-90c8-56fb4357138b-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-78656\" (UID: \"d6354208-1c66-449f-90c8-56fb4357138b\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-78656" Apr 16 14:01:35.403950 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.403903 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh97t\" (UniqueName: \"kubernetes.io/projected/fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445-kube-api-access-zh97t\") pod \"node-exporter-2869s\" (UID: \"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445\") " pod="openshift-monitoring/node-exporter-2869s" Apr 16 14:01:35.405009 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.404983 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z22c\" (UniqueName: \"kubernetes.io/projected/e6ba7a4f-4060-490f-9b01-36f9ae7b1d10-kube-api-access-4z22c\") pod \"kube-state-metrics-7479c89684-64pd9\" (UID: \"e6ba7a4f-4060-490f-9b01-36f9ae7b1d10\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-64pd9" Apr 16 14:01:35.406498 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.406433 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445-node-exporter-tls\") pod \"node-exporter-2869s\" (UID: \"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445\") " pod="openshift-monitoring/node-exporter-2869s" Apr 16 14:01:35.407966 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.407924 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e6ba7a4f-4060-490f-9b01-36f9ae7b1d10-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-64pd9\" (UID: \"e6ba7a4f-4060-490f-9b01-36f9ae7b1d10\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-64pd9" Apr 16 14:01:35.412059 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.412024 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-csmv2\" (UniqueName: \"kubernetes.io/projected/d6354208-1c66-449f-90c8-56fb4357138b-kube-api-access-csmv2\") pod \"openshift-state-metrics-5669946b84-78656\" (UID: \"d6354208-1c66-449f-90c8-56fb4357138b\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-78656" Apr 16 14:01:35.548538 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.548460 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-2869s" Apr 16 14:01:35.561940 ip-10-0-130-98 kubenswrapper[2575]: W0416 14:01:35.561831 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdc2f3a2_aa77_4d1b_9e82_b2a46a5c6445.slice/crio-7abe0971b95e0df868570fa48737e95205084cb330892277f08e8465dca329ab WatchSource:0}: Error finding container 7abe0971b95e0df868570fa48737e95205084cb330892277f08e8465dca329ab: Status 404 returned error can't find the container with id 7abe0971b95e0df868570fa48737e95205084cb330892277f08e8465dca329ab Apr 16 14:01:35.620499 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.620445 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2869s" event={"ID":"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445","Type":"ContainerStarted","Data":"7abe0971b95e0df868570fa48737e95205084cb330892277f08e8465dca329ab"} Apr 16 14:01:35.899823 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.899783 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e6ba7a4f-4060-490f-9b01-36f9ae7b1d10-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-64pd9\" (UID: \"e6ba7a4f-4060-490f-9b01-36f9ae7b1d10\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-64pd9" Apr 16 14:01:35.900012 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.899941 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d6354208-1c66-449f-90c8-56fb4357138b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-78656\" (UID: \"d6354208-1c66-449f-90c8-56fb4357138b\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-78656" Apr 16 14:01:35.900103 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:35.900073 2575 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 16 14:01:35.900183 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:35.900170 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6354208-1c66-449f-90c8-56fb4357138b-openshift-state-metrics-tls podName:d6354208-1c66-449f-90c8-56fb4357138b nodeName:}" failed. No retries permitted until 2026-04-16 14:01:36.900149206 +0000 UTC m=+164.236249007 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/d6354208-1c66-449f-90c8-56fb4357138b-openshift-state-metrics-tls") pod "openshift-state-metrics-5669946b84-78656" (UID: "d6354208-1c66-449f-90c8-56fb4357138b") : secret "openshift-state-metrics-tls" not found Apr 16 14:01:35.903139 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:35.903072 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e6ba7a4f-4060-490f-9b01-36f9ae7b1d10-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-64pd9\" (UID: \"e6ba7a4f-4060-490f-9b01-36f9ae7b1d10\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-64pd9" Apr 16 14:01:36.140125 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:36.140080 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-64pd9" Apr 16 14:01:36.660951 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:36.660908 2575 patch_prober.go:28] interesting pod/image-registry-bc58db64c-htv7v container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 14:01:36.661352 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:36.660972 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-bc58db64c-htv7v" podUID="74588845-f50d-4443-a6d6-c3b0c18f9b81" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:01:36.909157 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:36.909125 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d6354208-1c66-449f-90c8-56fb4357138b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-78656\" (UID: \"d6354208-1c66-449f-90c8-56fb4357138b\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-78656" Apr 16 14:01:36.911669 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:36.911607 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d6354208-1c66-449f-90c8-56fb4357138b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-78656\" (UID: \"d6354208-1c66-449f-90c8-56fb4357138b\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-78656" Apr 16 14:01:37.023257 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:37.023214 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-78656" Apr 16 14:01:37.161208 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:37.161176 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-64pd9"] Apr 16 14:01:37.173354 ip-10-0-130-98 kubenswrapper[2575]: W0416 14:01:37.173323 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6ba7a4f_4060_490f_9b01_36f9ae7b1d10.slice/crio-f2817837af2279004991dcf24ed990b8e1452e014ca828745a9f1015fd618e66 WatchSource:0}: Error finding container f2817837af2279004991dcf24ed990b8e1452e014ca828745a9f1015fd618e66: Status 404 returned error can't find the container with id f2817837af2279004991dcf24ed990b8e1452e014ca828745a9f1015fd618e66 Apr 16 14:01:37.179201 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:37.179114 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-78656"] Apr 16 14:01:37.187235 ip-10-0-130-98 kubenswrapper[2575]: W0416 14:01:37.186811 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6354208_1c66_449f_90c8_56fb4357138b.slice/crio-008e3cf4bc4c71c0dfa9c8289a22be264ce7a08138c2232834c1c47022c4e621 WatchSource:0}: Error finding container 008e3cf4bc4c71c0dfa9c8289a22be264ce7a08138c2232834c1c47022c4e621: Status 404 returned error can't find the container with id 008e3cf4bc4c71c0dfa9c8289a22be264ce7a08138c2232834c1c47022c4e621 Apr 16 14:01:37.627815 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:37.627781 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-q9qxf" event={"ID":"2fbaae4a-0d84-4121-bda1-d36fab54ac7d","Type":"ContainerStarted","Data":"10865cd5cea55050b251ba887e0aa69222f7a812cb6020480c4da55d20835936"} Apr 16 14:01:37.629352 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:37.629324 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-78656" event={"ID":"d6354208-1c66-449f-90c8-56fb4357138b","Type":"ContainerStarted","Data":"f4600d483ae757a8ff9a3588e5d7ef7f0219a933e007e5dc354e24def9de503e"} Apr 16 14:01:37.629449 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:37.629354 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-78656" event={"ID":"d6354208-1c66-449f-90c8-56fb4357138b","Type":"ContainerStarted","Data":"64922606736bffe4190fc3b1bfe98f3861c0c826e6ecae6171c85c944d4d8dea"} Apr 16 14:01:37.629449 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:37.629366 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-78656" event={"ID":"d6354208-1c66-449f-90c8-56fb4357138b","Type":"ContainerStarted","Data":"008e3cf4bc4c71c0dfa9c8289a22be264ce7a08138c2232834c1c47022c4e621"} Apr 16 14:01:37.630755 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:37.630727 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zhlzs" event={"ID":"fbe5971a-6df2-42bd-b7eb-09f552154f0d","Type":"ContainerStarted","Data":"a744a2736f67bd471b35eddbced40e31ed889a87ba9644140b600ac71e9fd9db"} Apr 16 14:01:37.630871 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:37.630757 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zhlzs" 
event={"ID":"fbe5971a-6df2-42bd-b7eb-09f552154f0d","Type":"ContainerStarted","Data":"df1b1fe9e2c3e6f0dda164e223c890de320edb6e19fea7089ccd6381cb6dae86"} Apr 16 14:01:37.630937 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:37.630922 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-zhlzs" Apr 16 14:01:37.632007 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:37.631985 2575 generic.go:358] "Generic (PLEG): container finished" podID="fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445" containerID="c7bb72aef48db78c8026f0c178ed5b3eaca8ccb610a60a0a8bc380461fd16dfc" exitCode=0 Apr 16 14:01:37.632127 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:37.632052 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2869s" event={"ID":"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445","Type":"ContainerDied","Data":"c7bb72aef48db78c8026f0c178ed5b3eaca8ccb610a60a0a8bc380461fd16dfc"} Apr 16 14:01:37.633022 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:37.633004 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-64pd9" event={"ID":"e6ba7a4f-4060-490f-9b01-36f9ae7b1d10","Type":"ContainerStarted","Data":"f2817837af2279004991dcf24ed990b8e1452e014ca828745a9f1015fd618e66"} Apr 16 14:01:37.645120 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:37.645044 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-q9qxf" podStartSLOduration=129.1825904 podStartE2EDuration="2m12.64503145s" podCreationTimestamp="2026-04-16 13:59:25 +0000 UTC" firstStartedPulling="2026-04-16 14:01:33.554324632 +0000 UTC m=+160.890424431" lastFinishedPulling="2026-04-16 14:01:37.016765678 +0000 UTC m=+164.352865481" observedRunningTime="2026-04-16 14:01:37.643681709 +0000 UTC m=+164.979781531" watchObservedRunningTime="2026-04-16 14:01:37.64503145 +0000 UTC m=+164.981131271" Apr 16 14:01:37.675306 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:37.675265 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-zhlzs" podStartSLOduration=129.200087967 podStartE2EDuration="2m12.675246837s" podCreationTimestamp="2026-04-16 13:59:25 +0000 UTC" firstStartedPulling="2026-04-16 14:01:33.541605286 +0000 UTC m=+160.877705100" lastFinishedPulling="2026-04-16 14:01:37.016764167 +0000 UTC m=+164.352863970" observedRunningTime="2026-04-16 14:01:37.674278967 +0000 UTC m=+165.010378791" watchObservedRunningTime="2026-04-16 14:01:37.675246837 +0000 UTC m=+165.011346661" Apr 16 14:01:38.640800 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:38.640760 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2869s" event={"ID":"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445","Type":"ContainerStarted","Data":"29caf73e3cf3c7f030a6db343a1a64161321c4d6cdd9c8c24990a60e89ed75c3"} Apr 16 14:01:38.640800 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:38.640806 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2869s" event={"ID":"fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445","Type":"ContainerStarted","Data":"9a04a4bdf61bf2027745f9a7c5f8e37472c11d222d444f0527633828f47db57b"} Apr 16 14:01:38.659444 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:38.659387 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-2869s" podStartSLOduration=2.154866553 podStartE2EDuration="3.659370228s" 
podCreationTimestamp="2026-04-16 14:01:35 +0000 UTC" firstStartedPulling="2026-04-16 14:01:35.564924429 +0000 UTC m=+162.901024236" lastFinishedPulling="2026-04-16 14:01:37.069428108 +0000 UTC m=+164.405527911" observedRunningTime="2026-04-16 14:01:38.658961733 +0000 UTC m=+165.995061565" watchObservedRunningTime="2026-04-16 14:01:38.659370228 +0000 UTC m=+165.995470031" Apr 16 14:01:39.646200 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:39.646159 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-64pd9" event={"ID":"e6ba7a4f-4060-490f-9b01-36f9ae7b1d10","Type":"ContainerStarted","Data":"a5df193465edcebbacb9106e01cfe9312d50606e11ea8ef4061c145fc9bbab84"} Apr 16 14:01:39.646200 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:39.646205 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-64pd9" event={"ID":"e6ba7a4f-4060-490f-9b01-36f9ae7b1d10","Type":"ContainerStarted","Data":"5e9a543fc17249ec24a3cc5cf8cb8548bb23ded8c5867af13959fad17e847890"} Apr 16 14:01:39.646754 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:39.646228 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-64pd9" event={"ID":"e6ba7a4f-4060-490f-9b01-36f9ae7b1d10","Type":"ContainerStarted","Data":"78a4b4f622fa04715513dafcadde36d954acc952817667b1ff4920992609693f"} Apr 16 14:01:39.663761 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:39.663699 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-64pd9" podStartSLOduration=3.146575596 podStartE2EDuration="4.66368139s" podCreationTimestamp="2026-04-16 14:01:35 +0000 UTC" firstStartedPulling="2026-04-16 14:01:37.175638866 +0000 UTC m=+164.511738672" lastFinishedPulling="2026-04-16 14:01:38.692744663 +0000 UTC m=+166.028844466" observedRunningTime="2026-04-16 14:01:39.662497245 +0000 UTC m=+166.998597081" watchObservedRunningTime="2026-04-16 14:01:39.66368139 +0000 UTC m=+166.999781213" Apr 16 14:01:40.651174 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:40.651141 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-78656" event={"ID":"d6354208-1c66-449f-90c8-56fb4357138b","Type":"ContainerStarted","Data":"1f87e8c2aecc4aa698eebc6419cf107cd98ef0da921699cb12d2c7b6e3479e10"} Apr 16 14:01:40.668959 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:40.668911 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-78656" podStartSLOduration=3.049178829 podStartE2EDuration="5.668895699s" podCreationTimestamp="2026-04-16 14:01:35 +0000 UTC" firstStartedPulling="2026-04-16 14:01:37.366538692 +0000 UTC m=+164.702638492" lastFinishedPulling="2026-04-16 14:01:39.986255558 +0000 UTC m=+167.322355362" observedRunningTime="2026-04-16 14:01:40.666995129 +0000 UTC m=+168.003094952" watchObservedRunningTime="2026-04-16 14:01:40.668895699 +0000 UTC m=+168.004995514" Apr 16 14:01:43.203754 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:43.203676 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 14:01:46.660262 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:46.660212 2575 patch_prober.go:28] interesting pod/image-registry-bc58db64c-htv7v container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 14:01:46.660774 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:46.660287 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-bc58db64c-htv7v" podUID="74588845-f50d-4443-a6d6-c3b0c18f9b81" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:01:47.642932 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:47.642897 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-zhlzs" Apr 16 14:01:51.673141 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:51.673057 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-bc58db64c-htv7v" podUID="74588845-f50d-4443-a6d6-c3b0c18f9b81" containerName="registry" containerID="cri-o://09c2662239de3944ad21e615cc71c6428614cf36f6d2e6f0c542483ae9d7b33a" gracePeriod=30 Apr 16 14:01:51.902478 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:51.902455 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:52.057273 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.057184 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/74588845-f50d-4443-a6d6-c3b0c18f9b81-ca-trust-extracted\") pod \"74588845-f50d-4443-a6d6-c3b0c18f9b81\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " Apr 16 14:01:52.057273 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.057235 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/74588845-f50d-4443-a6d6-c3b0c18f9b81-registry-certificates\") pod \"74588845-f50d-4443-a6d6-c3b0c18f9b81\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " Apr 16 14:01:52.057487 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.057279 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/74588845-f50d-4443-a6d6-c3b0c18f9b81-image-registry-private-configuration\") pod \"74588845-f50d-4443-a6d6-c3b0c18f9b81\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " Apr 16 14:01:52.057487 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.057313 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbp6j\" (UniqueName: \"kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-kube-api-access-wbp6j\") pod \"74588845-f50d-4443-a6d6-c3b0c18f9b81\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " Apr 16 14:01:52.057599 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.057487 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-bound-sa-token\") pod \"74588845-f50d-4443-a6d6-c3b0c18f9b81\" (UID: 
\"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " Apr 16 14:01:52.057599 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.057551 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-registry-tls\") pod \"74588845-f50d-4443-a6d6-c3b0c18f9b81\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " Apr 16 14:01:52.057599 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.057582 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/74588845-f50d-4443-a6d6-c3b0c18f9b81-installation-pull-secrets\") pod \"74588845-f50d-4443-a6d6-c3b0c18f9b81\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " Apr 16 14:01:52.057731 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.057615 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74588845-f50d-4443-a6d6-c3b0c18f9b81-trusted-ca\") pod \"74588845-f50d-4443-a6d6-c3b0c18f9b81\" (UID: \"74588845-f50d-4443-a6d6-c3b0c18f9b81\") " Apr 16 14:01:52.057844 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.057813 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74588845-f50d-4443-a6d6-c3b0c18f9b81-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "74588845-f50d-4443-a6d6-c3b0c18f9b81" (UID: "74588845-f50d-4443-a6d6-c3b0c18f9b81"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:01:52.058376 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.058346 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74588845-f50d-4443-a6d6-c3b0c18f9b81-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "74588845-f50d-4443-a6d6-c3b0c18f9b81" (UID: "74588845-f50d-4443-a6d6-c3b0c18f9b81"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:01:52.059830 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.059793 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "74588845-f50d-4443-a6d6-c3b0c18f9b81" (UID: "74588845-f50d-4443-a6d6-c3b0c18f9b81"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:01:52.059917 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.059858 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74588845-f50d-4443-a6d6-c3b0c18f9b81-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "74588845-f50d-4443-a6d6-c3b0c18f9b81" (UID: "74588845-f50d-4443-a6d6-c3b0c18f9b81"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:01:52.059983 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.059921 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74588845-f50d-4443-a6d6-c3b0c18f9b81-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "74588845-f50d-4443-a6d6-c3b0c18f9b81" (UID: "74588845-f50d-4443-a6d6-c3b0c18f9b81"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:01:52.059983 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.059927 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-kube-api-access-wbp6j" (OuterVolumeSpecName: "kube-api-access-wbp6j") pod "74588845-f50d-4443-a6d6-c3b0c18f9b81" (UID: "74588845-f50d-4443-a6d6-c3b0c18f9b81"). InnerVolumeSpecName "kube-api-access-wbp6j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:01:52.060302 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.060283 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "74588845-f50d-4443-a6d6-c3b0c18f9b81" (UID: "74588845-f50d-4443-a6d6-c3b0c18f9b81"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:01:52.065929 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.065905 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74588845-f50d-4443-a6d6-c3b0c18f9b81-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "74588845-f50d-4443-a6d6-c3b0c18f9b81" (UID: "74588845-f50d-4443-a6d6-c3b0c18f9b81"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:01:52.158471 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.158418 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-registry-tls\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:01:52.158471 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.158465 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/74588845-f50d-4443-a6d6-c3b0c18f9b81-installation-pull-secrets\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:01:52.158471 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.158479 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74588845-f50d-4443-a6d6-c3b0c18f9b81-trusted-ca\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:01:52.158471 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.158488 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/74588845-f50d-4443-a6d6-c3b0c18f9b81-ca-trust-extracted\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:01:52.158738 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.158497 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/74588845-f50d-4443-a6d6-c3b0c18f9b81-registry-certificates\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:01:52.158738 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.158506 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/74588845-f50d-4443-a6d6-c3b0c18f9b81-image-registry-private-configuration\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:01:52.158738 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.158516 2575 reconciler_common.go:299] "Volume detached for volume 
\"kube-api-access-wbp6j\" (UniqueName: \"kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-kube-api-access-wbp6j\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:01:52.158738 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.158525 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74588845-f50d-4443-a6d6-c3b0c18f9b81-bound-sa-token\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:01:52.683951 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.683912 2575 generic.go:358] "Generic (PLEG): container finished" podID="74588845-f50d-4443-a6d6-c3b0c18f9b81" containerID="09c2662239de3944ad21e615cc71c6428614cf36f6d2e6f0c542483ae9d7b33a" exitCode=0 Apr 16 14:01:52.684364 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.683983 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-bc58db64c-htv7v" Apr 16 14:01:52.684364 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.684000 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-bc58db64c-htv7v" event={"ID":"74588845-f50d-4443-a6d6-c3b0c18f9b81","Type":"ContainerDied","Data":"09c2662239de3944ad21e615cc71c6428614cf36f6d2e6f0c542483ae9d7b33a"} Apr 16 14:01:52.684364 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.684040 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-bc58db64c-htv7v" event={"ID":"74588845-f50d-4443-a6d6-c3b0c18f9b81","Type":"ContainerDied","Data":"932eb04394fcbb67863373c02cb17d82ac668334f03fbb7b743adfc2cca0943f"} Apr 16 14:01:52.684364 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.684057 2575 scope.go:117] "RemoveContainer" containerID="09c2662239de3944ad21e615cc71c6428614cf36f6d2e6f0c542483ae9d7b33a" Apr 16 14:01:52.691999 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.691974 2575 scope.go:117] "RemoveContainer" containerID="09c2662239de3944ad21e615cc71c6428614cf36f6d2e6f0c542483ae9d7b33a" Apr 16 14:01:52.692284 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:01:52.692250 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09c2662239de3944ad21e615cc71c6428614cf36f6d2e6f0c542483ae9d7b33a\": container with ID starting with 09c2662239de3944ad21e615cc71c6428614cf36f6d2e6f0c542483ae9d7b33a not found: ID does not exist" containerID="09c2662239de3944ad21e615cc71c6428614cf36f6d2e6f0c542483ae9d7b33a" Apr 16 14:01:52.692332 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.692292 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09c2662239de3944ad21e615cc71c6428614cf36f6d2e6f0c542483ae9d7b33a"} err="failed to get container status \"09c2662239de3944ad21e615cc71c6428614cf36f6d2e6f0c542483ae9d7b33a\": rpc error: code = NotFound desc = could not find container \"09c2662239de3944ad21e615cc71c6428614cf36f6d2e6f0c542483ae9d7b33a\": container with ID starting with 09c2662239de3944ad21e615cc71c6428614cf36f6d2e6f0c542483ae9d7b33a not found: ID does not exist" Apr 16 14:01:52.702115 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.702068 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-66db9bd8b5-42xrj"] Apr 16 14:01:52.702355 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.702342 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="74588845-f50d-4443-a6d6-c3b0c18f9b81" containerName="registry" Apr 16 14:01:52.702398 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.702357 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="74588845-f50d-4443-a6d6-c3b0c18f9b81" containerName="registry" Apr 16 14:01:52.702435 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.702397 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="74588845-f50d-4443-a6d6-c3b0c18f9b81" containerName="registry" Apr 16 14:01:52.709069 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.709050 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66db9bd8b5-42xrj" Apr 16 14:01:52.709754 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.709724 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-bc58db64c-htv7v"] Apr 16 14:01:52.711579 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.711563 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 14:01:52.711731 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.711715 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 14:01:52.711798 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.711732 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 14:01:52.712127 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.712088 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 14:01:52.712221 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.712174 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 14:01:52.712221 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.712193 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 14:01:52.712461 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.712444 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 14:01:52.712555 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.712534 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-5mpjr\"" Apr 16 14:01:52.719022 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.718059 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66db9bd8b5-42xrj"] Apr 16 14:01:52.721897 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.720557 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-bc58db64c-htv7v"] Apr 16 14:01:52.863053 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.863012 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-console-config\") pod \"console-66db9bd8b5-42xrj\" (UID: \"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\") " pod="openshift-console/console-66db9bd8b5-42xrj" Apr 16 14:01:52.863053 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.863058 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-console-oauth-config\") pod \"console-66db9bd8b5-42xrj\" (UID: \"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\") " pod="openshift-console/console-66db9bd8b5-42xrj" Apr 16 14:01:52.863257 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.863130 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-service-ca\") pod \"console-66db9bd8b5-42xrj\" (UID: \"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\") " pod="openshift-console/console-66db9bd8b5-42xrj" Apr 16 14:01:52.863257 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.863200 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47t9s\" (UniqueName: \"kubernetes.io/projected/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-kube-api-access-47t9s\") pod \"console-66db9bd8b5-42xrj\" (UID: \"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\") " pod="openshift-console/console-66db9bd8b5-42xrj" Apr 16 14:01:52.863257 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.863236 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-oauth-serving-cert\") pod \"console-66db9bd8b5-42xrj\" (UID: \"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\") " pod="openshift-console/console-66db9bd8b5-42xrj" Apr 16 14:01:52.863353 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.863263 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-console-serving-cert\") pod \"console-66db9bd8b5-42xrj\" (UID: \"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\") " pod="openshift-console/console-66db9bd8b5-42xrj" Apr 16 14:01:52.964474 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.964383 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-console-config\") pod \"console-66db9bd8b5-42xrj\" (UID: \"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\") " pod="openshift-console/console-66db9bd8b5-42xrj" Apr 16 14:01:52.964474 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.964426 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-console-oauth-config\") pod \"console-66db9bd8b5-42xrj\" (UID: \"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\") " pod="openshift-console/console-66db9bd8b5-42xrj" Apr 16 14:01:52.964474 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.964449 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-service-ca\") pod \"console-66db9bd8b5-42xrj\" (UID: \"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\") " pod="openshift-console/console-66db9bd8b5-42xrj" Apr 16 14:01:52.964474 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.964474 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47t9s\" (UniqueName: \"kubernetes.io/projected/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-kube-api-access-47t9s\") pod \"console-66db9bd8b5-42xrj\" (UID: 
\"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\") " pod="openshift-console/console-66db9bd8b5-42xrj" Apr 16 14:01:52.964716 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.964500 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-oauth-serving-cert\") pod \"console-66db9bd8b5-42xrj\" (UID: \"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\") " pod="openshift-console/console-66db9bd8b5-42xrj" Apr 16 14:01:52.964716 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.964522 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-console-serving-cert\") pod \"console-66db9bd8b5-42xrj\" (UID: \"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\") " pod="openshift-console/console-66db9bd8b5-42xrj" Apr 16 14:01:52.965269 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.965242 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-service-ca\") pod \"console-66db9bd8b5-42xrj\" (UID: \"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\") " pod="openshift-console/console-66db9bd8b5-42xrj" Apr 16 14:01:52.965371 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.965242 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-console-config\") pod \"console-66db9bd8b5-42xrj\" (UID: \"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\") " pod="openshift-console/console-66db9bd8b5-42xrj" Apr 16 14:01:52.965371 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.965288 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-oauth-serving-cert\") pod \"console-66db9bd8b5-42xrj\" (UID: \"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\") " pod="openshift-console/console-66db9bd8b5-42xrj" Apr 16 14:01:52.966997 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.966980 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-console-oauth-config\") pod \"console-66db9bd8b5-42xrj\" (UID: \"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\") " pod="openshift-console/console-66db9bd8b5-42xrj" Apr 16 14:01:52.967089 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.966995 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-console-serving-cert\") pod \"console-66db9bd8b5-42xrj\" (UID: \"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\") " pod="openshift-console/console-66db9bd8b5-42xrj" Apr 16 14:01:52.972199 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:52.972175 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-47t9s\" (UniqueName: \"kubernetes.io/projected/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-kube-api-access-47t9s\") pod \"console-66db9bd8b5-42xrj\" (UID: \"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\") " pod="openshift-console/console-66db9bd8b5-42xrj" Apr 16 14:01:53.020633 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:53.020598 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66db9bd8b5-42xrj" Apr 16 14:01:53.141143 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:53.141108 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66db9bd8b5-42xrj"] Apr 16 14:01:53.144699 ip-10-0-130-98 kubenswrapper[2575]: W0416 14:01:53.144674 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c764976_5174_4a5d_ad3f_7c879e6b6ad6.slice/crio-7c19633d5d5a2a077f45d17fb44e12c55b657cb3e240019348ddc73cf1981edf WatchSource:0}: Error finding container 7c19633d5d5a2a077f45d17fb44e12c55b657cb3e240019348ddc73cf1981edf: Status 404 returned error can't find the container with id 7c19633d5d5a2a077f45d17fb44e12c55b657cb3e240019348ddc73cf1981edf Apr 16 14:01:53.205767 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:53.205739 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74588845-f50d-4443-a6d6-c3b0c18f9b81" path="/var/lib/kubelet/pods/74588845-f50d-4443-a6d6-c3b0c18f9b81/volumes" Apr 16 14:01:53.689036 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:53.688970 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66db9bd8b5-42xrj" event={"ID":"3c764976-5174-4a5d-ad3f-7c879e6b6ad6","Type":"ContainerStarted","Data":"7c19633d5d5a2a077f45d17fb44e12c55b657cb3e240019348ddc73cf1981edf"} Apr 16 14:01:56.697561 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:56.697524 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66db9bd8b5-42xrj" event={"ID":"3c764976-5174-4a5d-ad3f-7c879e6b6ad6","Type":"ContainerStarted","Data":"6d4813dda58dc8ec18535fd78c46ce45552381feec74940d065b396067d7cc53"} Apr 16 14:01:56.719063 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:56.719012 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-66db9bd8b5-42xrj" podStartSLOduration=2.130864377 podStartE2EDuration="4.718996397s" podCreationTimestamp="2026-04-16 14:01:52 +0000 UTC" firstStartedPulling="2026-04-16 14:01:53.146513305 +0000 UTC m=+180.482613105" lastFinishedPulling="2026-04-16 14:01:55.734645313 +0000 UTC m=+183.070745125" observedRunningTime="2026-04-16 14:01:56.718067652 +0000 UTC m=+184.054167475" watchObservedRunningTime="2026-04-16 14:01:56.718996397 +0000 UTC m=+184.055096218" Apr 16 14:01:59.892410 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:59.892374 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-95b87756b-fm8b6"] Apr 16 14:01:59.896441 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:59.896423 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-95b87756b-fm8b6" Apr 16 14:01:59.903803 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:59.903585 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 14:01:59.906737 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:01:59.906714 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-95b87756b-fm8b6"] Apr 16 14:02:00.023807 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:00.023767 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grmqm\" (UniqueName: \"kubernetes.io/projected/68b39da3-d650-46a9-a633-504de8fa67d7-kube-api-access-grmqm\") pod \"console-95b87756b-fm8b6\" (UID: \"68b39da3-d650-46a9-a633-504de8fa67d7\") " pod="openshift-console/console-95b87756b-fm8b6" Apr 16 14:02:00.023992 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:00.023821 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/68b39da3-d650-46a9-a633-504de8fa67d7-console-serving-cert\") pod \"console-95b87756b-fm8b6\" (UID: \"68b39da3-d650-46a9-a633-504de8fa67d7\") " pod="openshift-console/console-95b87756b-fm8b6" Apr 16 14:02:00.023992 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:00.023889 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/68b39da3-d650-46a9-a633-504de8fa67d7-oauth-serving-cert\") pod \"console-95b87756b-fm8b6\" (UID: \"68b39da3-d650-46a9-a633-504de8fa67d7\") " pod="openshift-console/console-95b87756b-fm8b6" Apr 16 14:02:00.023992 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:00.023963 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68b39da3-d650-46a9-a633-504de8fa67d7-trusted-ca-bundle\") pod \"console-95b87756b-fm8b6\" (UID: \"68b39da3-d650-46a9-a633-504de8fa67d7\") " pod="openshift-console/console-95b87756b-fm8b6" Apr 16 14:02:00.024135 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:00.023994 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/68b39da3-d650-46a9-a633-504de8fa67d7-console-config\") pod \"console-95b87756b-fm8b6\" (UID: \"68b39da3-d650-46a9-a633-504de8fa67d7\") " pod="openshift-console/console-95b87756b-fm8b6" Apr 16 14:02:00.024196 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:00.024022 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/68b39da3-d650-46a9-a633-504de8fa67d7-console-oauth-config\") pod \"console-95b87756b-fm8b6\" (UID: \"68b39da3-d650-46a9-a633-504de8fa67d7\") " pod="openshift-console/console-95b87756b-fm8b6" Apr 16 14:02:00.028034 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:00.024499 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/68b39da3-d650-46a9-a633-504de8fa67d7-service-ca\") pod \"console-95b87756b-fm8b6\" (UID: \"68b39da3-d650-46a9-a633-504de8fa67d7\") " pod="openshift-console/console-95b87756b-fm8b6" Apr 16 14:02:00.124854 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:00.124803 
2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/68b39da3-d650-46a9-a633-504de8fa67d7-console-oauth-config\") pod \"console-95b87756b-fm8b6\" (UID: \"68b39da3-d650-46a9-a633-504de8fa67d7\") " pod="openshift-console/console-95b87756b-fm8b6" Apr 16 14:02:00.125055 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:00.124881 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/68b39da3-d650-46a9-a633-504de8fa67d7-service-ca\") pod \"console-95b87756b-fm8b6\" (UID: \"68b39da3-d650-46a9-a633-504de8fa67d7\") " pod="openshift-console/console-95b87756b-fm8b6" Apr 16 14:02:00.125055 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:00.124923 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grmqm\" (UniqueName: \"kubernetes.io/projected/68b39da3-d650-46a9-a633-504de8fa67d7-kube-api-access-grmqm\") pod \"console-95b87756b-fm8b6\" (UID: \"68b39da3-d650-46a9-a633-504de8fa67d7\") " pod="openshift-console/console-95b87756b-fm8b6" Apr 16 14:02:00.125055 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:00.124967 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/68b39da3-d650-46a9-a633-504de8fa67d7-console-serving-cert\") pod \"console-95b87756b-fm8b6\" (UID: \"68b39da3-d650-46a9-a633-504de8fa67d7\") " pod="openshift-console/console-95b87756b-fm8b6" Apr 16 14:02:00.125055 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:00.125003 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/68b39da3-d650-46a9-a633-504de8fa67d7-oauth-serving-cert\") pod \"console-95b87756b-fm8b6\" (UID: \"68b39da3-d650-46a9-a633-504de8fa67d7\") " pod="openshift-console/console-95b87756b-fm8b6" Apr 16 14:02:00.125055 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:00.125044 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68b39da3-d650-46a9-a633-504de8fa67d7-trusted-ca-bundle\") pod \"console-95b87756b-fm8b6\" (UID: \"68b39da3-d650-46a9-a633-504de8fa67d7\") " pod="openshift-console/console-95b87756b-fm8b6" Apr 16 14:02:00.125370 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:00.125241 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/68b39da3-d650-46a9-a633-504de8fa67d7-console-config\") pod \"console-95b87756b-fm8b6\" (UID: \"68b39da3-d650-46a9-a633-504de8fa67d7\") " pod="openshift-console/console-95b87756b-fm8b6" Apr 16 14:02:00.125711 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:00.125686 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/68b39da3-d650-46a9-a633-504de8fa67d7-service-ca\") pod \"console-95b87756b-fm8b6\" (UID: \"68b39da3-d650-46a9-a633-504de8fa67d7\") " pod="openshift-console/console-95b87756b-fm8b6" Apr 16 14:02:00.125839 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:00.125814 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/68b39da3-d650-46a9-a633-504de8fa67d7-oauth-serving-cert\") pod \"console-95b87756b-fm8b6\" (UID: \"68b39da3-d650-46a9-a633-504de8fa67d7\") " 
pod="openshift-console/console-95b87756b-fm8b6" Apr 16 14:02:00.125900 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:00.125871 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68b39da3-d650-46a9-a633-504de8fa67d7-trusted-ca-bundle\") pod \"console-95b87756b-fm8b6\" (UID: \"68b39da3-d650-46a9-a633-504de8fa67d7\") " pod="openshift-console/console-95b87756b-fm8b6" Apr 16 14:02:00.125957 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:00.125936 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/68b39da3-d650-46a9-a633-504de8fa67d7-console-config\") pod \"console-95b87756b-fm8b6\" (UID: \"68b39da3-d650-46a9-a633-504de8fa67d7\") " pod="openshift-console/console-95b87756b-fm8b6" Apr 16 14:02:00.127959 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:00.127933 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/68b39da3-d650-46a9-a633-504de8fa67d7-console-oauth-config\") pod \"console-95b87756b-fm8b6\" (UID: \"68b39da3-d650-46a9-a633-504de8fa67d7\") " pod="openshift-console/console-95b87756b-fm8b6" Apr 16 14:02:00.128045 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:00.127939 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/68b39da3-d650-46a9-a633-504de8fa67d7-console-serving-cert\") pod \"console-95b87756b-fm8b6\" (UID: \"68b39da3-d650-46a9-a633-504de8fa67d7\") " pod="openshift-console/console-95b87756b-fm8b6" Apr 16 14:02:00.132366 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:00.132348 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-grmqm\" (UniqueName: \"kubernetes.io/projected/68b39da3-d650-46a9-a633-504de8fa67d7-kube-api-access-grmqm\") pod \"console-95b87756b-fm8b6\" (UID: \"68b39da3-d650-46a9-a633-504de8fa67d7\") " pod="openshift-console/console-95b87756b-fm8b6" Apr 16 14:02:00.206516 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:00.206423 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-95b87756b-fm8b6" Apr 16 14:02:00.322550 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:00.322516 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-95b87756b-fm8b6"] Apr 16 14:02:00.326375 ip-10-0-130-98 kubenswrapper[2575]: W0416 14:02:00.326343 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68b39da3_d650_46a9_a633_504de8fa67d7.slice/crio-0803b3bcf313bd1d473e3e2f30545f56df6f87fa0c149a7623e85c5891a60932 WatchSource:0}: Error finding container 0803b3bcf313bd1d473e3e2f30545f56df6f87fa0c149a7623e85c5891a60932: Status 404 returned error can't find the container with id 0803b3bcf313bd1d473e3e2f30545f56df6f87fa0c149a7623e85c5891a60932 Apr 16 14:02:00.709390 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:00.709349 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-95b87756b-fm8b6" event={"ID":"68b39da3-d650-46a9-a633-504de8fa67d7","Type":"ContainerStarted","Data":"981e4cc8eec76a06ea1240bfcf904eb738cdc05727fa3ed891466aa5ee160e53"} Apr 16 14:02:00.709390 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:00.709392 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-95b87756b-fm8b6" event={"ID":"68b39da3-d650-46a9-a633-504de8fa67d7","Type":"ContainerStarted","Data":"0803b3bcf313bd1d473e3e2f30545f56df6f87fa0c149a7623e85c5891a60932"} Apr 16 14:02:00.743583 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:00.743530 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-95b87756b-fm8b6" podStartSLOduration=1.743517007 podStartE2EDuration="1.743517007s" podCreationTimestamp="2026-04-16 14:01:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:02:00.742222233 +0000 UTC m=+188.078322037" watchObservedRunningTime="2026-04-16 14:02:00.743517007 +0000 UTC m=+188.079616829" Apr 16 14:02:03.021445 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:03.021419 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-66db9bd8b5-42xrj" Apr 16 14:02:03.021814 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:03.021457 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-66db9bd8b5-42xrj" Apr 16 14:02:03.026043 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:03.026022 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-66db9bd8b5-42xrj" Apr 16 14:02:03.720679 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:03.720653 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-66db9bd8b5-42xrj" Apr 16 14:02:09.733964 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:09.733932 2575 generic.go:358] "Generic (PLEG): container finished" podID="6daf2345-5eea-4faa-9720-9390a947e6ce" containerID="f0978bb32a827a1a7aff8d1dd0c2f7e7ed7863ddb9e6b67c1a9ca43c32fff103" exitCode=0 Apr 16 14:02:09.734601 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:09.734010 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-t8smf" event={"ID":"6daf2345-5eea-4faa-9720-9390a947e6ce","Type":"ContainerDied","Data":"f0978bb32a827a1a7aff8d1dd0c2f7e7ed7863ddb9e6b67c1a9ca43c32fff103"} Apr 16 14:02:09.734601 ip-10-0-130-98 
kubenswrapper[2575]: I0416 14:02:09.734462 2575 scope.go:117] "RemoveContainer" containerID="f0978bb32a827a1a7aff8d1dd0c2f7e7ed7863ddb9e6b67c1a9ca43c32fff103" Apr 16 14:02:10.206817 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:10.206777 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-95b87756b-fm8b6" Apr 16 14:02:10.206817 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:10.206826 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-95b87756b-fm8b6" Apr 16 14:02:10.211495 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:10.211473 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-95b87756b-fm8b6" Apr 16 14:02:10.737865 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:10.737827 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-t8smf" event={"ID":"6daf2345-5eea-4faa-9720-9390a947e6ce","Type":"ContainerStarted","Data":"14488ff5899b7d4f1d0eb6a2330c9273d3de544f45d791c588cca2909cbd89ef"} Apr 16 14:02:10.738994 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:10.738971 2575 generic.go:358] "Generic (PLEG): container finished" podID="e8f6678c-3de8-4562-b077-c6a1db9f26a6" containerID="0c5913743743dc25120a65989c8dcffeccdc616affe182162dc550726bef6eae" exitCode=0 Apr 16 14:02:10.739114 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:10.739045 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-tjxm7" event={"ID":"e8f6678c-3de8-4562-b077-c6a1db9f26a6","Type":"ContainerDied","Data":"0c5913743743dc25120a65989c8dcffeccdc616affe182162dc550726bef6eae"} Apr 16 14:02:10.739401 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:10.739386 2575 scope.go:117] "RemoveContainer" containerID="0c5913743743dc25120a65989c8dcffeccdc616affe182162dc550726bef6eae" Apr 16 14:02:10.742996 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:10.742975 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-95b87756b-fm8b6" Apr 16 14:02:10.798608 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:10.798587 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66db9bd8b5-42xrj"] Apr 16 14:02:11.743182 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:11.743144 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-tjxm7" event={"ID":"e8f6678c-3de8-4562-b077-c6a1db9f26a6","Type":"ContainerStarted","Data":"e1847d3a97a900ee25f7efef813c44d4e9f48541487ed439f8f96bceb23096ba"} Apr 16 14:02:30.797183 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:30.797148 2575 generic.go:358] "Generic (PLEG): container finished" podID="3994a6a1-6bea-406a-a079-922dfadd77da" containerID="3abacd27a672719eaec93610b8cf2b6e3c4e129715209ddcfc63aa011b68ed65" exitCode=0 Apr 16 14:02:30.797614 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:30.797221 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-497ks" event={"ID":"3994a6a1-6bea-406a-a079-922dfadd77da","Type":"ContainerDied","Data":"3abacd27a672719eaec93610b8cf2b6e3c4e129715209ddcfc63aa011b68ed65"} Apr 16 14:02:30.797614 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:30.797571 2575 scope.go:117] "RemoveContainer" 
containerID="3abacd27a672719eaec93610b8cf2b6e3c4e129715209ddcfc63aa011b68ed65" Apr 16 14:02:31.802043 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:31.802007 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-497ks" event={"ID":"3994a6a1-6bea-406a-a079-922dfadd77da","Type":"ContainerStarted","Data":"74db8e8689374df328bc85fe1886f941ec53813721c8ae7c03eb13a9026cd692"} Apr 16 14:02:35.820708 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:35.820671 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-66db9bd8b5-42xrj" podUID="3c764976-5174-4a5d-ad3f-7c879e6b6ad6" containerName="console" containerID="cri-o://6d4813dda58dc8ec18535fd78c46ce45552381feec74940d065b396067d7cc53" gracePeriod=15 Apr 16 14:02:36.049794 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.049772 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66db9bd8b5-42xrj_3c764976-5174-4a5d-ad3f-7c879e6b6ad6/console/0.log" Apr 16 14:02:36.049906 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.049833 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66db9bd8b5-42xrj" Apr 16 14:02:36.129607 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.129572 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-console-serving-cert\") pod \"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\" (UID: \"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\") " Apr 16 14:02:36.129755 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.129640 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-oauth-serving-cert\") pod \"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\" (UID: \"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\") " Apr 16 14:02:36.129755 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.129663 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-console-oauth-config\") pod \"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\" (UID: \"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\") " Apr 16 14:02:36.129755 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.129701 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-service-ca\") pod \"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\" (UID: \"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\") " Apr 16 14:02:36.129755 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.129736 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-console-config\") pod \"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\" (UID: \"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\") " Apr 16 14:02:36.129895 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.129794 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47t9s\" (UniqueName: \"kubernetes.io/projected/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-kube-api-access-47t9s\") pod \"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\" (UID: 
\"3c764976-5174-4a5d-ad3f-7c879e6b6ad6\") " Apr 16 14:02:36.130149 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.130120 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "3c764976-5174-4a5d-ad3f-7c879e6b6ad6" (UID: "3c764976-5174-4a5d-ad3f-7c879e6b6ad6"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:02:36.130275 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.130116 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-service-ca" (OuterVolumeSpecName: "service-ca") pod "3c764976-5174-4a5d-ad3f-7c879e6b6ad6" (UID: "3c764976-5174-4a5d-ad3f-7c879e6b6ad6"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:02:36.130275 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.130151 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-console-config" (OuterVolumeSpecName: "console-config") pod "3c764976-5174-4a5d-ad3f-7c879e6b6ad6" (UID: "3c764976-5174-4a5d-ad3f-7c879e6b6ad6"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:02:36.131974 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.131951 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "3c764976-5174-4a5d-ad3f-7c879e6b6ad6" (UID: "3c764976-5174-4a5d-ad3f-7c879e6b6ad6"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:02:36.132072 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.131991 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-kube-api-access-47t9s" (OuterVolumeSpecName: "kube-api-access-47t9s") pod "3c764976-5174-4a5d-ad3f-7c879e6b6ad6" (UID: "3c764976-5174-4a5d-ad3f-7c879e6b6ad6"). InnerVolumeSpecName "kube-api-access-47t9s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:02:36.132072 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.132005 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "3c764976-5174-4a5d-ad3f-7c879e6b6ad6" (UID: "3c764976-5174-4a5d-ad3f-7c879e6b6ad6"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:02:36.230713 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.230679 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-oauth-serving-cert\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:02:36.230713 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.230711 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-console-oauth-config\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:02:36.230713 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.230722 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-service-ca\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:02:36.230936 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.230733 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-console-config\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:02:36.230936 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.230742 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-47t9s\" (UniqueName: \"kubernetes.io/projected/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-kube-api-access-47t9s\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:02:36.230936 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.230751 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c764976-5174-4a5d-ad3f-7c879e6b6ad6-console-serving-cert\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:02:36.818085 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.818058 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66db9bd8b5-42xrj_3c764976-5174-4a5d-ad3f-7c879e6b6ad6/console/0.log" Apr 16 14:02:36.818278 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.818124 2575 generic.go:358] "Generic (PLEG): container finished" podID="3c764976-5174-4a5d-ad3f-7c879e6b6ad6" containerID="6d4813dda58dc8ec18535fd78c46ce45552381feec74940d065b396067d7cc53" exitCode=2 Apr 16 14:02:36.818278 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.818194 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66db9bd8b5-42xrj" event={"ID":"3c764976-5174-4a5d-ad3f-7c879e6b6ad6","Type":"ContainerDied","Data":"6d4813dda58dc8ec18535fd78c46ce45552381feec74940d065b396067d7cc53"} Apr 16 14:02:36.818278 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.818230 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66db9bd8b5-42xrj" event={"ID":"3c764976-5174-4a5d-ad3f-7c879e6b6ad6","Type":"ContainerDied","Data":"7c19633d5d5a2a077f45d17fb44e12c55b657cb3e240019348ddc73cf1981edf"} Apr 16 14:02:36.818278 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.818251 2575 scope.go:117] "RemoveContainer" containerID="6d4813dda58dc8ec18535fd78c46ce45552381feec74940d065b396067d7cc53" Apr 16 14:02:36.818278 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.818255 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66db9bd8b5-42xrj" Apr 16 14:02:36.826117 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.826002 2575 scope.go:117] "RemoveContainer" containerID="6d4813dda58dc8ec18535fd78c46ce45552381feec74940d065b396067d7cc53" Apr 16 14:02:36.826378 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:02:36.826294 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d4813dda58dc8ec18535fd78c46ce45552381feec74940d065b396067d7cc53\": container with ID starting with 6d4813dda58dc8ec18535fd78c46ce45552381feec74940d065b396067d7cc53 not found: ID does not exist" containerID="6d4813dda58dc8ec18535fd78c46ce45552381feec74940d065b396067d7cc53" Apr 16 14:02:36.826378 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.826319 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d4813dda58dc8ec18535fd78c46ce45552381feec74940d065b396067d7cc53"} err="failed to get container status \"6d4813dda58dc8ec18535fd78c46ce45552381feec74940d065b396067d7cc53\": rpc error: code = NotFound desc = could not find container \"6d4813dda58dc8ec18535fd78c46ce45552381feec74940d065b396067d7cc53\": container with ID starting with 6d4813dda58dc8ec18535fd78c46ce45552381feec74940d065b396067d7cc53 not found: ID does not exist" Apr 16 14:02:36.837973 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.837948 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66db9bd8b5-42xrj"] Apr 16 14:02:36.844773 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:36.844749 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-66db9bd8b5-42xrj"] Apr 16 14:02:37.206108 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:02:37.206056 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c764976-5174-4a5d-ad3f-7c879e6b6ad6" path="/var/lib/kubelet/pods/3c764976-5174-4a5d-ad3f-7c879e6b6ad6/volumes" Apr 16 14:03:03.965344 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:03.965300 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs\") pod \"network-metrics-daemon-ckxrz\" (UID: \"0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e\") " pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 14:03:03.967619 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:03.967600 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e-metrics-certs\") pod \"network-metrics-daemon-ckxrz\" (UID: \"0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e\") " pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 14:03:04.162427 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.162395 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7795cbb858-7kxnr"] Apr 16 14:03:04.162682 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.162669 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c764976-5174-4a5d-ad3f-7c879e6b6ad6" containerName="console" Apr 16 14:03:04.162733 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.162684 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c764976-5174-4a5d-ad3f-7c879e6b6ad6" containerName="console" Apr 16 14:03:04.162733 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.162731 2575 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="3c764976-5174-4a5d-ad3f-7c879e6b6ad6" containerName="console" Apr 16 14:03:04.166313 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.166294 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7795cbb858-7kxnr" Apr 16 14:03:04.176510 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.176482 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7795cbb858-7kxnr"] Apr 16 14:03:04.206275 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.206245 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-b98gv\"" Apr 16 14:03:04.215170 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.215149 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ckxrz" Apr 16 14:03:04.267462 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.267213 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50b6aeeb-f938-4d47-9337-607bbcc882a6-trusted-ca-bundle\") pod \"console-7795cbb858-7kxnr\" (UID: \"50b6aeeb-f938-4d47-9337-607bbcc882a6\") " pod="openshift-console/console-7795cbb858-7kxnr" Apr 16 14:03:04.267462 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.267247 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/50b6aeeb-f938-4d47-9337-607bbcc882a6-service-ca\") pod \"console-7795cbb858-7kxnr\" (UID: \"50b6aeeb-f938-4d47-9337-607bbcc882a6\") " pod="openshift-console/console-7795cbb858-7kxnr" Apr 16 14:03:04.267462 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.267270 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/50b6aeeb-f938-4d47-9337-607bbcc882a6-console-serving-cert\") pod \"console-7795cbb858-7kxnr\" (UID: \"50b6aeeb-f938-4d47-9337-607bbcc882a6\") " pod="openshift-console/console-7795cbb858-7kxnr" Apr 16 14:03:04.267462 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.267300 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/50b6aeeb-f938-4d47-9337-607bbcc882a6-console-config\") pod \"console-7795cbb858-7kxnr\" (UID: \"50b6aeeb-f938-4d47-9337-607bbcc882a6\") " pod="openshift-console/console-7795cbb858-7kxnr" Apr 16 14:03:04.267462 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.267337 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/50b6aeeb-f938-4d47-9337-607bbcc882a6-console-oauth-config\") pod \"console-7795cbb858-7kxnr\" (UID: \"50b6aeeb-f938-4d47-9337-607bbcc882a6\") " pod="openshift-console/console-7795cbb858-7kxnr" Apr 16 14:03:04.267462 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.267359 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhw4v\" (UniqueName: \"kubernetes.io/projected/50b6aeeb-f938-4d47-9337-607bbcc882a6-kube-api-access-qhw4v\") pod \"console-7795cbb858-7kxnr\" (UID: \"50b6aeeb-f938-4d47-9337-607bbcc882a6\") " pod="openshift-console/console-7795cbb858-7kxnr" Apr 16 14:03:04.267462 ip-10-0-130-98 
kubenswrapper[2575]: I0416 14:03:04.267375 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/50b6aeeb-f938-4d47-9337-607bbcc882a6-oauth-serving-cert\") pod \"console-7795cbb858-7kxnr\" (UID: \"50b6aeeb-f938-4d47-9337-607bbcc882a6\") " pod="openshift-console/console-7795cbb858-7kxnr" Apr 16 14:03:04.333464 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.333431 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ckxrz"] Apr 16 14:03:04.336018 ip-10-0-130-98 kubenswrapper[2575]: W0416 14:03:04.335985 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b32b297_b5cc_4ce8_8fba_75b7d4a02b3e.slice/crio-eaf84d2abec3dd2949b07b14a1261bbd9254da048a8b511f9fc3bd4a4a4700af WatchSource:0}: Error finding container eaf84d2abec3dd2949b07b14a1261bbd9254da048a8b511f9fc3bd4a4a4700af: Status 404 returned error can't find the container with id eaf84d2abec3dd2949b07b14a1261bbd9254da048a8b511f9fc3bd4a4a4700af Apr 16 14:03:04.368399 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.368366 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/50b6aeeb-f938-4d47-9337-607bbcc882a6-console-config\") pod \"console-7795cbb858-7kxnr\" (UID: \"50b6aeeb-f938-4d47-9337-607bbcc882a6\") " pod="openshift-console/console-7795cbb858-7kxnr" Apr 16 14:03:04.368525 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.368419 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/50b6aeeb-f938-4d47-9337-607bbcc882a6-console-oauth-config\") pod \"console-7795cbb858-7kxnr\" (UID: \"50b6aeeb-f938-4d47-9337-607bbcc882a6\") " pod="openshift-console/console-7795cbb858-7kxnr" Apr 16 14:03:04.368525 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.368445 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhw4v\" (UniqueName: \"kubernetes.io/projected/50b6aeeb-f938-4d47-9337-607bbcc882a6-kube-api-access-qhw4v\") pod \"console-7795cbb858-7kxnr\" (UID: \"50b6aeeb-f938-4d47-9337-607bbcc882a6\") " pod="openshift-console/console-7795cbb858-7kxnr" Apr 16 14:03:04.368525 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.368467 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/50b6aeeb-f938-4d47-9337-607bbcc882a6-oauth-serving-cert\") pod \"console-7795cbb858-7kxnr\" (UID: \"50b6aeeb-f938-4d47-9337-607bbcc882a6\") " pod="openshift-console/console-7795cbb858-7kxnr" Apr 16 14:03:04.368525 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.368493 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50b6aeeb-f938-4d47-9337-607bbcc882a6-trusted-ca-bundle\") pod \"console-7795cbb858-7kxnr\" (UID: \"50b6aeeb-f938-4d47-9337-607bbcc882a6\") " pod="openshift-console/console-7795cbb858-7kxnr" Apr 16 14:03:04.368525 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.368516 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/50b6aeeb-f938-4d47-9337-607bbcc882a6-service-ca\") pod \"console-7795cbb858-7kxnr\" (UID: 
\"50b6aeeb-f938-4d47-9337-607bbcc882a6\") " pod="openshift-console/console-7795cbb858-7kxnr" Apr 16 14:03:04.368795 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.368563 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/50b6aeeb-f938-4d47-9337-607bbcc882a6-console-serving-cert\") pod \"console-7795cbb858-7kxnr\" (UID: \"50b6aeeb-f938-4d47-9337-607bbcc882a6\") " pod="openshift-console/console-7795cbb858-7kxnr" Apr 16 14:03:04.369195 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.369169 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/50b6aeeb-f938-4d47-9337-607bbcc882a6-service-ca\") pod \"console-7795cbb858-7kxnr\" (UID: \"50b6aeeb-f938-4d47-9337-607bbcc882a6\") " pod="openshift-console/console-7795cbb858-7kxnr" Apr 16 14:03:04.369291 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.369237 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/50b6aeeb-f938-4d47-9337-607bbcc882a6-console-config\") pod \"console-7795cbb858-7kxnr\" (UID: \"50b6aeeb-f938-4d47-9337-607bbcc882a6\") " pod="openshift-console/console-7795cbb858-7kxnr" Apr 16 14:03:04.369347 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.369329 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/50b6aeeb-f938-4d47-9337-607bbcc882a6-oauth-serving-cert\") pod \"console-7795cbb858-7kxnr\" (UID: \"50b6aeeb-f938-4d47-9337-607bbcc882a6\") " pod="openshift-console/console-7795cbb858-7kxnr" Apr 16 14:03:04.369382 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.369341 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50b6aeeb-f938-4d47-9337-607bbcc882a6-trusted-ca-bundle\") pod \"console-7795cbb858-7kxnr\" (UID: \"50b6aeeb-f938-4d47-9337-607bbcc882a6\") " pod="openshift-console/console-7795cbb858-7kxnr" Apr 16 14:03:04.371495 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.371471 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/50b6aeeb-f938-4d47-9337-607bbcc882a6-console-oauth-config\") pod \"console-7795cbb858-7kxnr\" (UID: \"50b6aeeb-f938-4d47-9337-607bbcc882a6\") " pod="openshift-console/console-7795cbb858-7kxnr" Apr 16 14:03:04.371572 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.371482 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/50b6aeeb-f938-4d47-9337-607bbcc882a6-console-serving-cert\") pod \"console-7795cbb858-7kxnr\" (UID: \"50b6aeeb-f938-4d47-9337-607bbcc882a6\") " pod="openshift-console/console-7795cbb858-7kxnr" Apr 16 14:03:04.376420 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.376398 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhw4v\" (UniqueName: \"kubernetes.io/projected/50b6aeeb-f938-4d47-9337-607bbcc882a6-kube-api-access-qhw4v\") pod \"console-7795cbb858-7kxnr\" (UID: \"50b6aeeb-f938-4d47-9337-607bbcc882a6\") " pod="openshift-console/console-7795cbb858-7kxnr" Apr 16 14:03:04.476688 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.476593 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7795cbb858-7kxnr" Apr 16 14:03:04.592313 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.592278 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7795cbb858-7kxnr"] Apr 16 14:03:04.595921 ip-10-0-130-98 kubenswrapper[2575]: W0416 14:03:04.595889 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50b6aeeb_f938_4d47_9337_607bbcc882a6.slice/crio-9a22656aa3b60848403e15190e37ee23752aa009ad5882aefde5b44263a1df0b WatchSource:0}: Error finding container 9a22656aa3b60848403e15190e37ee23752aa009ad5882aefde5b44263a1df0b: Status 404 returned error can't find the container with id 9a22656aa3b60848403e15190e37ee23752aa009ad5882aefde5b44263a1df0b Apr 16 14:03:04.898577 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.898536 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7795cbb858-7kxnr" event={"ID":"50b6aeeb-f938-4d47-9337-607bbcc882a6","Type":"ContainerStarted","Data":"47033342f90b0ebb05d2ffa934780c59b059e390bcaac2c11f1c39fcfaf42de5"} Apr 16 14:03:04.898577 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.898584 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7795cbb858-7kxnr" event={"ID":"50b6aeeb-f938-4d47-9337-607bbcc882a6","Type":"ContainerStarted","Data":"9a22656aa3b60848403e15190e37ee23752aa009ad5882aefde5b44263a1df0b"} Apr 16 14:03:04.899757 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.899734 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ckxrz" event={"ID":"0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e","Type":"ContainerStarted","Data":"eaf84d2abec3dd2949b07b14a1261bbd9254da048a8b511f9fc3bd4a4a4700af"} Apr 16 14:03:04.915205 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:04.915160 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7795cbb858-7kxnr" podStartSLOduration=0.915145693 podStartE2EDuration="915.145693ms" podCreationTimestamp="2026-04-16 14:03:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:03:04.914298659 +0000 UTC m=+252.250398485" watchObservedRunningTime="2026-04-16 14:03:04.915145693 +0000 UTC m=+252.251245517" Apr 16 14:03:05.906298 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:05.906260 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ckxrz" event={"ID":"0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e","Type":"ContainerStarted","Data":"dadf9c05b3cd727a604ad19e9b88ad795b6fba0264647f41d5ba3afc175a8a88"} Apr 16 14:03:05.906298 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:05.906299 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ckxrz" event={"ID":"0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e","Type":"ContainerStarted","Data":"4020eb89b3de9d744fd02d0041cf5c7dc85f4ac9759480406fbd0586265b7059"} Apr 16 14:03:05.923215 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:05.923030 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ckxrz" podStartSLOduration=251.919518616 podStartE2EDuration="4m12.923012625s" podCreationTimestamp="2026-04-16 13:58:53 +0000 UTC" firstStartedPulling="2026-04-16 14:03:04.337977561 +0000 UTC m=+251.674077361" lastFinishedPulling="2026-04-16 
14:03:05.34147157 +0000 UTC m=+252.677571370" observedRunningTime="2026-04-16 14:03:05.922236385 +0000 UTC m=+253.258336211" watchObservedRunningTime="2026-04-16 14:03:05.923012625 +0000 UTC m=+253.259112450" Apr 16 14:03:14.477902 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:14.477811 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7795cbb858-7kxnr" Apr 16 14:03:14.477902 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:14.477856 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7795cbb858-7kxnr" Apr 16 14:03:14.482278 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:14.482257 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7795cbb858-7kxnr" Apr 16 14:03:14.935252 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:14.935223 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7795cbb858-7kxnr" Apr 16 14:03:14.978451 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:14.978409 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-95b87756b-fm8b6"] Apr 16 14:03:39.999793 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:39.999728 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-95b87756b-fm8b6" podUID="68b39da3-d650-46a9-a633-504de8fa67d7" containerName="console" containerID="cri-o://981e4cc8eec76a06ea1240bfcf904eb738cdc05727fa3ed891466aa5ee160e53" gracePeriod=15 Apr 16 14:03:40.239533 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:40.239508 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-95b87756b-fm8b6_68b39da3-d650-46a9-a633-504de8fa67d7/console/0.log" Apr 16 14:03:40.239646 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:40.239588 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-95b87756b-fm8b6" Apr 16 14:03:40.359574 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:40.359543 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68b39da3-d650-46a9-a633-504de8fa67d7-trusted-ca-bundle\") pod \"68b39da3-d650-46a9-a633-504de8fa67d7\" (UID: \"68b39da3-d650-46a9-a633-504de8fa67d7\") " Apr 16 14:03:40.359721 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:40.359584 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/68b39da3-d650-46a9-a633-504de8fa67d7-console-config\") pod \"68b39da3-d650-46a9-a633-504de8fa67d7\" (UID: \"68b39da3-d650-46a9-a633-504de8fa67d7\") " Apr 16 14:03:40.359721 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:40.359599 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/68b39da3-d650-46a9-a633-504de8fa67d7-service-ca\") pod \"68b39da3-d650-46a9-a633-504de8fa67d7\" (UID: \"68b39da3-d650-46a9-a633-504de8fa67d7\") " Apr 16 14:03:40.359721 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:40.359623 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/68b39da3-d650-46a9-a633-504de8fa67d7-oauth-serving-cert\") pod \"68b39da3-d650-46a9-a633-504de8fa67d7\" (UID: \"68b39da3-d650-46a9-a633-504de8fa67d7\") " Apr 16 14:03:40.359721 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:40.359670 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/68b39da3-d650-46a9-a633-504de8fa67d7-console-serving-cert\") pod \"68b39da3-d650-46a9-a633-504de8fa67d7\" (UID: \"68b39da3-d650-46a9-a633-504de8fa67d7\") " Apr 16 14:03:40.359721 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:40.359701 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grmqm\" (UniqueName: \"kubernetes.io/projected/68b39da3-d650-46a9-a633-504de8fa67d7-kube-api-access-grmqm\") pod \"68b39da3-d650-46a9-a633-504de8fa67d7\" (UID: \"68b39da3-d650-46a9-a633-504de8fa67d7\") " Apr 16 14:03:40.359721 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:40.359718 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/68b39da3-d650-46a9-a633-504de8fa67d7-console-oauth-config\") pod \"68b39da3-d650-46a9-a633-504de8fa67d7\" (UID: \"68b39da3-d650-46a9-a633-504de8fa67d7\") " Apr 16 14:03:40.360032 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:40.359976 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68b39da3-d650-46a9-a633-504de8fa67d7-console-config" (OuterVolumeSpecName: "console-config") pod "68b39da3-d650-46a9-a633-504de8fa67d7" (UID: "68b39da3-d650-46a9-a633-504de8fa67d7"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:03:40.360087 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:40.360020 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68b39da3-d650-46a9-a633-504de8fa67d7-service-ca" (OuterVolumeSpecName: "service-ca") pod "68b39da3-d650-46a9-a633-504de8fa67d7" (UID: "68b39da3-d650-46a9-a633-504de8fa67d7"). 
InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:03:40.360217 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:40.360194 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68b39da3-d650-46a9-a633-504de8fa67d7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "68b39da3-d650-46a9-a633-504de8fa67d7" (UID: "68b39da3-d650-46a9-a633-504de8fa67d7"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:03:40.360278 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:40.360262 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68b39da3-d650-46a9-a633-504de8fa67d7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "68b39da3-d650-46a9-a633-504de8fa67d7" (UID: "68b39da3-d650-46a9-a633-504de8fa67d7"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:03:40.361931 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:40.361905 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68b39da3-d650-46a9-a633-504de8fa67d7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "68b39da3-d650-46a9-a633-504de8fa67d7" (UID: "68b39da3-d650-46a9-a633-504de8fa67d7"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:03:40.362026 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:40.361936 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68b39da3-d650-46a9-a633-504de8fa67d7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "68b39da3-d650-46a9-a633-504de8fa67d7" (UID: "68b39da3-d650-46a9-a633-504de8fa67d7"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:03:40.362026 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:40.362008 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68b39da3-d650-46a9-a633-504de8fa67d7-kube-api-access-grmqm" (OuterVolumeSpecName: "kube-api-access-grmqm") pod "68b39da3-d650-46a9-a633-504de8fa67d7" (UID: "68b39da3-d650-46a9-a633-504de8fa67d7"). InnerVolumeSpecName "kube-api-access-grmqm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:03:40.460598 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:40.460561 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68b39da3-d650-46a9-a633-504de8fa67d7-trusted-ca-bundle\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:03:40.460598 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:40.460592 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/68b39da3-d650-46a9-a633-504de8fa67d7-console-config\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:03:40.460598 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:40.460602 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/68b39da3-d650-46a9-a633-504de8fa67d7-service-ca\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:03:40.460811 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:40.460611 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/68b39da3-d650-46a9-a633-504de8fa67d7-oauth-serving-cert\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:03:40.460811 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:40.460620 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/68b39da3-d650-46a9-a633-504de8fa67d7-console-serving-cert\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:03:40.460811 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:40.460628 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-grmqm\" (UniqueName: \"kubernetes.io/projected/68b39da3-d650-46a9-a633-504de8fa67d7-kube-api-access-grmqm\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:03:40.460811 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:40.460653 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/68b39da3-d650-46a9-a633-504de8fa67d7-console-oauth-config\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:03:41.008519 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:41.008489 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-95b87756b-fm8b6_68b39da3-d650-46a9-a633-504de8fa67d7/console/0.log" Apr 16 14:03:41.008938 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:41.008531 2575 generic.go:358] "Generic (PLEG): container finished" podID="68b39da3-d650-46a9-a633-504de8fa67d7" containerID="981e4cc8eec76a06ea1240bfcf904eb738cdc05727fa3ed891466aa5ee160e53" exitCode=2 Apr 16 14:03:41.008938 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:41.008572 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-95b87756b-fm8b6" event={"ID":"68b39da3-d650-46a9-a633-504de8fa67d7","Type":"ContainerDied","Data":"981e4cc8eec76a06ea1240bfcf904eb738cdc05727fa3ed891466aa5ee160e53"} Apr 16 14:03:41.008938 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:41.008604 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-95b87756b-fm8b6" Apr 16 14:03:41.008938 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:41.008616 2575 scope.go:117] "RemoveContainer" containerID="981e4cc8eec76a06ea1240bfcf904eb738cdc05727fa3ed891466aa5ee160e53" Apr 16 14:03:41.008938 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:41.008604 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-95b87756b-fm8b6" event={"ID":"68b39da3-d650-46a9-a633-504de8fa67d7","Type":"ContainerDied","Data":"0803b3bcf313bd1d473e3e2f30545f56df6f87fa0c149a7623e85c5891a60932"} Apr 16 14:03:41.016828 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:41.016807 2575 scope.go:117] "RemoveContainer" containerID="981e4cc8eec76a06ea1240bfcf904eb738cdc05727fa3ed891466aa5ee160e53" Apr 16 14:03:41.017117 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:03:41.017076 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"981e4cc8eec76a06ea1240bfcf904eb738cdc05727fa3ed891466aa5ee160e53\": container with ID starting with 981e4cc8eec76a06ea1240bfcf904eb738cdc05727fa3ed891466aa5ee160e53 not found: ID does not exist" containerID="981e4cc8eec76a06ea1240bfcf904eb738cdc05727fa3ed891466aa5ee160e53" Apr 16 14:03:41.017192 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:41.017125 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"981e4cc8eec76a06ea1240bfcf904eb738cdc05727fa3ed891466aa5ee160e53"} err="failed to get container status \"981e4cc8eec76a06ea1240bfcf904eb738cdc05727fa3ed891466aa5ee160e53\": rpc error: code = NotFound desc = could not find container \"981e4cc8eec76a06ea1240bfcf904eb738cdc05727fa3ed891466aa5ee160e53\": container with ID starting with 981e4cc8eec76a06ea1240bfcf904eb738cdc05727fa3ed891466aa5ee160e53 not found: ID does not exist" Apr 16 14:03:41.036740 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:41.036713 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-95b87756b-fm8b6"] Apr 16 14:03:41.040132 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:41.040111 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-95b87756b-fm8b6"] Apr 16 14:03:41.206634 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:41.206592 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68b39da3-d650-46a9-a633-504de8fa67d7" path="/var/lib/kubelet/pods/68b39da3-d650-46a9-a633-504de8fa67d7/volumes" Apr 16 14:03:53.083881 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:53.083851 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cdqzf_a2aa2b87-716c-4b0a-abde-a69d0c373e83/ovn-acl-logging/0.log" Apr 16 14:03:53.084471 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:53.084448 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cdqzf_a2aa2b87-716c-4b0a-abde-a69d0c373e83/ovn-acl-logging/0.log" Apr 16 14:03:53.090247 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:03:53.090226 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 14:04:12.404148 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.404116 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6b8b4cb66b-bdt8v"] Apr 16 14:04:12.406681 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.404513 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="68b39da3-d650-46a9-a633-504de8fa67d7" containerName="console" Apr 16 14:04:12.406681 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.404530 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="68b39da3-d650-46a9-a633-504de8fa67d7" containerName="console" Apr 16 14:04:12.406681 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.404593 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="68b39da3-d650-46a9-a633-504de8fa67d7" containerName="console" Apr 16 14:04:12.407510 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.407491 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b8b4cb66b-bdt8v" Apr 16 14:04:12.415769 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.415747 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b8b4cb66b-bdt8v"] Apr 16 14:04:12.497996 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.497956 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-console-serving-cert\") pod \"console-6b8b4cb66b-bdt8v\" (UID: \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\") " pod="openshift-console/console-6b8b4cb66b-bdt8v" Apr 16 14:04:12.497996 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.497998 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-service-ca\") pod \"console-6b8b4cb66b-bdt8v\" (UID: \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\") " pod="openshift-console/console-6b8b4cb66b-bdt8v" Apr 16 14:04:12.498296 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.498015 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-oauth-serving-cert\") pod \"console-6b8b4cb66b-bdt8v\" (UID: \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\") " pod="openshift-console/console-6b8b4cb66b-bdt8v" Apr 16 14:04:12.498296 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.498084 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-trusted-ca-bundle\") pod \"console-6b8b4cb66b-bdt8v\" (UID: \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\") " pod="openshift-console/console-6b8b4cb66b-bdt8v" Apr 16 14:04:12.498296 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.498149 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-console-config\") pod \"console-6b8b4cb66b-bdt8v\" (UID: \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\") " pod="openshift-console/console-6b8b4cb66b-bdt8v" Apr 16 14:04:12.498296 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.498212 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-console-oauth-config\") pod \"console-6b8b4cb66b-bdt8v\" (UID: \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\") " pod="openshift-console/console-6b8b4cb66b-bdt8v" Apr 16 14:04:12.498296 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.498252 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7k4z\" (UniqueName: \"kubernetes.io/projected/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-kube-api-access-f7k4z\") pod \"console-6b8b4cb66b-bdt8v\" (UID: \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\") " pod="openshift-console/console-6b8b4cb66b-bdt8v" Apr 16 14:04:12.599418 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.599382 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-service-ca\") pod \"console-6b8b4cb66b-bdt8v\" (UID: \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\") " pod="openshift-console/console-6b8b4cb66b-bdt8v" Apr 16 14:04:12.599418 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.599417 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-oauth-serving-cert\") pod \"console-6b8b4cb66b-bdt8v\" (UID: \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\") " pod="openshift-console/console-6b8b4cb66b-bdt8v" Apr 16 14:04:12.599641 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.599439 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-trusted-ca-bundle\") pod \"console-6b8b4cb66b-bdt8v\" (UID: \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\") " pod="openshift-console/console-6b8b4cb66b-bdt8v" Apr 16 14:04:12.599641 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.599549 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-console-config\") pod \"console-6b8b4cb66b-bdt8v\" (UID: \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\") " pod="openshift-console/console-6b8b4cb66b-bdt8v" Apr 16 14:04:12.599641 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.599626 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-console-oauth-config\") pod \"console-6b8b4cb66b-bdt8v\" (UID: \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\") " pod="openshift-console/console-6b8b4cb66b-bdt8v" Apr 16 14:04:12.599855 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.599665 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7k4z\" (UniqueName: \"kubernetes.io/projected/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-kube-api-access-f7k4z\") pod \"console-6b8b4cb66b-bdt8v\" (UID: \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\") " pod="openshift-console/console-6b8b4cb66b-bdt8v" Apr 16 14:04:12.599855 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.599718 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-console-serving-cert\") pod \"console-6b8b4cb66b-bdt8v\" (UID: \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\") " pod="openshift-console/console-6b8b4cb66b-bdt8v" Apr 16 14:04:12.600268 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.600245 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-service-ca\") pod \"console-6b8b4cb66b-bdt8v\" (UID: 
\"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\") " pod="openshift-console/console-6b8b4cb66b-bdt8v" Apr 16 14:04:12.600354 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.600247 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-oauth-serving-cert\") pod \"console-6b8b4cb66b-bdt8v\" (UID: \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\") " pod="openshift-console/console-6b8b4cb66b-bdt8v" Apr 16 14:04:12.600354 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.600247 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-console-config\") pod \"console-6b8b4cb66b-bdt8v\" (UID: \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\") " pod="openshift-console/console-6b8b4cb66b-bdt8v" Apr 16 14:04:12.600437 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.600384 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-trusted-ca-bundle\") pod \"console-6b8b4cb66b-bdt8v\" (UID: \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\") " pod="openshift-console/console-6b8b4cb66b-bdt8v" Apr 16 14:04:12.602644 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.602626 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-console-serving-cert\") pod \"console-6b8b4cb66b-bdt8v\" (UID: \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\") " pod="openshift-console/console-6b8b4cb66b-bdt8v" Apr 16 14:04:12.602743 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.602721 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-console-oauth-config\") pod \"console-6b8b4cb66b-bdt8v\" (UID: \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\") " pod="openshift-console/console-6b8b4cb66b-bdt8v" Apr 16 14:04:12.611217 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.611187 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7k4z\" (UniqueName: \"kubernetes.io/projected/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-kube-api-access-f7k4z\") pod \"console-6b8b4cb66b-bdt8v\" (UID: \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\") " pod="openshift-console/console-6b8b4cb66b-bdt8v" Apr 16 14:04:12.717517 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.717424 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b8b4cb66b-bdt8v" Apr 16 14:04:12.837123 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.837072 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b8b4cb66b-bdt8v"] Apr 16 14:04:12.840980 ip-10-0-130-98 kubenswrapper[2575]: W0416 14:04:12.840949 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47dfbf16_5bbb_4abc_a41c_eafb4724d50a.slice/crio-022b09f7710683aeb47e10eec6d69251514d403a75fa595508fb6e4c6b40bd78 WatchSource:0}: Error finding container 022b09f7710683aeb47e10eec6d69251514d403a75fa595508fb6e4c6b40bd78: Status 404 returned error can't find the container with id 022b09f7710683aeb47e10eec6d69251514d403a75fa595508fb6e4c6b40bd78 Apr 16 14:04:12.842773 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:12.842756 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:04:13.096971 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:13.096884 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b8b4cb66b-bdt8v" event={"ID":"47dfbf16-5bbb-4abc-a41c-eafb4724d50a","Type":"ContainerStarted","Data":"a811a7a50326af29e603550a421ba676be79fe0005be99aa753dd40beb71aa98"} Apr 16 14:04:13.096971 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:13.096920 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b8b4cb66b-bdt8v" event={"ID":"47dfbf16-5bbb-4abc-a41c-eafb4724d50a","Type":"ContainerStarted","Data":"022b09f7710683aeb47e10eec6d69251514d403a75fa595508fb6e4c6b40bd78"} Apr 16 14:04:13.113385 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:13.113319 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6b8b4cb66b-bdt8v" podStartSLOduration=1.113301366 podStartE2EDuration="1.113301366s" podCreationTimestamp="2026-04-16 14:04:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:04:13.112635121 +0000 UTC m=+320.448734942" watchObservedRunningTime="2026-04-16 14:04:13.113301366 +0000 UTC m=+320.449401189" Apr 16 14:04:22.718546 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:22.718497 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6b8b4cb66b-bdt8v" Apr 16 14:04:22.718546 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:22.718555 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6b8b4cb66b-bdt8v" Apr 16 14:04:22.724123 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:22.724080 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6b8b4cb66b-bdt8v" Apr 16 14:04:23.127803 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:23.127773 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6b8b4cb66b-bdt8v" Apr 16 14:04:23.184048 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:23.184014 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7795cbb858-7kxnr"] Apr 16 14:04:48.209905 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:48.209807 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7795cbb858-7kxnr" podUID="50b6aeeb-f938-4d47-9337-607bbcc882a6" containerName="console" 
containerID="cri-o://47033342f90b0ebb05d2ffa934780c59b059e390bcaac2c11f1c39fcfaf42de5" gracePeriod=15 Apr 16 14:04:48.440993 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:48.440966 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7795cbb858-7kxnr_50b6aeeb-f938-4d47-9337-607bbcc882a6/console/0.log" Apr 16 14:04:48.441152 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:48.441040 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7795cbb858-7kxnr" Apr 16 14:04:48.486921 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:48.486842 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/50b6aeeb-f938-4d47-9337-607bbcc882a6-console-serving-cert\") pod \"50b6aeeb-f938-4d47-9337-607bbcc882a6\" (UID: \"50b6aeeb-f938-4d47-9337-607bbcc882a6\") " Apr 16 14:04:48.486921 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:48.486882 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/50b6aeeb-f938-4d47-9337-607bbcc882a6-console-config\") pod \"50b6aeeb-f938-4d47-9337-607bbcc882a6\" (UID: \"50b6aeeb-f938-4d47-9337-607bbcc882a6\") " Apr 16 14:04:48.486921 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:48.486918 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50b6aeeb-f938-4d47-9337-607bbcc882a6-trusted-ca-bundle\") pod \"50b6aeeb-f938-4d47-9337-607bbcc882a6\" (UID: \"50b6aeeb-f938-4d47-9337-607bbcc882a6\") " Apr 16 14:04:48.487216 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:48.486954 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhw4v\" (UniqueName: \"kubernetes.io/projected/50b6aeeb-f938-4d47-9337-607bbcc882a6-kube-api-access-qhw4v\") pod \"50b6aeeb-f938-4d47-9337-607bbcc882a6\" (UID: \"50b6aeeb-f938-4d47-9337-607bbcc882a6\") " Apr 16 14:04:48.487216 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:48.486973 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/50b6aeeb-f938-4d47-9337-607bbcc882a6-service-ca\") pod \"50b6aeeb-f938-4d47-9337-607bbcc882a6\" (UID: \"50b6aeeb-f938-4d47-9337-607bbcc882a6\") " Apr 16 14:04:48.487216 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:48.487135 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/50b6aeeb-f938-4d47-9337-607bbcc882a6-console-oauth-config\") pod \"50b6aeeb-f938-4d47-9337-607bbcc882a6\" (UID: \"50b6aeeb-f938-4d47-9337-607bbcc882a6\") " Apr 16 14:04:48.487367 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:48.487216 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/50b6aeeb-f938-4d47-9337-607bbcc882a6-oauth-serving-cert\") pod \"50b6aeeb-f938-4d47-9337-607bbcc882a6\" (UID: \"50b6aeeb-f938-4d47-9337-607bbcc882a6\") " Apr 16 14:04:48.487426 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:48.487395 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50b6aeeb-f938-4d47-9337-607bbcc882a6-service-ca" (OuterVolumeSpecName: "service-ca") pod "50b6aeeb-f938-4d47-9337-607bbcc882a6" (UID: "50b6aeeb-f938-4d47-9337-607bbcc882a6"). 
InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:04:48.487481 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:48.487421 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50b6aeeb-f938-4d47-9337-607bbcc882a6-console-config" (OuterVolumeSpecName: "console-config") pod "50b6aeeb-f938-4d47-9337-607bbcc882a6" (UID: "50b6aeeb-f938-4d47-9337-607bbcc882a6"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:04:48.487481 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:48.487428 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50b6aeeb-f938-4d47-9337-607bbcc882a6-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "50b6aeeb-f938-4d47-9337-607bbcc882a6" (UID: "50b6aeeb-f938-4d47-9337-607bbcc882a6"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:04:48.487573 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:48.487513 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/50b6aeeb-f938-4d47-9337-607bbcc882a6-service-ca\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:04:48.487573 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:48.487531 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/50b6aeeb-f938-4d47-9337-607bbcc882a6-console-config\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:04:48.487573 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:48.487546 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50b6aeeb-f938-4d47-9337-607bbcc882a6-trusted-ca-bundle\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:04:48.487752 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:48.487728 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50b6aeeb-f938-4d47-9337-607bbcc882a6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "50b6aeeb-f938-4d47-9337-607bbcc882a6" (UID: "50b6aeeb-f938-4d47-9337-607bbcc882a6"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:04:48.489060 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:48.489039 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b6aeeb-f938-4d47-9337-607bbcc882a6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "50b6aeeb-f938-4d47-9337-607bbcc882a6" (UID: "50b6aeeb-f938-4d47-9337-607bbcc882a6"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:04:48.489178 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:48.489152 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b6aeeb-f938-4d47-9337-607bbcc882a6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "50b6aeeb-f938-4d47-9337-607bbcc882a6" (UID: "50b6aeeb-f938-4d47-9337-607bbcc882a6"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:04:48.489260 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:48.489238 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50b6aeeb-f938-4d47-9337-607bbcc882a6-kube-api-access-qhw4v" (OuterVolumeSpecName: "kube-api-access-qhw4v") pod "50b6aeeb-f938-4d47-9337-607bbcc882a6" (UID: "50b6aeeb-f938-4d47-9337-607bbcc882a6"). InnerVolumeSpecName "kube-api-access-qhw4v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:04:48.588546 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:48.588506 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/50b6aeeb-f938-4d47-9337-607bbcc882a6-console-oauth-config\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:04:48.588546 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:48.588542 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/50b6aeeb-f938-4d47-9337-607bbcc882a6-oauth-serving-cert\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:04:48.588546 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:48.588552 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/50b6aeeb-f938-4d47-9337-607bbcc882a6-console-serving-cert\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:04:48.588780 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:48.588560 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qhw4v\" (UniqueName: \"kubernetes.io/projected/50b6aeeb-f938-4d47-9337-607bbcc882a6-kube-api-access-qhw4v\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:04:49.194214 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:49.194185 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7795cbb858-7kxnr_50b6aeeb-f938-4d47-9337-607bbcc882a6/console/0.log" Apr 16 14:04:49.194379 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:49.194224 2575 generic.go:358] "Generic (PLEG): container finished" podID="50b6aeeb-f938-4d47-9337-607bbcc882a6" containerID="47033342f90b0ebb05d2ffa934780c59b059e390bcaac2c11f1c39fcfaf42de5" exitCode=2 Apr 16 14:04:49.194379 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:49.194287 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7795cbb858-7kxnr" event={"ID":"50b6aeeb-f938-4d47-9337-607bbcc882a6","Type":"ContainerDied","Data":"47033342f90b0ebb05d2ffa934780c59b059e390bcaac2c11f1c39fcfaf42de5"} Apr 16 14:04:49.194379 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:49.194314 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7795cbb858-7kxnr" event={"ID":"50b6aeeb-f938-4d47-9337-607bbcc882a6","Type":"ContainerDied","Data":"9a22656aa3b60848403e15190e37ee23752aa009ad5882aefde5b44263a1df0b"} Apr 16 14:04:49.194379 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:49.194315 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7795cbb858-7kxnr" Apr 16 14:04:49.194379 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:49.194329 2575 scope.go:117] "RemoveContainer" containerID="47033342f90b0ebb05d2ffa934780c59b059e390bcaac2c11f1c39fcfaf42de5" Apr 16 14:04:49.202730 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:49.202604 2575 scope.go:117] "RemoveContainer" containerID="47033342f90b0ebb05d2ffa934780c59b059e390bcaac2c11f1c39fcfaf42de5" Apr 16 14:04:49.203007 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:04:49.202959 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47033342f90b0ebb05d2ffa934780c59b059e390bcaac2c11f1c39fcfaf42de5\": container with ID starting with 47033342f90b0ebb05d2ffa934780c59b059e390bcaac2c11f1c39fcfaf42de5 not found: ID does not exist" containerID="47033342f90b0ebb05d2ffa934780c59b059e390bcaac2c11f1c39fcfaf42de5" Apr 16 14:04:49.203085 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:49.203018 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47033342f90b0ebb05d2ffa934780c59b059e390bcaac2c11f1c39fcfaf42de5"} err="failed to get container status \"47033342f90b0ebb05d2ffa934780c59b059e390bcaac2c11f1c39fcfaf42de5\": rpc error: code = NotFound desc = could not find container \"47033342f90b0ebb05d2ffa934780c59b059e390bcaac2c11f1c39fcfaf42de5\": container with ID starting with 47033342f90b0ebb05d2ffa934780c59b059e390bcaac2c11f1c39fcfaf42de5 not found: ID does not exist" Apr 16 14:04:49.215017 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:49.214991 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7795cbb858-7kxnr"] Apr 16 14:04:49.217270 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:49.217185 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7795cbb858-7kxnr"] Apr 16 14:04:51.206009 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:04:51.205975 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50b6aeeb-f938-4d47-9337-607bbcc882a6" path="/var/lib/kubelet/pods/50b6aeeb-f938-4d47-9337-607bbcc882a6/volumes" Apr 16 14:05:49.368297 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:05:49.368258 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zx5bb"] Apr 16 14:05:49.368818 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:05:49.368660 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="50b6aeeb-f938-4d47-9337-607bbcc882a6" containerName="console" Apr 16 14:05:49.368818 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:05:49.368678 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="50b6aeeb-f938-4d47-9337-607bbcc882a6" containerName="console" Apr 16 14:05:49.368818 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:05:49.368755 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="50b6aeeb-f938-4d47-9337-607bbcc882a6" containerName="console" Apr 16 14:05:49.371866 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:05:49.371843 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zx5bb" Apr 16 14:05:49.374765 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:05:49.374737 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 14:05:49.374765 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:05:49.374756 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 14:05:49.374919 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:05:49.374780 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-qtbmc\"" Apr 16 14:05:49.374919 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:05:49.374780 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 14:05:49.381611 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:05:49.381585 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zx5bb"] Apr 16 14:05:49.466601 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:05:49.466565 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmzwd\" (UniqueName: \"kubernetes.io/projected/b68a07da-1219-4046-8712-62efd08c8d73-kube-api-access-bmzwd\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-zx5bb\" (UID: \"b68a07da-1219-4046-8712-62efd08c8d73\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zx5bb" Apr 16 14:05:49.466779 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:05:49.466632 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/b68a07da-1219-4046-8712-62efd08c8d73-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-zx5bb\" (UID: \"b68a07da-1219-4046-8712-62efd08c8d73\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zx5bb" Apr 16 14:05:49.567476 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:05:49.567426 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmzwd\" (UniqueName: \"kubernetes.io/projected/b68a07da-1219-4046-8712-62efd08c8d73-kube-api-access-bmzwd\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-zx5bb\" (UID: \"b68a07da-1219-4046-8712-62efd08c8d73\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zx5bb" Apr 16 14:05:49.567649 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:05:49.567501 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/b68a07da-1219-4046-8712-62efd08c8d73-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-zx5bb\" (UID: \"b68a07da-1219-4046-8712-62efd08c8d73\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zx5bb" Apr 16 14:05:49.569845 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:05:49.569823 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/b68a07da-1219-4046-8712-62efd08c8d73-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-zx5bb\" (UID: \"b68a07da-1219-4046-8712-62efd08c8d73\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zx5bb" Apr 16 14:05:49.575572 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:05:49.575543 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmzwd\" (UniqueName: \"kubernetes.io/projected/b68a07da-1219-4046-8712-62efd08c8d73-kube-api-access-bmzwd\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-zx5bb\" (UID: \"b68a07da-1219-4046-8712-62efd08c8d73\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zx5bb" Apr 16 14:05:49.682123 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:05:49.682002 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zx5bb" Apr 16 14:05:49.814668 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:05:49.814612 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zx5bb"] Apr 16 14:05:49.819262 ip-10-0-130-98 kubenswrapper[2575]: W0416 14:05:49.819224 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb68a07da_1219_4046_8712_62efd08c8d73.slice/crio-488d83f4056a4de69e1d08b73a7da590270e8f0ba1e0294db12fd7dc6b8a5f3c WatchSource:0}: Error finding container 488d83f4056a4de69e1d08b73a7da590270e8f0ba1e0294db12fd7dc6b8a5f3c: Status 404 returned error can't find the container with id 488d83f4056a4de69e1d08b73a7da590270e8f0ba1e0294db12fd7dc6b8a5f3c Apr 16 14:05:50.362689 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:05:50.362651 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zx5bb" event={"ID":"b68a07da-1219-4046-8712-62efd08c8d73","Type":"ContainerStarted","Data":"488d83f4056a4de69e1d08b73a7da590270e8f0ba1e0294db12fd7dc6b8a5f3c"} Apr 16 14:05:53.376202 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:05:53.376159 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zx5bb" event={"ID":"b68a07da-1219-4046-8712-62efd08c8d73","Type":"ContainerStarted","Data":"64381366527a5c94a41890fe27be1d60c48dd2ee8e5fb74d22540244214d4306"} Apr 16 14:05:53.376594 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:05:53.376290 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zx5bb" Apr 16 14:05:53.395555 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:05:53.395493 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zx5bb" podStartSLOduration=0.954563021 podStartE2EDuration="4.395478005s" podCreationTimestamp="2026-04-16 14:05:49 +0000 UTC" firstStartedPulling="2026-04-16 14:05:49.821407086 +0000 UTC m=+417.157506885" lastFinishedPulling="2026-04-16 14:05:53.262322055 +0000 UTC m=+420.598421869" observedRunningTime="2026-04-16 14:05:53.394413347 +0000 UTC m=+420.730513171" watchObservedRunningTime="2026-04-16 14:05:53.395478005 +0000 UTC m=+420.731577826" Apr 16 14:06:14.381225 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:06:14.381139 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zx5bb" Apr 16 14:07:00.593856 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:00.593825 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-9dpvt"] Apr 16 14:07:00.596773 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:00.596754 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-75d667c7c4-9dpvt" Apr 16 14:07:00.599817 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:00.599786 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 14:07:00.600247 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:00.599859 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 14:07:00.600247 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:00.599897 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-mrk4v\"" Apr 16 14:07:00.600247 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:00.599912 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 16 14:07:00.600398 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:00.600381 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-pb5xl"] Apr 16 14:07:00.607303 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:00.607277 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-pb5xl" Apr 16 14:07:00.609312 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:00.609290 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 14:07:00.609773 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:00.609751 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-ncxxx\"" Apr 16 14:07:00.610399 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:00.610372 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-9dpvt"] Apr 16 14:07:00.613933 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:00.613910 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-pb5xl"] Apr 16 14:07:00.726338 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:00.726296 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4de4d1dc-3ce1-4bd7-aefc-4216fdb5313d-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-pb5xl\" (UID: \"4de4d1dc-3ce1-4bd7-aefc-4216fdb5313d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-pb5xl" Apr 16 14:07:00.726519 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:00.726367 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7-cert\") pod \"kserve-controller-manager-75d667c7c4-9dpvt\" (UID: \"2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7\") " pod="kserve/kserve-controller-manager-75d667c7c4-9dpvt" Apr 16 14:07:00.726519 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:00.726412 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjdd8\" (UniqueName: \"kubernetes.io/projected/4de4d1dc-3ce1-4bd7-aefc-4216fdb5313d-kube-api-access-vjdd8\") pod \"llmisvc-controller-manager-68cc5db7c4-pb5xl\" (UID: \"4de4d1dc-3ce1-4bd7-aefc-4216fdb5313d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-pb5xl" Apr 16 14:07:00.726519 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:00.726457 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz8gl\" (UniqueName: \"kubernetes.io/projected/2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7-kube-api-access-fz8gl\") pod \"kserve-controller-manager-75d667c7c4-9dpvt\" (UID: \"2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7\") " pod="kserve/kserve-controller-manager-75d667c7c4-9dpvt" Apr 16 14:07:00.827108 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:00.827056 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7-cert\") pod \"kserve-controller-manager-75d667c7c4-9dpvt\" (UID: \"2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7\") " pod="kserve/kserve-controller-manager-75d667c7c4-9dpvt" Apr 16 14:07:00.827257 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:00.827115 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjdd8\" (UniqueName: \"kubernetes.io/projected/4de4d1dc-3ce1-4bd7-aefc-4216fdb5313d-kube-api-access-vjdd8\") pod \"llmisvc-controller-manager-68cc5db7c4-pb5xl\" (UID: \"4de4d1dc-3ce1-4bd7-aefc-4216fdb5313d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-pb5xl" Apr 16 14:07:00.827257 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:00.827138 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fz8gl\" (UniqueName: \"kubernetes.io/projected/2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7-kube-api-access-fz8gl\") pod \"kserve-controller-manager-75d667c7c4-9dpvt\" (UID: \"2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7\") " pod="kserve/kserve-controller-manager-75d667c7c4-9dpvt" Apr 16 14:07:00.827257 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:00.827180 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4de4d1dc-3ce1-4bd7-aefc-4216fdb5313d-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-pb5xl\" (UID: \"4de4d1dc-3ce1-4bd7-aefc-4216fdb5313d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-pb5xl" Apr 16 14:07:00.829749 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:00.829721 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4de4d1dc-3ce1-4bd7-aefc-4216fdb5313d-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-pb5xl\" (UID: \"4de4d1dc-3ce1-4bd7-aefc-4216fdb5313d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-pb5xl" Apr 16 14:07:00.829862 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:00.829721 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7-cert\") pod \"kserve-controller-manager-75d667c7c4-9dpvt\" (UID: \"2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7\") " pod="kserve/kserve-controller-manager-75d667c7c4-9dpvt" Apr 16 14:07:00.836864 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:00.836839 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjdd8\" (UniqueName: \"kubernetes.io/projected/4de4d1dc-3ce1-4bd7-aefc-4216fdb5313d-kube-api-access-vjdd8\") pod \"llmisvc-controller-manager-68cc5db7c4-pb5xl\" (UID: \"4de4d1dc-3ce1-4bd7-aefc-4216fdb5313d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-pb5xl" Apr 16 14:07:00.836864 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:00.836854 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz8gl\" (UniqueName: 
\"kubernetes.io/projected/2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7-kube-api-access-fz8gl\") pod \"kserve-controller-manager-75d667c7c4-9dpvt\" (UID: \"2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7\") " pod="kserve/kserve-controller-manager-75d667c7c4-9dpvt" Apr 16 14:07:00.916491 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:00.916456 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-75d667c7c4-9dpvt" Apr 16 14:07:00.924250 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:00.924223 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-pb5xl" Apr 16 14:07:01.041977 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:01.041941 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-9dpvt"] Apr 16 14:07:01.045632 ip-10-0-130-98 kubenswrapper[2575]: W0416 14:07:01.045601 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cdb5b3f_ade5_4a96_b724_1cc0ea66d3d7.slice/crio-08bea327b75d247bcb2ea8bc284879e8d15495bb1d7999d04bc3b198a92a9d0d WatchSource:0}: Error finding container 08bea327b75d247bcb2ea8bc284879e8d15495bb1d7999d04bc3b198a92a9d0d: Status 404 returned error can't find the container with id 08bea327b75d247bcb2ea8bc284879e8d15495bb1d7999d04bc3b198a92a9d0d Apr 16 14:07:01.066263 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:01.066238 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-pb5xl"] Apr 16 14:07:01.068674 ip-10-0-130-98 kubenswrapper[2575]: W0416 14:07:01.068649 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4de4d1dc_3ce1_4bd7_aefc_4216fdb5313d.slice/crio-a2e7cf79a02f0693f76c5533071173047dcc505543000f4143fc7077325714d4 WatchSource:0}: Error finding container a2e7cf79a02f0693f76c5533071173047dcc505543000f4143fc7077325714d4: Status 404 returned error can't find the container with id a2e7cf79a02f0693f76c5533071173047dcc505543000f4143fc7077325714d4 Apr 16 14:07:01.559168 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:01.559131 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-pb5xl" event={"ID":"4de4d1dc-3ce1-4bd7-aefc-4216fdb5313d","Type":"ContainerStarted","Data":"a2e7cf79a02f0693f76c5533071173047dcc505543000f4143fc7077325714d4"} Apr 16 14:07:01.560026 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:01.560003 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-75d667c7c4-9dpvt" event={"ID":"2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7","Type":"ContainerStarted","Data":"08bea327b75d247bcb2ea8bc284879e8d15495bb1d7999d04bc3b198a92a9d0d"} Apr 16 14:07:05.575055 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:05.575017 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-pb5xl" event={"ID":"4de4d1dc-3ce1-4bd7-aefc-4216fdb5313d","Type":"ContainerStarted","Data":"65c636211f776b9276f3792bc1663ad849ffceacde4752eba02826b6120dda34"} Apr 16 14:07:05.575495 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:05.575118 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-pb5xl" Apr 16 14:07:05.576331 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:05.576310 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve/kserve-controller-manager-75d667c7c4-9dpvt" event={"ID":"2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7","Type":"ContainerStarted","Data":"22c38b29c2d29e2345c629b0f5fd9a2657149657f84205585221d3ac0107106d"} Apr 16 14:07:05.576425 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:05.576406 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-75d667c7c4-9dpvt" Apr 16 14:07:05.591304 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:05.591265 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-pb5xl" podStartSLOduration=2.175358562 podStartE2EDuration="5.591254612s" podCreationTimestamp="2026-04-16 14:07:00 +0000 UTC" firstStartedPulling="2026-04-16 14:07:01.069903861 +0000 UTC m=+488.406003660" lastFinishedPulling="2026-04-16 14:07:04.485799905 +0000 UTC m=+491.821899710" observedRunningTime="2026-04-16 14:07:05.590230729 +0000 UTC m=+492.926330553" watchObservedRunningTime="2026-04-16 14:07:05.591254612 +0000 UTC m=+492.927354434" Apr 16 14:07:05.605900 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:05.605860 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-75d667c7c4-9dpvt" podStartSLOduration=2.130237542 podStartE2EDuration="5.605847289s" podCreationTimestamp="2026-04-16 14:07:00 +0000 UTC" firstStartedPulling="2026-04-16 14:07:01.046998398 +0000 UTC m=+488.383098198" lastFinishedPulling="2026-04-16 14:07:04.52260814 +0000 UTC m=+491.858707945" observedRunningTime="2026-04-16 14:07:05.604956469 +0000 UTC m=+492.941056292" watchObservedRunningTime="2026-04-16 14:07:05.605847289 +0000 UTC m=+492.941947110" Apr 16 14:07:36.580782 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:36.580749 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-pb5xl" Apr 16 14:07:36.583802 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:36.583781 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-75d667c7c4-9dpvt" Apr 16 14:07:37.952699 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:37.952658 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-9dpvt"] Apr 16 14:07:37.953132 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:37.952858 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-75d667c7c4-9dpvt" podUID="2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7" containerName="manager" containerID="cri-o://22c38b29c2d29e2345c629b0f5fd9a2657149657f84205585221d3ac0107106d" gracePeriod=10 Apr 16 14:07:37.979293 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:37.979267 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-knc9v"] Apr 16 14:07:37.982437 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:37.982418 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-75d667c7c4-knc9v" Apr 16 14:07:37.989275 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:37.989240 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-knc9v"] Apr 16 14:07:38.029329 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:38.029299 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctkl4\" (UniqueName: \"kubernetes.io/projected/0aac9d5c-cbae-44f7-85d0-f73a2334d3ce-kube-api-access-ctkl4\") pod \"kserve-controller-manager-75d667c7c4-knc9v\" (UID: \"0aac9d5c-cbae-44f7-85d0-f73a2334d3ce\") " pod="kserve/kserve-controller-manager-75d667c7c4-knc9v" Apr 16 14:07:38.029448 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:38.029349 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0aac9d5c-cbae-44f7-85d0-f73a2334d3ce-cert\") pod \"kserve-controller-manager-75d667c7c4-knc9v\" (UID: \"0aac9d5c-cbae-44f7-85d0-f73a2334d3ce\") " pod="kserve/kserve-controller-manager-75d667c7c4-knc9v" Apr 16 14:07:38.129871 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:38.129830 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctkl4\" (UniqueName: \"kubernetes.io/projected/0aac9d5c-cbae-44f7-85d0-f73a2334d3ce-kube-api-access-ctkl4\") pod \"kserve-controller-manager-75d667c7c4-knc9v\" (UID: \"0aac9d5c-cbae-44f7-85d0-f73a2334d3ce\") " pod="kserve/kserve-controller-manager-75d667c7c4-knc9v" Apr 16 14:07:38.130001 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:38.129927 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0aac9d5c-cbae-44f7-85d0-f73a2334d3ce-cert\") pod \"kserve-controller-manager-75d667c7c4-knc9v\" (UID: \"0aac9d5c-cbae-44f7-85d0-f73a2334d3ce\") " pod="kserve/kserve-controller-manager-75d667c7c4-knc9v" Apr 16 14:07:38.132652 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:38.132625 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0aac9d5c-cbae-44f7-85d0-f73a2334d3ce-cert\") pod \"kserve-controller-manager-75d667c7c4-knc9v\" (UID: \"0aac9d5c-cbae-44f7-85d0-f73a2334d3ce\") " pod="kserve/kserve-controller-manager-75d667c7c4-knc9v" Apr 16 14:07:38.138427 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:38.138400 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctkl4\" (UniqueName: \"kubernetes.io/projected/0aac9d5c-cbae-44f7-85d0-f73a2334d3ce-kube-api-access-ctkl4\") pod \"kserve-controller-manager-75d667c7c4-knc9v\" (UID: \"0aac9d5c-cbae-44f7-85d0-f73a2334d3ce\") " pod="kserve/kserve-controller-manager-75d667c7c4-knc9v" Apr 16 14:07:38.185889 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:38.185868 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-75d667c7c4-9dpvt" Apr 16 14:07:38.230472 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:38.230385 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz8gl\" (UniqueName: \"kubernetes.io/projected/2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7-kube-api-access-fz8gl\") pod \"2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7\" (UID: \"2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7\") " Apr 16 14:07:38.230472 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:38.230450 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7-cert\") pod \"2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7\" (UID: \"2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7\") " Apr 16 14:07:38.232437 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:38.232411 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7-kube-api-access-fz8gl" (OuterVolumeSpecName: "kube-api-access-fz8gl") pod "2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7" (UID: "2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7"). InnerVolumeSpecName "kube-api-access-fz8gl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:07:38.232607 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:38.232594 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7-cert" (OuterVolumeSpecName: "cert") pod "2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7" (UID: "2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:07:38.331000 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:38.330968 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fz8gl\" (UniqueName: \"kubernetes.io/projected/2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7-kube-api-access-fz8gl\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:07:38.331000 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:38.330997 2575 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7-cert\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:07:38.333924 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:38.333900 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-75d667c7c4-knc9v" Apr 16 14:07:38.448154 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:38.447983 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-knc9v"] Apr 16 14:07:38.450757 ip-10-0-130-98 kubenswrapper[2575]: W0416 14:07:38.450731 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0aac9d5c_cbae_44f7_85d0_f73a2334d3ce.slice/crio-8af03ff24d1a599461acf8f3f90465365429b86bd6878aaeae770218b38eaf1b WatchSource:0}: Error finding container 8af03ff24d1a599461acf8f3f90465365429b86bd6878aaeae770218b38eaf1b: Status 404 returned error can't find the container with id 8af03ff24d1a599461acf8f3f90465365429b86bd6878aaeae770218b38eaf1b Apr 16 14:07:38.674739 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:38.674700 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-75d667c7c4-knc9v" event={"ID":"0aac9d5c-cbae-44f7-85d0-f73a2334d3ce","Type":"ContainerStarted","Data":"8af03ff24d1a599461acf8f3f90465365429b86bd6878aaeae770218b38eaf1b"} Apr 16 14:07:38.675821 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:38.675793 2575 generic.go:358] "Generic (PLEG): container finished" podID="2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7" containerID="22c38b29c2d29e2345c629b0f5fd9a2657149657f84205585221d3ac0107106d" exitCode=0 Apr 16 14:07:38.676367 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:38.675830 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-75d667c7c4-9dpvt" event={"ID":"2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7","Type":"ContainerDied","Data":"22c38b29c2d29e2345c629b0f5fd9a2657149657f84205585221d3ac0107106d"} Apr 16 14:07:38.676367 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:38.675853 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-75d667c7c4-9dpvt" event={"ID":"2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7","Type":"ContainerDied","Data":"08bea327b75d247bcb2ea8bc284879e8d15495bb1d7999d04bc3b198a92a9d0d"} Apr 16 14:07:38.676367 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:38.675860 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-75d667c7c4-9dpvt" Apr 16 14:07:38.676367 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:38.675867 2575 scope.go:117] "RemoveContainer" containerID="22c38b29c2d29e2345c629b0f5fd9a2657149657f84205585221d3ac0107106d" Apr 16 14:07:38.683718 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:38.683694 2575 scope.go:117] "RemoveContainer" containerID="22c38b29c2d29e2345c629b0f5fd9a2657149657f84205585221d3ac0107106d" Apr 16 14:07:38.684002 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:07:38.683974 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22c38b29c2d29e2345c629b0f5fd9a2657149657f84205585221d3ac0107106d\": container with ID starting with 22c38b29c2d29e2345c629b0f5fd9a2657149657f84205585221d3ac0107106d not found: ID does not exist" containerID="22c38b29c2d29e2345c629b0f5fd9a2657149657f84205585221d3ac0107106d" Apr 16 14:07:38.684056 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:38.684014 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22c38b29c2d29e2345c629b0f5fd9a2657149657f84205585221d3ac0107106d"} err="failed to get container status \"22c38b29c2d29e2345c629b0f5fd9a2657149657f84205585221d3ac0107106d\": rpc error: code = NotFound desc = could not find container \"22c38b29c2d29e2345c629b0f5fd9a2657149657f84205585221d3ac0107106d\": container with ID starting with 22c38b29c2d29e2345c629b0f5fd9a2657149657f84205585221d3ac0107106d not found: ID does not exist" Apr 16 14:07:38.695600 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:38.695576 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-9dpvt"] Apr 16 14:07:38.698665 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:38.698644 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-9dpvt"] Apr 16 14:07:39.210747 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:39.207537 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7" path="/var/lib/kubelet/pods/2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7/volumes" Apr 16 14:07:40.684089 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:40.683999 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-75d667c7c4-knc9v" event={"ID":"0aac9d5c-cbae-44f7-85d0-f73a2334d3ce","Type":"ContainerStarted","Data":"e84675bb25e4acf52e1212ea76374250db2b278282bdb608ef7081061e94d032"} Apr 16 14:07:40.684480 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:40.684130 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-75d667c7c4-knc9v" Apr 16 14:07:40.700811 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:07:40.700765 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-75d667c7c4-knc9v" podStartSLOduration=1.749743775 podStartE2EDuration="3.7007518s" podCreationTimestamp="2026-04-16 14:07:37 +0000 UTC" firstStartedPulling="2026-04-16 14:07:38.451918319 +0000 UTC m=+525.788018118" lastFinishedPulling="2026-04-16 14:07:40.402926335 +0000 UTC m=+527.739026143" observedRunningTime="2026-04-16 14:07:40.699262216 +0000 UTC m=+528.035362038" watchObservedRunningTime="2026-04-16 14:07:40.7007518 +0000 UTC m=+528.036851647" Apr 16 14:08:08.315052 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.315020 2575 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-console/console-55848895b-d8sfc"] Apr 16 14:08:08.315522 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.315345 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7" containerName="manager" Apr 16 14:08:08.315522 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.315360 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7" containerName="manager" Apr 16 14:08:08.315522 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.315416 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="2cdb5b3f-ade5-4a96-b724-1cc0ea66d3d7" containerName="manager" Apr 16 14:08:08.319044 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.319024 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55848895b-d8sfc" Apr 16 14:08:08.330026 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.330004 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55848895b-d8sfc"] Apr 16 14:08:08.360398 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.360373 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e2a7dd1-3065-42dd-af46-f0db44444e20-console-serving-cert\") pod \"console-55848895b-d8sfc\" (UID: \"8e2a7dd1-3065-42dd-af46-f0db44444e20\") " pod="openshift-console/console-55848895b-d8sfc" Apr 16 14:08:08.360525 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.360407 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e2a7dd1-3065-42dd-af46-f0db44444e20-service-ca\") pod \"console-55848895b-d8sfc\" (UID: \"8e2a7dd1-3065-42dd-af46-f0db44444e20\") " pod="openshift-console/console-55848895b-d8sfc" Apr 16 14:08:08.360525 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.360435 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn996\" (UniqueName: \"kubernetes.io/projected/8e2a7dd1-3065-42dd-af46-f0db44444e20-kube-api-access-bn996\") pod \"console-55848895b-d8sfc\" (UID: \"8e2a7dd1-3065-42dd-af46-f0db44444e20\") " pod="openshift-console/console-55848895b-d8sfc" Apr 16 14:08:08.360623 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.360538 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8e2a7dd1-3065-42dd-af46-f0db44444e20-console-oauth-config\") pod \"console-55848895b-d8sfc\" (UID: \"8e2a7dd1-3065-42dd-af46-f0db44444e20\") " pod="openshift-console/console-55848895b-d8sfc" Apr 16 14:08:08.360623 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.360573 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8e2a7dd1-3065-42dd-af46-f0db44444e20-oauth-serving-cert\") pod \"console-55848895b-d8sfc\" (UID: \"8e2a7dd1-3065-42dd-af46-f0db44444e20\") " pod="openshift-console/console-55848895b-d8sfc" Apr 16 14:08:08.360623 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.360591 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8e2a7dd1-3065-42dd-af46-f0db44444e20-console-config\") pod 
\"console-55848895b-d8sfc\" (UID: \"8e2a7dd1-3065-42dd-af46-f0db44444e20\") " pod="openshift-console/console-55848895b-d8sfc" Apr 16 14:08:08.360731 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.360656 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e2a7dd1-3065-42dd-af46-f0db44444e20-trusted-ca-bundle\") pod \"console-55848895b-d8sfc\" (UID: \"8e2a7dd1-3065-42dd-af46-f0db44444e20\") " pod="openshift-console/console-55848895b-d8sfc" Apr 16 14:08:08.461975 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.461941 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e2a7dd1-3065-42dd-af46-f0db44444e20-console-serving-cert\") pod \"console-55848895b-d8sfc\" (UID: \"8e2a7dd1-3065-42dd-af46-f0db44444e20\") " pod="openshift-console/console-55848895b-d8sfc" Apr 16 14:08:08.461975 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.461976 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e2a7dd1-3065-42dd-af46-f0db44444e20-service-ca\") pod \"console-55848895b-d8sfc\" (UID: \"8e2a7dd1-3065-42dd-af46-f0db44444e20\") " pod="openshift-console/console-55848895b-d8sfc" Apr 16 14:08:08.462261 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.461995 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bn996\" (UniqueName: \"kubernetes.io/projected/8e2a7dd1-3065-42dd-af46-f0db44444e20-kube-api-access-bn996\") pod \"console-55848895b-d8sfc\" (UID: \"8e2a7dd1-3065-42dd-af46-f0db44444e20\") " pod="openshift-console/console-55848895b-d8sfc" Apr 16 14:08:08.462261 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.462035 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8e2a7dd1-3065-42dd-af46-f0db44444e20-console-oauth-config\") pod \"console-55848895b-d8sfc\" (UID: \"8e2a7dd1-3065-42dd-af46-f0db44444e20\") " pod="openshift-console/console-55848895b-d8sfc" Apr 16 14:08:08.462261 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.462053 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8e2a7dd1-3065-42dd-af46-f0db44444e20-oauth-serving-cert\") pod \"console-55848895b-d8sfc\" (UID: \"8e2a7dd1-3065-42dd-af46-f0db44444e20\") " pod="openshift-console/console-55848895b-d8sfc" Apr 16 14:08:08.462261 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.462071 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8e2a7dd1-3065-42dd-af46-f0db44444e20-console-config\") pod \"console-55848895b-d8sfc\" (UID: \"8e2a7dd1-3065-42dd-af46-f0db44444e20\") " pod="openshift-console/console-55848895b-d8sfc" Apr 16 14:08:08.462261 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.462120 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e2a7dd1-3065-42dd-af46-f0db44444e20-trusted-ca-bundle\") pod \"console-55848895b-d8sfc\" (UID: \"8e2a7dd1-3065-42dd-af46-f0db44444e20\") " pod="openshift-console/console-55848895b-d8sfc" Apr 16 14:08:08.462820 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.462792 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e2a7dd1-3065-42dd-af46-f0db44444e20-service-ca\") pod \"console-55848895b-d8sfc\" (UID: \"8e2a7dd1-3065-42dd-af46-f0db44444e20\") " pod="openshift-console/console-55848895b-d8sfc" Apr 16 14:08:08.462914 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.462792 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8e2a7dd1-3065-42dd-af46-f0db44444e20-oauth-serving-cert\") pod \"console-55848895b-d8sfc\" (UID: \"8e2a7dd1-3065-42dd-af46-f0db44444e20\") " pod="openshift-console/console-55848895b-d8sfc" Apr 16 14:08:08.462914 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.462885 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8e2a7dd1-3065-42dd-af46-f0db44444e20-console-config\") pod \"console-55848895b-d8sfc\" (UID: \"8e2a7dd1-3065-42dd-af46-f0db44444e20\") " pod="openshift-console/console-55848895b-d8sfc" Apr 16 14:08:08.462996 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.462984 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e2a7dd1-3065-42dd-af46-f0db44444e20-trusted-ca-bundle\") pod \"console-55848895b-d8sfc\" (UID: \"8e2a7dd1-3065-42dd-af46-f0db44444e20\") " pod="openshift-console/console-55848895b-d8sfc" Apr 16 14:08:08.464424 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.464400 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8e2a7dd1-3065-42dd-af46-f0db44444e20-console-oauth-config\") pod \"console-55848895b-d8sfc\" (UID: \"8e2a7dd1-3065-42dd-af46-f0db44444e20\") " pod="openshift-console/console-55848895b-d8sfc" Apr 16 14:08:08.464571 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.464556 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e2a7dd1-3065-42dd-af46-f0db44444e20-console-serving-cert\") pod \"console-55848895b-d8sfc\" (UID: \"8e2a7dd1-3065-42dd-af46-f0db44444e20\") " pod="openshift-console/console-55848895b-d8sfc" Apr 16 14:08:08.469457 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.469434 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn996\" (UniqueName: \"kubernetes.io/projected/8e2a7dd1-3065-42dd-af46-f0db44444e20-kube-api-access-bn996\") pod \"console-55848895b-d8sfc\" (UID: \"8e2a7dd1-3065-42dd-af46-f0db44444e20\") " pod="openshift-console/console-55848895b-d8sfc" Apr 16 14:08:08.628045 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.627991 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55848895b-d8sfc" Apr 16 14:08:08.747592 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.747568 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55848895b-d8sfc"] Apr 16 14:08:08.749441 ip-10-0-130-98 kubenswrapper[2575]: W0416 14:08:08.749410 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e2a7dd1_3065_42dd_af46_f0db44444e20.slice/crio-128cf74f3d9fde138415692274b4f0348e1b7c5a5d1872a5dd0fc33d4bc7ddda WatchSource:0}: Error finding container 128cf74f3d9fde138415692274b4f0348e1b7c5a5d1872a5dd0fc33d4bc7ddda: Status 404 returned error can't find the container with id 128cf74f3d9fde138415692274b4f0348e1b7c5a5d1872a5dd0fc33d4bc7ddda Apr 16 14:08:08.769030 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:08.769006 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55848895b-d8sfc" event={"ID":"8e2a7dd1-3065-42dd-af46-f0db44444e20","Type":"ContainerStarted","Data":"128cf74f3d9fde138415692274b4f0348e1b7c5a5d1872a5dd0fc33d4bc7ddda"} Apr 16 14:08:09.773196 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:09.773163 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55848895b-d8sfc" event={"ID":"8e2a7dd1-3065-42dd-af46-f0db44444e20","Type":"ContainerStarted","Data":"c6775de9365b1d51ece63591df84e33d687424ecca0f3e178779da162288eaeb"} Apr 16 14:08:09.790186 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:09.790126 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-55848895b-d8sfc" podStartSLOduration=1.7900897439999999 podStartE2EDuration="1.790089744s" podCreationTimestamp="2026-04-16 14:08:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:08:09.788429647 +0000 UTC m=+557.124529480" watchObservedRunningTime="2026-04-16 14:08:09.790089744 +0000 UTC m=+557.126189565" Apr 16 14:08:11.692411 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:11.692374 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-75d667c7c4-knc9v" Apr 16 14:08:18.629136 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:18.629079 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-55848895b-d8sfc" Apr 16 14:08:18.629608 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:18.629161 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-55848895b-d8sfc" Apr 16 14:08:18.633866 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:18.633846 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-55848895b-d8sfc" Apr 16 14:08:18.809286 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:18.809254 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-55848895b-d8sfc" Apr 16 14:08:18.854682 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:18.854654 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b8b4cb66b-bdt8v"] Apr 16 14:08:43.875780 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:43.875740 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6b8b4cb66b-bdt8v" podUID="47dfbf16-5bbb-4abc-a41c-eafb4724d50a" 
containerName="console" containerID="cri-o://a811a7a50326af29e603550a421ba676be79fe0005be99aa753dd40beb71aa98" gracePeriod=15 Apr 16 14:08:44.116588 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.116568 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b8b4cb66b-bdt8v_47dfbf16-5bbb-4abc-a41c-eafb4724d50a/console/0.log" Apr 16 14:08:44.116704 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.116631 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b8b4cb66b-bdt8v" Apr 16 14:08:44.148596 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.148523 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7k4z\" (UniqueName: \"kubernetes.io/projected/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-kube-api-access-f7k4z\") pod \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\" (UID: \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\") " Apr 16 14:08:44.148596 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.148573 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-oauth-serving-cert\") pod \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\" (UID: \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\") " Apr 16 14:08:44.148790 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.148600 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-trusted-ca-bundle\") pod \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\" (UID: \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\") " Apr 16 14:08:44.148790 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.148619 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-service-ca\") pod \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\" (UID: \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\") " Apr 16 14:08:44.148790 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.148639 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-console-oauth-config\") pod \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\" (UID: \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\") " Apr 16 14:08:44.148790 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.148678 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-console-serving-cert\") pod \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\" (UID: \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\") " Apr 16 14:08:44.148790 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.148701 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-console-config\") pod \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\" (UID: \"47dfbf16-5bbb-4abc-a41c-eafb4724d50a\") " Apr 16 14:08:44.149232 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.149201 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "47dfbf16-5bbb-4abc-a41c-eafb4724d50a" (UID: 
"47dfbf16-5bbb-4abc-a41c-eafb4724d50a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:08:44.149371 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.149312 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "47dfbf16-5bbb-4abc-a41c-eafb4724d50a" (UID: "47dfbf16-5bbb-4abc-a41c-eafb4724d50a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:08:44.149457 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.149400 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-console-config" (OuterVolumeSpecName: "console-config") pod "47dfbf16-5bbb-4abc-a41c-eafb4724d50a" (UID: "47dfbf16-5bbb-4abc-a41c-eafb4724d50a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:08:44.149517 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.149470 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-service-ca" (OuterVolumeSpecName: "service-ca") pod "47dfbf16-5bbb-4abc-a41c-eafb4724d50a" (UID: "47dfbf16-5bbb-4abc-a41c-eafb4724d50a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:08:44.150731 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.150698 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "47dfbf16-5bbb-4abc-a41c-eafb4724d50a" (UID: "47dfbf16-5bbb-4abc-a41c-eafb4724d50a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:08:44.150731 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.150717 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-kube-api-access-f7k4z" (OuterVolumeSpecName: "kube-api-access-f7k4z") pod "47dfbf16-5bbb-4abc-a41c-eafb4724d50a" (UID: "47dfbf16-5bbb-4abc-a41c-eafb4724d50a"). InnerVolumeSpecName "kube-api-access-f7k4z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:08:44.150897 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.150802 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "47dfbf16-5bbb-4abc-a41c-eafb4724d50a" (UID: "47dfbf16-5bbb-4abc-a41c-eafb4724d50a"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:08:44.250039 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.250010 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-console-serving-cert\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:08:44.250039 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.250037 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-console-config\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:08:44.250237 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.250047 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f7k4z\" (UniqueName: \"kubernetes.io/projected/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-kube-api-access-f7k4z\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:08:44.250237 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.250057 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-oauth-serving-cert\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:08:44.250237 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.250066 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-trusted-ca-bundle\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:08:44.250237 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.250075 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-service-ca\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:08:44.250237 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.250084 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/47dfbf16-5bbb-4abc-a41c-eafb4724d50a-console-oauth-config\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:08:44.891966 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.891939 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b8b4cb66b-bdt8v_47dfbf16-5bbb-4abc-a41c-eafb4724d50a/console/0.log" Apr 16 14:08:44.892379 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.891989 2575 generic.go:358] "Generic (PLEG): container finished" podID="47dfbf16-5bbb-4abc-a41c-eafb4724d50a" containerID="a811a7a50326af29e603550a421ba676be79fe0005be99aa753dd40beb71aa98" exitCode=2 Apr 16 14:08:44.892379 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.892048 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b8b4cb66b-bdt8v" event={"ID":"47dfbf16-5bbb-4abc-a41c-eafb4724d50a","Type":"ContainerDied","Data":"a811a7a50326af29e603550a421ba676be79fe0005be99aa753dd40beb71aa98"} Apr 16 14:08:44.892379 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.892072 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b8b4cb66b-bdt8v" event={"ID":"47dfbf16-5bbb-4abc-a41c-eafb4724d50a","Type":"ContainerDied","Data":"022b09f7710683aeb47e10eec6d69251514d403a75fa595508fb6e4c6b40bd78"} Apr 16 14:08:44.892379 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.892078 2575 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b8b4cb66b-bdt8v" Apr 16 14:08:44.892379 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.892085 2575 scope.go:117] "RemoveContainer" containerID="a811a7a50326af29e603550a421ba676be79fe0005be99aa753dd40beb71aa98" Apr 16 14:08:44.900153 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.900136 2575 scope.go:117] "RemoveContainer" containerID="a811a7a50326af29e603550a421ba676be79fe0005be99aa753dd40beb71aa98" Apr 16 14:08:44.900386 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:08:44.900368 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a811a7a50326af29e603550a421ba676be79fe0005be99aa753dd40beb71aa98\": container with ID starting with a811a7a50326af29e603550a421ba676be79fe0005be99aa753dd40beb71aa98 not found: ID does not exist" containerID="a811a7a50326af29e603550a421ba676be79fe0005be99aa753dd40beb71aa98" Apr 16 14:08:44.900432 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.900394 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a811a7a50326af29e603550a421ba676be79fe0005be99aa753dd40beb71aa98"} err="failed to get container status \"a811a7a50326af29e603550a421ba676be79fe0005be99aa753dd40beb71aa98\": rpc error: code = NotFound desc = could not find container \"a811a7a50326af29e603550a421ba676be79fe0005be99aa753dd40beb71aa98\": container with ID starting with a811a7a50326af29e603550a421ba676be79fe0005be99aa753dd40beb71aa98 not found: ID does not exist" Apr 16 14:08:44.911340 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.911317 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b8b4cb66b-bdt8v"] Apr 16 14:08:44.916654 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:44.916634 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6b8b4cb66b-bdt8v"] Apr 16 14:08:45.206225 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:45.206140 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47dfbf16-5bbb-4abc-a41c-eafb4724d50a" path="/var/lib/kubelet/pods/47dfbf16-5bbb-4abc-a41c-eafb4724d50a/volumes" Apr 16 14:08:53.108419 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:53.108391 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cdqzf_a2aa2b87-716c-4b0a-abde-a69d0c373e83/ovn-acl-logging/0.log" Apr 16 14:08:53.108853 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:08:53.108440 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cdqzf_a2aa2b87-716c-4b0a-abde-a69d0c373e83/ovn-acl-logging/0.log" Apr 16 14:11:54.937039 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:11:54.937007 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql"] Apr 16 14:11:54.937541 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:11:54.937329 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47dfbf16-5bbb-4abc-a41c-eafb4724d50a" containerName="console" Apr 16 14:11:54.937541 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:11:54.937341 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="47dfbf16-5bbb-4abc-a41c-eafb4724d50a" containerName="console" Apr 16 14:11:54.937541 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:11:54.937387 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="47dfbf16-5bbb-4abc-a41c-eafb4724d50a" 
containerName="console" Apr 16 14:11:54.942899 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:11:54.942879 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql" Apr 16 14:11:54.945265 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:11:54.945236 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-3af68-serving-cert\"" Apr 16 14:11:54.945446 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:11:54.945427 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 14:11:54.945527 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:11:54.945500 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-x6x2v\"" Apr 16 14:11:54.945588 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:11:54.945525 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-3af68-kube-rbac-proxy-sar-config\"" Apr 16 14:11:54.949205 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:11:54.949182 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ec026471-7a50-4527-bade-8ca250ab1679-proxy-tls\") pod \"model-chainer-raw-3af68-fffcc59fc-72kql\" (UID: \"ec026471-7a50-4527-bade-8ca250ab1679\") " pod="kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql" Apr 16 14:11:54.949322 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:11:54.949242 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec026471-7a50-4527-bade-8ca250ab1679-openshift-service-ca-bundle\") pod \"model-chainer-raw-3af68-fffcc59fc-72kql\" (UID: \"ec026471-7a50-4527-bade-8ca250ab1679\") " pod="kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql" Apr 16 14:11:54.951410 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:11:54.951385 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql"] Apr 16 14:11:55.050023 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:11:55.049992 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec026471-7a50-4527-bade-8ca250ab1679-openshift-service-ca-bundle\") pod \"model-chainer-raw-3af68-fffcc59fc-72kql\" (UID: \"ec026471-7a50-4527-bade-8ca250ab1679\") " pod="kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql" Apr 16 14:11:55.050207 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:11:55.050056 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ec026471-7a50-4527-bade-8ca250ab1679-proxy-tls\") pod \"model-chainer-raw-3af68-fffcc59fc-72kql\" (UID: \"ec026471-7a50-4527-bade-8ca250ab1679\") " pod="kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql" Apr 16 14:11:55.050207 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:11:55.050161 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-3af68-serving-cert: secret "model-chainer-raw-3af68-serving-cert" not found Apr 16 14:11:55.050292 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:11:55.050220 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/ec026471-7a50-4527-bade-8ca250ab1679-proxy-tls podName:ec026471-7a50-4527-bade-8ca250ab1679 nodeName:}" failed. No retries permitted until 2026-04-16 14:11:55.550205014 +0000 UTC m=+782.886304814 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/ec026471-7a50-4527-bade-8ca250ab1679-proxy-tls") pod "model-chainer-raw-3af68-fffcc59fc-72kql" (UID: "ec026471-7a50-4527-bade-8ca250ab1679") : secret "model-chainer-raw-3af68-serving-cert" not found Apr 16 14:11:55.050639 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:11:55.050619 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec026471-7a50-4527-bade-8ca250ab1679-openshift-service-ca-bundle\") pod \"model-chainer-raw-3af68-fffcc59fc-72kql\" (UID: \"ec026471-7a50-4527-bade-8ca250ab1679\") " pod="kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql" Apr 16 14:11:55.555423 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:11:55.555367 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ec026471-7a50-4527-bade-8ca250ab1679-proxy-tls\") pod \"model-chainer-raw-3af68-fffcc59fc-72kql\" (UID: \"ec026471-7a50-4527-bade-8ca250ab1679\") " pod="kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql" Apr 16 14:11:55.557727 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:11:55.557704 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ec026471-7a50-4527-bade-8ca250ab1679-proxy-tls\") pod \"model-chainer-raw-3af68-fffcc59fc-72kql\" (UID: \"ec026471-7a50-4527-bade-8ca250ab1679\") " pod="kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql" Apr 16 14:11:55.853291 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:11:55.853246 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql" Apr 16 14:11:55.972545 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:11:55.972507 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql"] Apr 16 14:11:55.974836 ip-10-0-130-98 kubenswrapper[2575]: W0416 14:11:55.974808 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec026471_7a50_4527_bade_8ca250ab1679.slice/crio-c3ec9311fae1f8f7224f4a74f557f90acca1f610d71937d395165a7e5acdfd31 WatchSource:0}: Error finding container c3ec9311fae1f8f7224f4a74f557f90acca1f610d71937d395165a7e5acdfd31: Status 404 returned error can't find the container with id c3ec9311fae1f8f7224f4a74f557f90acca1f610d71937d395165a7e5acdfd31 Apr 16 14:11:55.977124 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:11:55.977108 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:11:56.469624 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:11:56.469586 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql" event={"ID":"ec026471-7a50-4527-bade-8ca250ab1679","Type":"ContainerStarted","Data":"c3ec9311fae1f8f7224f4a74f557f90acca1f610d71937d395165a7e5acdfd31"} Apr 16 14:11:58.476922 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:11:58.476834 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql" event={"ID":"ec026471-7a50-4527-bade-8ca250ab1679","Type":"ContainerStarted","Data":"59bde9684e7293ad69112ef6bbc0b00af532370454c48841cb29c047f01d6f09"} Apr 16 14:11:58.476922 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:11:58.476885 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql" Apr 16 14:11:58.493580 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:11:58.493532 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql" podStartSLOduration=2.285597264 podStartE2EDuration="4.493518643s" podCreationTimestamp="2026-04-16 14:11:54 +0000 UTC" firstStartedPulling="2026-04-16 14:11:55.977229594 +0000 UTC m=+783.313329395" lastFinishedPulling="2026-04-16 14:11:58.18515097 +0000 UTC m=+785.521250774" observedRunningTime="2026-04-16 14:11:58.491506547 +0000 UTC m=+785.827606369" watchObservedRunningTime="2026-04-16 14:11:58.493518643 +0000 UTC m=+785.829618465" Apr 16 14:12:04.485577 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:12:04.485549 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql" Apr 16 14:12:04.979086 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:12:04.979056 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql"] Apr 16 14:12:04.979291 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:12:04.979267 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql" podUID="ec026471-7a50-4527-bade-8ca250ab1679" containerName="model-chainer-raw-3af68" containerID="cri-o://59bde9684e7293ad69112ef6bbc0b00af532370454c48841cb29c047f01d6f09" gracePeriod=30 Apr 16 14:12:09.484317 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:12:09.484276 2575 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql" podUID="ec026471-7a50-4527-bade-8ca250ab1679" containerName="model-chainer-raw-3af68" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:12:14.484884 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:12:14.484788 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql" podUID="ec026471-7a50-4527-bade-8ca250ab1679" containerName="model-chainer-raw-3af68" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:12:19.485609 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:12:19.485562 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql" podUID="ec026471-7a50-4527-bade-8ca250ab1679" containerName="model-chainer-raw-3af68" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:12:19.486009 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:12:19.485662 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql" Apr 16 14:12:24.483845 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:12:24.483804 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql" podUID="ec026471-7a50-4527-bade-8ca250ab1679" containerName="model-chainer-raw-3af68" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:12:29.484135 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:12:29.484077 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql" podUID="ec026471-7a50-4527-bade-8ca250ab1679" containerName="model-chainer-raw-3af68" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:12:34.484490 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:12:34.484453 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql" podUID="ec026471-7a50-4527-bade-8ca250ab1679" containerName="model-chainer-raw-3af68" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:12:35.593189 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:12:35.593159 2575 generic.go:358] "Generic (PLEG): container finished" podID="ec026471-7a50-4527-bade-8ca250ab1679" containerID="59bde9684e7293ad69112ef6bbc0b00af532370454c48841cb29c047f01d6f09" exitCode=0 Apr 16 14:12:35.593535 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:12:35.593212 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql" event={"ID":"ec026471-7a50-4527-bade-8ca250ab1679","Type":"ContainerDied","Data":"59bde9684e7293ad69112ef6bbc0b00af532370454c48841cb29c047f01d6f09"} Apr 16 14:12:35.618387 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:12:35.618364 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql" Apr 16 14:12:35.782428 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:12:35.782346 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ec026471-7a50-4527-bade-8ca250ab1679-proxy-tls\") pod \"ec026471-7a50-4527-bade-8ca250ab1679\" (UID: \"ec026471-7a50-4527-bade-8ca250ab1679\") " Apr 16 14:12:35.782428 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:12:35.782398 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec026471-7a50-4527-bade-8ca250ab1679-openshift-service-ca-bundle\") pod \"ec026471-7a50-4527-bade-8ca250ab1679\" (UID: \"ec026471-7a50-4527-bade-8ca250ab1679\") " Apr 16 14:12:35.782762 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:12:35.782741 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec026471-7a50-4527-bade-8ca250ab1679-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "ec026471-7a50-4527-bade-8ca250ab1679" (UID: "ec026471-7a50-4527-bade-8ca250ab1679"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:12:35.784378 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:12:35.784355 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec026471-7a50-4527-bade-8ca250ab1679-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ec026471-7a50-4527-bade-8ca250ab1679" (UID: "ec026471-7a50-4527-bade-8ca250ab1679"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:12:35.883460 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:12:35.883423 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ec026471-7a50-4527-bade-8ca250ab1679-proxy-tls\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:12:35.883460 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:12:35.883457 2575 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec026471-7a50-4527-bade-8ca250ab1679-openshift-service-ca-bundle\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 16 14:12:36.596884 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:12:36.596845 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql" event={"ID":"ec026471-7a50-4527-bade-8ca250ab1679","Type":"ContainerDied","Data":"c3ec9311fae1f8f7224f4a74f557f90acca1f610d71937d395165a7e5acdfd31"} Apr 16 14:12:36.596884 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:12:36.596868 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql" Apr 16 14:12:36.596884 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:12:36.596890 2575 scope.go:117] "RemoveContainer" containerID="59bde9684e7293ad69112ef6bbc0b00af532370454c48841cb29c047f01d6f09" Apr 16 14:12:36.616787 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:12:36.616763 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql"] Apr 16 14:12:36.619771 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:12:36.619750 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-3af68-fffcc59fc-72kql"] Apr 16 14:12:37.207153 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:12:37.207118 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec026471-7a50-4527-bade-8ca250ab1679" path="/var/lib/kubelet/pods/ec026471-7a50-4527-bade-8ca250ab1679/volumes" Apr 16 14:13:35.212707 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:35.212674 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m"] Apr 16 14:13:35.213187 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:35.212940 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec026471-7a50-4527-bade-8ca250ab1679" containerName="model-chainer-raw-3af68" Apr 16 14:13:35.213187 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:35.212951 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec026471-7a50-4527-bade-8ca250ab1679" containerName="model-chainer-raw-3af68" Apr 16 14:13:35.213187 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:35.213002 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec026471-7a50-4527-bade-8ca250ab1679" containerName="model-chainer-raw-3af68" Apr 16 14:13:35.214574 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:35.214558 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m" Apr 16 14:13:35.216988 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:35.216955 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-3c9de-kube-rbac-proxy-sar-config\"" Apr 16 14:13:35.216988 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:35.216968 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 14:13:35.217180 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:35.216956 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-3c9de-serving-cert\"" Apr 16 14:13:35.217180 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:35.217052 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-x6x2v\"" Apr 16 14:13:35.220927 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:35.220889 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m"] Apr 16 14:13:35.337854 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:35.337818 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/faf44163-4949-45b9-aa49-f266586451eb-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m\" (UID: \"faf44163-4949-45b9-aa49-f266586451eb\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m" Apr 16 14:13:35.338031 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:35.337875 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/faf44163-4949-45b9-aa49-f266586451eb-proxy-tls\") pod \"model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m\" (UID: \"faf44163-4949-45b9-aa49-f266586451eb\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m" Apr 16 14:13:35.439145 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:35.439082 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/faf44163-4949-45b9-aa49-f266586451eb-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m\" (UID: \"faf44163-4949-45b9-aa49-f266586451eb\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m" Apr 16 14:13:35.439301 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:35.439182 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/faf44163-4949-45b9-aa49-f266586451eb-proxy-tls\") pod \"model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m\" (UID: \"faf44163-4949-45b9-aa49-f266586451eb\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m" Apr 16 14:13:35.439719 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:35.439692 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/faf44163-4949-45b9-aa49-f266586451eb-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m\" (UID: \"faf44163-4949-45b9-aa49-f266586451eb\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m" Apr 16 14:13:35.441526 
ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:35.441508 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/faf44163-4949-45b9-aa49-f266586451eb-proxy-tls\") pod \"model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m\" (UID: \"faf44163-4949-45b9-aa49-f266586451eb\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m" Apr 16 14:13:35.526934 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:35.526832 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m" Apr 16 14:13:35.642136 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:35.642106 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m"] Apr 16 14:13:35.645983 ip-10-0-130-98 kubenswrapper[2575]: W0416 14:13:35.645954 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfaf44163_4949_45b9_aa49_f266586451eb.slice/crio-1aa87b024d4f8b00d009e1c52ec1924b1bc7e7f51e8b49ffc3a87b9382e5ef01 WatchSource:0}: Error finding container 1aa87b024d4f8b00d009e1c52ec1924b1bc7e7f51e8b49ffc3a87b9382e5ef01: Status 404 returned error can't find the container with id 1aa87b024d4f8b00d009e1c52ec1924b1bc7e7f51e8b49ffc3a87b9382e5ef01 Apr 16 14:13:35.775962 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:35.775930 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m" event={"ID":"faf44163-4949-45b9-aa49-f266586451eb","Type":"ContainerStarted","Data":"7b2edaf2710d576ed142fd848f716722e45f006d407b14bc35f6250bf9c8dccb"} Apr 16 14:13:35.775962 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:35.775967 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m" event={"ID":"faf44163-4949-45b9-aa49-f266586451eb","Type":"ContainerStarted","Data":"1aa87b024d4f8b00d009e1c52ec1924b1bc7e7f51e8b49ffc3a87b9382e5ef01"} Apr 16 14:13:35.776184 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:35.775993 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m" Apr 16 14:13:35.791790 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:35.791702 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m" podStartSLOduration=0.791689668 podStartE2EDuration="791.689668ms" podCreationTimestamp="2026-04-16 14:13:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:13:35.790206495 +0000 UTC m=+883.126306316" watchObservedRunningTime="2026-04-16 14:13:35.791689668 +0000 UTC m=+883.127789490" Apr 16 14:13:41.785002 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:41.784969 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m" Apr 16 14:13:45.270787 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:45.270699 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m"] Apr 16 14:13:45.271195 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:45.270957 2575 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m" podUID="faf44163-4949-45b9-aa49-f266586451eb" containerName="model-chainer-raw-hpa-3c9de" containerID="cri-o://7b2edaf2710d576ed142fd848f716722e45f006d407b14bc35f6250bf9c8dccb" gracePeriod=30 Apr 16 14:13:45.430212 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:45.430174 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-a0d01-predictor-6455c7c84d-kslv2"] Apr 16 14:13:45.432394 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:45.432375 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-a0d01-predictor-6455c7c84d-kslv2" Apr 16 14:13:45.440554 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:45.440120 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-a0d01-predictor-6455c7c84d-kslv2"] Apr 16 14:13:45.443367 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:45.443348 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-a0d01-predictor-6455c7c84d-kslv2" Apr 16 14:13:45.576701 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:45.576673 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-a0d01-predictor-6455c7c84d-kslv2"] Apr 16 14:13:45.578982 ip-10-0-130-98 kubenswrapper[2575]: W0416 14:13:45.578957 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41c93375_a90d_4947_88f3_a086a4c482f7.slice/crio-6f7612c6b7e85bd39c603b522d7d4a3284f59bcf3b472027bd2dac99826abbc1 WatchSource:0}: Error finding container 6f7612c6b7e85bd39c603b522d7d4a3284f59bcf3b472027bd2dac99826abbc1: Status 404 returned error can't find the container with id 6f7612c6b7e85bd39c603b522d7d4a3284f59bcf3b472027bd2dac99826abbc1 Apr 16 14:13:45.807732 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:45.807644 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-a0d01-predictor-6455c7c84d-kslv2" event={"ID":"41c93375-a90d-4947-88f3-a086a4c482f7","Type":"ContainerStarted","Data":"6f7612c6b7e85bd39c603b522d7d4a3284f59bcf3b472027bd2dac99826abbc1"} Apr 16 14:13:46.783233 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:46.783192 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m" podUID="faf44163-4949-45b9-aa49-f266586451eb" containerName="model-chainer-raw-hpa-3c9de" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:13:46.811994 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:46.811962 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-a0d01-predictor-6455c7c84d-kslv2" event={"ID":"41c93375-a90d-4947-88f3-a086a4c482f7","Type":"ContainerStarted","Data":"0e4ac8ca9f3b93e6ea6e0802b987df7e6a428e39eea9997bb617fdf340d397d3"} Apr 16 14:13:46.812188 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:46.812169 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-a0d01-predictor-6455c7c84d-kslv2" Apr 16 14:13:46.814187 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:46.814167 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-a0d01-predictor-6455c7c84d-kslv2" Apr 16 14:13:46.827504 ip-10-0-130-98 kubenswrapper[2575]: 
I0416 14:13:46.827425 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-raw-a0d01-predictor-6455c7c84d-kslv2" podStartSLOduration=0.829651415 podStartE2EDuration="1.827411484s" podCreationTimestamp="2026-04-16 14:13:45 +0000 UTC" firstStartedPulling="2026-04-16 14:13:45.580832806 +0000 UTC m=+892.916932606" lastFinishedPulling="2026-04-16 14:13:46.578592875 +0000 UTC m=+893.914692675" observedRunningTime="2026-04-16 14:13:46.826578538 +0000 UTC m=+894.162678364" watchObservedRunningTime="2026-04-16 14:13:46.827411484 +0000 UTC m=+894.163511303"
Apr 16 14:13:51.783486 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:51.783447 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m" podUID="faf44163-4949-45b9-aa49-f266586451eb" containerName="model-chainer-raw-hpa-3c9de" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:13:53.127858 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:53.127829 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cdqzf_a2aa2b87-716c-4b0a-abde-a69d0c373e83/ovn-acl-logging/0.log"
Apr 16 14:13:53.128286 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:53.127968 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cdqzf_a2aa2b87-716c-4b0a-abde-a69d0c373e83/ovn-acl-logging/0.log"
Apr 16 14:13:56.783358 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:56.783319 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m" podUID="faf44163-4949-45b9-aa49-f266586451eb" containerName="model-chainer-raw-hpa-3c9de" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:13:56.783735 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:13:56.783438 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m"
Apr 16 14:14:01.783333 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:14:01.783295 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m" podUID="faf44163-4949-45b9-aa49-f266586451eb" containerName="model-chainer-raw-hpa-3c9de" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:14:06.783572 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:14:06.783524 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m" podUID="faf44163-4949-45b9-aa49-f266586451eb" containerName="model-chainer-raw-hpa-3c9de" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:14:11.783885 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:14:11.783842 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m" podUID="faf44163-4949-45b9-aa49-f266586451eb" containerName="model-chainer-raw-hpa-3c9de" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:14:15.294692 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:14:15.294649 2575 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfaf44163_4949_45b9_aa49_f266586451eb.slice/crio-1aa87b024d4f8b00d009e1c52ec1924b1bc7e7f51e8b49ffc3a87b9382e5ef01\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfaf44163_4949_45b9_aa49_f266586451eb.slice/crio-7b2edaf2710d576ed142fd848f716722e45f006d407b14bc35f6250bf9c8dccb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfaf44163_4949_45b9_aa49_f266586451eb.slice/crio-conmon-7b2edaf2710d576ed142fd848f716722e45f006d407b14bc35f6250bf9c8dccb.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 14:14:15.294692 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:14:15.294674 2575 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfaf44163_4949_45b9_aa49_f266586451eb.slice/crio-conmon-7b2edaf2710d576ed142fd848f716722e45f006d407b14bc35f6250bf9c8dccb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfaf44163_4949_45b9_aa49_f266586451eb.slice/crio-7b2edaf2710d576ed142fd848f716722e45f006d407b14bc35f6250bf9c8dccb.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 14:14:15.295085 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:14:15.294844 2575 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfaf44163_4949_45b9_aa49_f266586451eb.slice/crio-7b2edaf2710d576ed142fd848f716722e45f006d407b14bc35f6250bf9c8dccb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfaf44163_4949_45b9_aa49_f266586451eb.slice/crio-conmon-7b2edaf2710d576ed142fd848f716722e45f006d407b14bc35f6250bf9c8dccb.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 14:14:15.408485 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:14:15.408464 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m"
Apr 16 14:14:15.444594 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:14:15.444566 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/faf44163-4949-45b9-aa49-f266586451eb-openshift-service-ca-bundle\") pod \"faf44163-4949-45b9-aa49-f266586451eb\" (UID: \"faf44163-4949-45b9-aa49-f266586451eb\") "
Apr 16 14:14:15.444745 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:14:15.444629 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/faf44163-4949-45b9-aa49-f266586451eb-proxy-tls\") pod \"faf44163-4949-45b9-aa49-f266586451eb\" (UID: \"faf44163-4949-45b9-aa49-f266586451eb\") "
Apr 16 14:14:15.444955 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:14:15.444926 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/faf44163-4949-45b9-aa49-f266586451eb-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "faf44163-4949-45b9-aa49-f266586451eb" (UID: "faf44163-4949-45b9-aa49-f266586451eb"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:14:15.446645 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:14:15.446622 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faf44163-4949-45b9-aa49-f266586451eb-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "faf44163-4949-45b9-aa49-f266586451eb" (UID: "faf44163-4949-45b9-aa49-f266586451eb"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:14:15.545551 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:14:15.545469 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/faf44163-4949-45b9-aa49-f266586451eb-proxy-tls\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\""
Apr 16 14:14:15.545551 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:14:15.545502 2575 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/faf44163-4949-45b9-aa49-f266586451eb-openshift-service-ca-bundle\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\""
Apr 16 14:14:15.901988 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:14:15.901955 2575 generic.go:358] "Generic (PLEG): container finished" podID="faf44163-4949-45b9-aa49-f266586451eb" containerID="7b2edaf2710d576ed142fd848f716722e45f006d407b14bc35f6250bf9c8dccb" exitCode=137
Apr 16 14:14:15.902171 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:14:15.902025 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m"
Apr 16 14:14:15.902171 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:14:15.902043 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m" event={"ID":"faf44163-4949-45b9-aa49-f266586451eb","Type":"ContainerDied","Data":"7b2edaf2710d576ed142fd848f716722e45f006d407b14bc35f6250bf9c8dccb"}
Apr 16 14:14:15.902171 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:14:15.902083 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m" event={"ID":"faf44163-4949-45b9-aa49-f266586451eb","Type":"ContainerDied","Data":"1aa87b024d4f8b00d009e1c52ec1924b1bc7e7f51e8b49ffc3a87b9382e5ef01"}
Apr 16 14:14:15.902171 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:14:15.902117 2575 scope.go:117] "RemoveContainer" containerID="7b2edaf2710d576ed142fd848f716722e45f006d407b14bc35f6250bf9c8dccb"
Apr 16 14:14:15.909822 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:14:15.909802 2575 scope.go:117] "RemoveContainer" containerID="7b2edaf2710d576ed142fd848f716722e45f006d407b14bc35f6250bf9c8dccb"
Apr 16 14:14:15.910054 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:14:15.910033 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b2edaf2710d576ed142fd848f716722e45f006d407b14bc35f6250bf9c8dccb\": container with ID starting with 7b2edaf2710d576ed142fd848f716722e45f006d407b14bc35f6250bf9c8dccb not found: ID does not exist" containerID="7b2edaf2710d576ed142fd848f716722e45f006d407b14bc35f6250bf9c8dccb"
Apr 16 14:14:15.910166 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:14:15.910062 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b2edaf2710d576ed142fd848f716722e45f006d407b14bc35f6250bf9c8dccb"} err="failed to get container status \"7b2edaf2710d576ed142fd848f716722e45f006d407b14bc35f6250bf9c8dccb\": rpc error: code = NotFound desc = could not find container \"7b2edaf2710d576ed142fd848f716722e45f006d407b14bc35f6250bf9c8dccb\": container with ID starting with 7b2edaf2710d576ed142fd848f716722e45f006d407b14bc35f6250bf9c8dccb not found: ID does not exist"
Apr 16 14:14:15.922436 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:14:15.922415 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m"]
Apr 16 14:14:15.927829 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:14:15.927807 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3c9de-5fd8d44b5c-jfd5m"]
Apr 16 14:14:17.206402 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:14:17.206372 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faf44163-4949-45b9-aa49-f266586451eb" path="/var/lib/kubelet/pods/faf44163-4949-45b9-aa49-f266586451eb/volumes"
Apr 16 14:15:10.560531 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:15:10.560500 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-a0d01-predictor-6455c7c84d-kslv2_41c93375-a90d-4947-88f3-a086a4c482f7/kserve-container/0.log"
Apr 16 14:15:10.823619 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:15:10.823517 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-a0d01-predictor-6455c7c84d-kslv2"]
Apr 16 14:15:10.823869 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:15:10.823803 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-a0d01-predictor-6455c7c84d-kslv2" podUID="41c93375-a90d-4947-88f3-a086a4c482f7" containerName="kserve-container" containerID="cri-o://0e4ac8ca9f3b93e6ea6e0802b987df7e6a428e39eea9997bb617fdf340d397d3" gracePeriod=30
Apr 16 14:15:11.060598 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:15:11.060574 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-a0d01-predictor-6455c7c84d-kslv2"
Apr 16 14:15:11.075044 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:15:11.074979 2575 generic.go:358] "Generic (PLEG): container finished" podID="41c93375-a90d-4947-88f3-a086a4c482f7" containerID="0e4ac8ca9f3b93e6ea6e0802b987df7e6a428e39eea9997bb617fdf340d397d3" exitCode=2
Apr 16 14:15:11.075044 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:15:11.075006 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-a0d01-predictor-6455c7c84d-kslv2"
Apr 16 14:15:11.075044 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:15:11.075024 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-a0d01-predictor-6455c7c84d-kslv2" event={"ID":"41c93375-a90d-4947-88f3-a086a4c482f7","Type":"ContainerDied","Data":"0e4ac8ca9f3b93e6ea6e0802b987df7e6a428e39eea9997bb617fdf340d397d3"}
Apr 16 14:15:11.075276 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:15:11.075055 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-a0d01-predictor-6455c7c84d-kslv2" event={"ID":"41c93375-a90d-4947-88f3-a086a4c482f7","Type":"ContainerDied","Data":"6f7612c6b7e85bd39c603b522d7d4a3284f59bcf3b472027bd2dac99826abbc1"}
Apr 16 14:15:11.075276 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:15:11.075072 2575 scope.go:117] "RemoveContainer" containerID="0e4ac8ca9f3b93e6ea6e0802b987df7e6a428e39eea9997bb617fdf340d397d3"
Apr 16 14:15:11.082831 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:15:11.082809 2575 scope.go:117] "RemoveContainer" containerID="0e4ac8ca9f3b93e6ea6e0802b987df7e6a428e39eea9997bb617fdf340d397d3"
Apr 16 14:15:11.083143 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:15:11.083079 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e4ac8ca9f3b93e6ea6e0802b987df7e6a428e39eea9997bb617fdf340d397d3\": container with ID starting with 0e4ac8ca9f3b93e6ea6e0802b987df7e6a428e39eea9997bb617fdf340d397d3 not found: ID does not exist" containerID="0e4ac8ca9f3b93e6ea6e0802b987df7e6a428e39eea9997bb617fdf340d397d3"
Apr 16 14:15:11.083143 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:15:11.083130 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e4ac8ca9f3b93e6ea6e0802b987df7e6a428e39eea9997bb617fdf340d397d3"} err="failed to get container status \"0e4ac8ca9f3b93e6ea6e0802b987df7e6a428e39eea9997bb617fdf340d397d3\": rpc error: code = NotFound desc = could not find container \"0e4ac8ca9f3b93e6ea6e0802b987df7e6a428e39eea9997bb617fdf340d397d3\": container with ID starting with 0e4ac8ca9f3b93e6ea6e0802b987df7e6a428e39eea9997bb617fdf340d397d3 not found: ID does not exist"
Apr 16 14:15:11.095744 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:15:11.095717 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-a0d01-predictor-6455c7c84d-kslv2"]
Apr 16 14:15:11.099219 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:15:11.099199 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-a0d01-predictor-6455c7c84d-kslv2"]
Apr 16 14:15:11.206336 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:15:11.206299 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41c93375-a90d-4947-88f3-a086a4c482f7" path="/var/lib/kubelet/pods/41c93375-a90d-4947-88f3-a086a4c482f7/volumes"
Apr 16 14:18:53.146269 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:18:53.146241 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cdqzf_a2aa2b87-716c-4b0a-abde-a69d0c373e83/ovn-acl-logging/0.log"
Apr 16 14:18:53.148155 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:18:53.148135 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cdqzf_a2aa2b87-716c-4b0a-abde-a69d0c373e83/ovn-acl-logging/0.log"
Apr 16 14:22:09.129237 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:09.129204 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lh5n9/must-gather-zwcnd"]
Apr 16 14:22:09.129653 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:09.129481 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="faf44163-4949-45b9-aa49-f266586451eb" containerName="model-chainer-raw-hpa-3c9de"
Apr 16 14:22:09.129653 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:09.129491 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="faf44163-4949-45b9-aa49-f266586451eb" containerName="model-chainer-raw-hpa-3c9de"
Apr 16 14:22:09.129653 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:09.129516 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41c93375-a90d-4947-88f3-a086a4c482f7" containerName="kserve-container"
Apr 16 14:22:09.129653 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:09.129522 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="41c93375-a90d-4947-88f3-a086a4c482f7" containerName="kserve-container"
Apr 16 14:22:09.129653 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:09.129563 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="faf44163-4949-45b9-aa49-f266586451eb" containerName="model-chainer-raw-hpa-3c9de"
Apr 16 14:22:09.129653 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:09.129570 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="41c93375-a90d-4947-88f3-a086a4c482f7" containerName="kserve-container"
Apr 16 14:22:09.132579 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:09.132538 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lh5n9/must-gather-zwcnd"
Apr 16 14:22:09.134742 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:09.134723 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lh5n9\"/\"openshift-service-ca.crt\""
Apr 16 14:22:09.135604 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:09.135587 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-lh5n9\"/\"default-dockercfg-2ldmj\""
Apr 16 14:22:09.135665 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:09.135590 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lh5n9\"/\"kube-root-ca.crt\""
Apr 16 14:22:09.141249 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:09.141225 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lh5n9/must-gather-zwcnd"]
Apr 16 14:22:09.150968 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:09.150947 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zwnn\" (UniqueName: \"kubernetes.io/projected/dee60b84-c2d7-4649-bccf-260f83c63109-kube-api-access-4zwnn\") pod \"must-gather-zwcnd\" (UID: \"dee60b84-c2d7-4649-bccf-260f83c63109\") " pod="openshift-must-gather-lh5n9/must-gather-zwcnd"
Apr 16 14:22:09.151140 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:09.150987 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dee60b84-c2d7-4649-bccf-260f83c63109-must-gather-output\") pod \"must-gather-zwcnd\" (UID: \"dee60b84-c2d7-4649-bccf-260f83c63109\") " pod="openshift-must-gather-lh5n9/must-gather-zwcnd"
Apr 16 14:22:09.251596 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:09.251562 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zwnn\" (UniqueName: \"kubernetes.io/projected/dee60b84-c2d7-4649-bccf-260f83c63109-kube-api-access-4zwnn\") pod \"must-gather-zwcnd\" (UID: \"dee60b84-c2d7-4649-bccf-260f83c63109\") " pod="openshift-must-gather-lh5n9/must-gather-zwcnd"
Apr 16 14:22:09.251747 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:09.251603 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dee60b84-c2d7-4649-bccf-260f83c63109-must-gather-output\") pod \"must-gather-zwcnd\" (UID: \"dee60b84-c2d7-4649-bccf-260f83c63109\") " pod="openshift-must-gather-lh5n9/must-gather-zwcnd"
Apr 16 14:22:09.251916 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:09.251900 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dee60b84-c2d7-4649-bccf-260f83c63109-must-gather-output\") pod \"must-gather-zwcnd\" (UID: \"dee60b84-c2d7-4649-bccf-260f83c63109\") " pod="openshift-must-gather-lh5n9/must-gather-zwcnd"
Apr 16 14:22:09.263523 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:09.263489 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zwnn\" (UniqueName: \"kubernetes.io/projected/dee60b84-c2d7-4649-bccf-260f83c63109-kube-api-access-4zwnn\") pod \"must-gather-zwcnd\" (UID: \"dee60b84-c2d7-4649-bccf-260f83c63109\") " pod="openshift-must-gather-lh5n9/must-gather-zwcnd"
Apr 16 14:22:09.442363 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:09.442277 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lh5n9/must-gather-zwcnd"
Apr 16 14:22:09.558243 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:09.558125 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lh5n9/must-gather-zwcnd"]
Apr 16 14:22:09.560789 ip-10-0-130-98 kubenswrapper[2575]: W0416 14:22:09.560757 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddee60b84_c2d7_4649_bccf_260f83c63109.slice/crio-71a996982a7ea5b07c3837d225ac8430f789188af92b182a50e32cd47d4b9482 WatchSource:0}: Error finding container 71a996982a7ea5b07c3837d225ac8430f789188af92b182a50e32cd47d4b9482: Status 404 returned error can't find the container with id 71a996982a7ea5b07c3837d225ac8430f789188af92b182a50e32cd47d4b9482
Apr 16 14:22:09.562439 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:09.562424 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 14:22:10.377063 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:10.377027 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lh5n9/must-gather-zwcnd" event={"ID":"dee60b84-c2d7-4649-bccf-260f83c63109","Type":"ContainerStarted","Data":"71a996982a7ea5b07c3837d225ac8430f789188af92b182a50e32cd47d4b9482"}
Apr 16 14:22:14.393776 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:14.393731 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lh5n9/must-gather-zwcnd" event={"ID":"dee60b84-c2d7-4649-bccf-260f83c63109","Type":"ContainerStarted","Data":"d245625693a866d5e2dce21f6c86f74f13b7e5674400e547273dc768f94a0bb5"}
Apr 16 14:22:14.394232 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:14.393783 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lh5n9/must-gather-zwcnd" event={"ID":"dee60b84-c2d7-4649-bccf-260f83c63109","Type":"ContainerStarted","Data":"a68bafc06b9f71d8bac6695e161d193e1a4c53e2743ba047b5e68f71a6252750"}
Apr 16 14:22:14.410623 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:14.410567 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lh5n9/must-gather-zwcnd" podStartSLOduration=1.351940689 podStartE2EDuration="5.410550719s" podCreationTimestamp="2026-04-16 14:22:09 +0000 UTC" firstStartedPulling="2026-04-16 14:22:09.562546138 +0000 UTC m=+1396.898645938" lastFinishedPulling="2026-04-16 14:22:13.621156168 +0000 UTC m=+1400.957255968" observedRunningTime="2026-04-16 14:22:14.407758058 +0000 UTC m=+1401.743857881" watchObservedRunningTime="2026-04-16 14:22:14.410550719 +0000 UTC m=+1401.746650544"
Apr 16 14:22:31.449408 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:31.449375 2575 generic.go:358] "Generic (PLEG): container finished" podID="dee60b84-c2d7-4649-bccf-260f83c63109" containerID="a68bafc06b9f71d8bac6695e161d193e1a4c53e2743ba047b5e68f71a6252750" exitCode=0
Apr 16 14:22:31.449891 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:31.449435 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lh5n9/must-gather-zwcnd" event={"ID":"dee60b84-c2d7-4649-bccf-260f83c63109","Type":"ContainerDied","Data":"a68bafc06b9f71d8bac6695e161d193e1a4c53e2743ba047b5e68f71a6252750"}
Apr 16 14:22:31.449891 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:31.449745 2575 scope.go:117] "RemoveContainer" containerID="a68bafc06b9f71d8bac6695e161d193e1a4c53e2743ba047b5e68f71a6252750"
Apr 16 14:22:31.641521 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:31.641488 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lh5n9_must-gather-zwcnd_dee60b84-c2d7-4649-bccf-260f83c63109/gather/0.log"
Apr 16 14:22:34.741677 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:34.741647 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-227h4_e3464c02-724a-403f-a6b4-6482e6283147/global-pull-secret-syncer/0.log"
Apr 16 14:22:34.923464 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:34.923433 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-mt2qp_50856c3f-d1b0-4ecc-9979-f3d585acf87b/konnectivity-agent/0.log"
Apr 16 14:22:34.990760 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:34.990726 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-98.ec2.internal_067b94d6835fdbad399c48af24ac5253/haproxy/0.log"
Apr 16 14:22:37.005512 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:37.005478 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lh5n9/must-gather-zwcnd"]
Apr 16 14:22:37.006407 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:37.005721 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-lh5n9/must-gather-zwcnd" podUID="dee60b84-c2d7-4649-bccf-260f83c63109" containerName="copy" containerID="cri-o://d245625693a866d5e2dce21f6c86f74f13b7e5674400e547273dc768f94a0bb5" gracePeriod=2
Apr 16 14:22:37.007728 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:37.007695 2575 status_manager.go:895] "Failed to get status for pod" podUID="dee60b84-c2d7-4649-bccf-260f83c63109" pod="openshift-must-gather-lh5n9/must-gather-zwcnd" err="pods \"must-gather-zwcnd\" is forbidden: User \"system:node:ip-10-0-130-98.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-lh5n9\": no relationship found between node 'ip-10-0-130-98.ec2.internal' and this object"
Apr 16 14:22:37.007728 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:37.007719 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lh5n9/must-gather-zwcnd"]
Apr 16 14:22:37.231556 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:37.231534 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lh5n9_must-gather-zwcnd_dee60b84-c2d7-4649-bccf-260f83c63109/copy/0.log"
Apr 16 14:22:37.231872 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:37.231857 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lh5n9/must-gather-zwcnd"
Apr 16 14:22:37.279334 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:37.279279 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dee60b84-c2d7-4649-bccf-260f83c63109-must-gather-output\") pod \"dee60b84-c2d7-4649-bccf-260f83c63109\" (UID: \"dee60b84-c2d7-4649-bccf-260f83c63109\") "
Apr 16 14:22:37.279415 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:37.279360 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zwnn\" (UniqueName: \"kubernetes.io/projected/dee60b84-c2d7-4649-bccf-260f83c63109-kube-api-access-4zwnn\") pod \"dee60b84-c2d7-4649-bccf-260f83c63109\" (UID: \"dee60b84-c2d7-4649-bccf-260f83c63109\") "
Apr 16 14:22:37.280544 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:37.280515 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dee60b84-c2d7-4649-bccf-260f83c63109-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "dee60b84-c2d7-4649-bccf-260f83c63109" (UID: "dee60b84-c2d7-4649-bccf-260f83c63109"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:22:37.281530 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:37.281506 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dee60b84-c2d7-4649-bccf-260f83c63109-kube-api-access-4zwnn" (OuterVolumeSpecName: "kube-api-access-4zwnn") pod "dee60b84-c2d7-4649-bccf-260f83c63109" (UID: "dee60b84-c2d7-4649-bccf-260f83c63109"). InnerVolumeSpecName "kube-api-access-4zwnn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:22:37.380306 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:37.380274 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4zwnn\" (UniqueName: \"kubernetes.io/projected/dee60b84-c2d7-4649-bccf-260f83c63109-kube-api-access-4zwnn\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\""
Apr 16 14:22:37.380306 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:37.380303 2575 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dee60b84-c2d7-4649-bccf-260f83c63109-must-gather-output\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\""
Apr 16 14:22:37.467576 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:37.467548 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lh5n9_must-gather-zwcnd_dee60b84-c2d7-4649-bccf-260f83c63109/copy/0.log"
Apr 16 14:22:37.467876 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:37.467854 2575 generic.go:358] "Generic (PLEG): container finished" podID="dee60b84-c2d7-4649-bccf-260f83c63109" containerID="d245625693a866d5e2dce21f6c86f74f13b7e5674400e547273dc768f94a0bb5" exitCode=143
Apr 16 14:22:37.467947 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:37.467905 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lh5n9/must-gather-zwcnd"
Apr 16 14:22:37.467947 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:37.467933 2575 scope.go:117] "RemoveContainer" containerID="d245625693a866d5e2dce21f6c86f74f13b7e5674400e547273dc768f94a0bb5"
Apr 16 14:22:37.475085 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:37.475065 2575 scope.go:117] "RemoveContainer" containerID="a68bafc06b9f71d8bac6695e161d193e1a4c53e2743ba047b5e68f71a6252750"
Apr 16 14:22:37.486883 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:37.486863 2575 scope.go:117] "RemoveContainer" containerID="d245625693a866d5e2dce21f6c86f74f13b7e5674400e547273dc768f94a0bb5"
Apr 16 14:22:37.487119 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:22:37.487079 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d245625693a866d5e2dce21f6c86f74f13b7e5674400e547273dc768f94a0bb5\": container with ID starting with d245625693a866d5e2dce21f6c86f74f13b7e5674400e547273dc768f94a0bb5 not found: ID does not exist" containerID="d245625693a866d5e2dce21f6c86f74f13b7e5674400e547273dc768f94a0bb5"
Apr 16 14:22:37.487186 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:37.487132 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d245625693a866d5e2dce21f6c86f74f13b7e5674400e547273dc768f94a0bb5"} err="failed to get container status \"d245625693a866d5e2dce21f6c86f74f13b7e5674400e547273dc768f94a0bb5\": rpc error: code = NotFound desc = could not find container \"d245625693a866d5e2dce21f6c86f74f13b7e5674400e547273dc768f94a0bb5\": container with ID starting with d245625693a866d5e2dce21f6c86f74f13b7e5674400e547273dc768f94a0bb5 not found: ID does not exist"
Apr 16 14:22:37.487186 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:37.487158 2575 scope.go:117] "RemoveContainer" containerID="a68bafc06b9f71d8bac6695e161d193e1a4c53e2743ba047b5e68f71a6252750"
Apr 16 14:22:37.487366 ip-10-0-130-98 kubenswrapper[2575]: E0416 14:22:37.487351 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a68bafc06b9f71d8bac6695e161d193e1a4c53e2743ba047b5e68f71a6252750\": container with ID starting with a68bafc06b9f71d8bac6695e161d193e1a4c53e2743ba047b5e68f71a6252750 not found: ID does not exist" containerID="a68bafc06b9f71d8bac6695e161d193e1a4c53e2743ba047b5e68f71a6252750"
Apr 16 14:22:37.487405 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:37.487370 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a68bafc06b9f71d8bac6695e161d193e1a4c53e2743ba047b5e68f71a6252750"} err="failed to get container status \"a68bafc06b9f71d8bac6695e161d193e1a4c53e2743ba047b5e68f71a6252750\": rpc error: code = NotFound desc = could not find container \"a68bafc06b9f71d8bac6695e161d193e1a4c53e2743ba047b5e68f71a6252750\": container with ID starting with a68bafc06b9f71d8bac6695e161d193e1a4c53e2743ba047b5e68f71a6252750 not found: ID does not exist"
Apr 16 14:22:38.692383 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:38.692349 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-64pd9_e6ba7a4f-4060-490f-9b01-36f9ae7b1d10/kube-state-metrics/0.log"
Apr 16 14:22:38.714374 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:38.714342 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-64pd9_e6ba7a4f-4060-490f-9b01-36f9ae7b1d10/kube-rbac-proxy-main/0.log"
Apr 16 14:22:38.733366 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:38.733328 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-64pd9_e6ba7a4f-4060-490f-9b01-36f9ae7b1d10/kube-rbac-proxy-self/0.log"
Apr 16 14:22:38.815169 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:38.815144 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2869s_fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445/node-exporter/0.log"
Apr 16 14:22:38.834883 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:38.834858 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2869s_fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445/kube-rbac-proxy/0.log"
Apr 16 14:22:38.854549 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:38.854534 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2869s_fdc2f3a2-aa77-4d1b-9e82-b2a46a5c6445/init-textfile/0.log"
Apr 16 14:22:39.027750 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:39.027661 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-78656_d6354208-1c66-449f-90c8-56fb4357138b/kube-rbac-proxy-main/0.log"
Apr 16 14:22:39.049236 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:39.049200 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-78656_d6354208-1c66-449f-90c8-56fb4357138b/kube-rbac-proxy-self/0.log"
Apr 16 14:22:39.069348 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:39.069327 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-78656_d6354208-1c66-449f-90c8-56fb4357138b/openshift-state-metrics/0.log"
Apr 16 14:22:39.206194 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:39.206164 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dee60b84-c2d7-4649-bccf-260f83c63109" path="/var/lib/kubelet/pods/dee60b84-c2d7-4649-bccf-260f83c63109/volumes"
Apr 16 14:22:41.447276 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:41.447246 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55848895b-d8sfc_8e2a7dd1-3065-42dd-af46-f0db44444e20/console/0.log"
Apr 16 14:22:41.979803 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:41.979767 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-p6v84/perf-node-gather-daemonset-7gjk7"]
Apr 16 14:22:41.980077 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:41.980060 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dee60b84-c2d7-4649-bccf-260f83c63109" containerName="gather"
Apr 16 14:22:41.980163 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:41.980077 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="dee60b84-c2d7-4649-bccf-260f83c63109" containerName="gather"
Apr 16 14:22:41.980163 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:41.980088 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dee60b84-c2d7-4649-bccf-260f83c63109" containerName="copy"
Apr 16 14:22:41.980163 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:41.980108 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="dee60b84-c2d7-4649-bccf-260f83c63109" containerName="copy"
Apr 16 14:22:41.980256 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:41.980165 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="dee60b84-c2d7-4649-bccf-260f83c63109" containerName="gather"
Apr 16 14:22:41.980256 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:41.980172 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="dee60b84-c2d7-4649-bccf-260f83c63109" containerName="copy"
Apr 16 14:22:41.985469 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:41.985447 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-7gjk7"
Apr 16 14:22:41.987681 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:41.987659 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-p6v84\"/\"kube-root-ca.crt\""
Apr 16 14:22:41.988731 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:41.988710 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-p6v84\"/\"default-dockercfg-hg4wh\""
Apr 16 14:22:41.988915 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:41.988891 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-p6v84\"/\"openshift-service-ca.crt\""
Apr 16 14:22:41.990482 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:41.990460 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p6v84/perf-node-gather-daemonset-7gjk7"]
Apr 16 14:22:42.116297 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:42.116255 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a9d54e83-a080-453d-9b19-7f9facc71536-podres\") pod \"perf-node-gather-daemonset-7gjk7\" (UID: \"a9d54e83-a080-453d-9b19-7f9facc71536\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-7gjk7"
Apr 16 14:22:42.116297 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:42.116315 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a9d54e83-a080-453d-9b19-7f9facc71536-proc\") pod \"perf-node-gather-daemonset-7gjk7\" (UID: \"a9d54e83-a080-453d-9b19-7f9facc71536\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-7gjk7"
Apr 16 14:22:42.116514 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:42.116339 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a9d54e83-a080-453d-9b19-7f9facc71536-sys\") pod \"perf-node-gather-daemonset-7gjk7\" (UID: \"a9d54e83-a080-453d-9b19-7f9facc71536\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-7gjk7"
Apr 16 14:22:42.116514 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:42.116397 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a9d54e83-a080-453d-9b19-7f9facc71536-lib-modules\") pod \"perf-node-gather-daemonset-7gjk7\" (UID: \"a9d54e83-a080-453d-9b19-7f9facc71536\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-7gjk7"
Apr 16 14:22:42.116514 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:42.116426 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgzhb\" (UniqueName: \"kubernetes.io/projected/a9d54e83-a080-453d-9b19-7f9facc71536-kube-api-access-jgzhb\") pod \"perf-node-gather-daemonset-7gjk7\" (UID: \"a9d54e83-a080-453d-9b19-7f9facc71536\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-7gjk7"
Apr 16 14:22:42.216839 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:42.216794 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a9d54e83-a080-453d-9b19-7f9facc71536-podres\") pod \"perf-node-gather-daemonset-7gjk7\" (UID: \"a9d54e83-a080-453d-9b19-7f9facc71536\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-7gjk7"
Apr 16 14:22:42.216839 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:42.216852 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a9d54e83-a080-453d-9b19-7f9facc71536-proc\") pod \"perf-node-gather-daemonset-7gjk7\" (UID: \"a9d54e83-a080-453d-9b19-7f9facc71536\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-7gjk7"
Apr 16 14:22:42.217075 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:42.216870 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a9d54e83-a080-453d-9b19-7f9facc71536-sys\") pod \"perf-node-gather-daemonset-7gjk7\" (UID: \"a9d54e83-a080-453d-9b19-7f9facc71536\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-7gjk7"
Apr 16 14:22:42.217075 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:42.216895 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a9d54e83-a080-453d-9b19-7f9facc71536-lib-modules\") pod \"perf-node-gather-daemonset-7gjk7\" (UID: \"a9d54e83-a080-453d-9b19-7f9facc71536\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-7gjk7"
Apr 16 14:22:42.217075 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:42.216913 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jgzhb\" (UniqueName: \"kubernetes.io/projected/a9d54e83-a080-453d-9b19-7f9facc71536-kube-api-access-jgzhb\") pod \"perf-node-gather-daemonset-7gjk7\" (UID: \"a9d54e83-a080-453d-9b19-7f9facc71536\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-7gjk7"
Apr 16 14:22:42.217075 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:42.216962 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a9d54e83-a080-453d-9b19-7f9facc71536-podres\") pod \"perf-node-gather-daemonset-7gjk7\" (UID: \"a9d54e83-a080-453d-9b19-7f9facc71536\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-7gjk7"
Apr 16 14:22:42.217075 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:42.216987 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a9d54e83-a080-453d-9b19-7f9facc71536-sys\") pod \"perf-node-gather-daemonset-7gjk7\" (UID: \"a9d54e83-a080-453d-9b19-7f9facc71536\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-7gjk7"
Apr 16 14:22:42.217075 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:42.217002 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a9d54e83-a080-453d-9b19-7f9facc71536-proc\") pod \"perf-node-gather-daemonset-7gjk7\" (UID: \"a9d54e83-a080-453d-9b19-7f9facc71536\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-7gjk7"
Apr 16 14:22:42.217075 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:42.217030 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a9d54e83-a080-453d-9b19-7f9facc71536-lib-modules\") pod \"perf-node-gather-daemonset-7gjk7\" (UID: \"a9d54e83-a080-453d-9b19-7f9facc71536\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-7gjk7"
Apr 16 14:22:42.224790 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:42.224772 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgzhb\" (UniqueName: \"kubernetes.io/projected/a9d54e83-a080-453d-9b19-7f9facc71536-kube-api-access-jgzhb\") pod \"perf-node-gather-daemonset-7gjk7\" (UID: \"a9d54e83-a080-453d-9b19-7f9facc71536\") " pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-7gjk7"
Apr 16 14:22:42.296962 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:42.296869 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-7gjk7"
Apr 16 14:22:42.416619 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:42.416489 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p6v84/perf-node-gather-daemonset-7gjk7"]
Apr 16 14:22:42.419435 ip-10-0-130-98 kubenswrapper[2575]: W0416 14:22:42.419402 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda9d54e83_a080_453d_9b19_7f9facc71536.slice/crio-baf91784abb0dbbf3de59011616d338345c0928d91bd36db4e0e6f00e9d82f5b WatchSource:0}: Error finding container baf91784abb0dbbf3de59011616d338345c0928d91bd36db4e0e6f00e9d82f5b: Status 404 returned error can't find the container with id baf91784abb0dbbf3de59011616d338345c0928d91bd36db4e0e6f00e9d82f5b
Apr 16 14:22:42.491192 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:42.491166 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-7gjk7" event={"ID":"a9d54e83-a080-453d-9b19-7f9facc71536","Type":"ContainerStarted","Data":"13fcf92b19fac082bf2d4d36a998efe1266ad09a80a6d176e6d2f4b777a1dd0a"}
Apr 16 14:22:42.491491 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:42.491198 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-7gjk7" event={"ID":"a9d54e83-a080-453d-9b19-7f9facc71536","Type":"ContainerStarted","Data":"baf91784abb0dbbf3de59011616d338345c0928d91bd36db4e0e6f00e9d82f5b"}
Apr 16 14:22:42.491491 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:42.491297 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-7gjk7"
Apr 16 14:22:42.505052 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:42.505002 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-7gjk7" podStartSLOduration=1.5049860640000001 podStartE2EDuration="1.504986064s" podCreationTimestamp="2026-04-16 14:22:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:22:42.504347085 +0000 UTC m=+1429.840446907" watchObservedRunningTime="2026-04-16 14:22:42.504986064 +0000 UTC m=+1429.841085888"
Apr 16 14:22:42.597496 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:42.597428 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zhlzs_fbe5971a-6df2-42bd-b7eb-09f552154f0d/dns/0.log"
Apr 16 14:22:42.616202 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:42.616157 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zhlzs_fbe5971a-6df2-42bd-b7eb-09f552154f0d/kube-rbac-proxy/0.log"
Apr 16 14:22:42.637753 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:42.637725 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9gkrd_e9cd8c72-8367-4db4-9cb0-b52863ecee83/dns-node-resolver/0.log"
Apr 16 14:22:43.105459 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:43.105434 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-54qjm_4c1f6aa1-4340-4463-8bc6-2ce795e54be0/node-ca/0.log"
Apr 16 14:22:43.894548 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:43.894519 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-b645448c4-hb2lw_6892d97c-f802-4e48-b3ea-ce73b8dbbafa/router/0.log"
Apr 16 14:22:44.261628 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:44.261553 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-q9qxf_2fbaae4a-0d84-4121-bda1-d36fab54ac7d/serve-healthcheck-canary/0.log"
Apr 16 14:22:44.602901 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:44.602861 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-t8smf_6daf2345-5eea-4faa-9720-9390a947e6ce/insights-operator/0.log"
Apr 16 14:22:44.603380 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:44.603347 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-t8smf_6daf2345-5eea-4faa-9720-9390a947e6ce/insights-operator/1.log"
Apr 16 14:22:44.761291 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:44.761266 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pwsbt_de5bb2a7-cca1-47fc-87e5-761a06491018/kube-rbac-proxy/0.log"
Apr 16 14:22:44.782296 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:44.782260 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pwsbt_de5bb2a7-cca1-47fc-87e5-761a06491018/exporter/0.log"
Apr 16 14:22:44.801926 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:44.801896 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pwsbt_de5bb2a7-cca1-47fc-87e5-761a06491018/extractor/0.log"
Apr 16 14:22:46.645212 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:46.645179 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-75d667c7c4-knc9v_0aac9d5c-cbae-44f7-85d0-f73a2334d3ce/manager/0.log"
Apr 16 14:22:46.664552 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:46.664529 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-pb5xl_4de4d1dc-3ce1-4bd7-aefc-4216fdb5313d/manager/0.log"
Apr 16 14:22:48.503321 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:48.503294 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-p6v84/perf-node-gather-daemonset-7gjk7"
Apr 16 14:22:50.803853 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:50.803820 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-497ks_3994a6a1-6bea-406a-a079-922dfadd77da/kube-storage-version-migrator-operator/1.log"
Apr 16 14:22:50.805546 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:50.805521 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-497ks_3994a6a1-6bea-406a-a079-922dfadd77da/kube-storage-version-migrator-operator/0.log"
Apr 16 14:22:51.982706 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:51.982631 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g8chm_f6c2db70-a008-4b8a-b25c-881c2d8e9809/kube-multus-additional-cni-plugins/0.log"
Apr 16 14:22:52.001989 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:52.001959 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g8chm_f6c2db70-a008-4b8a-b25c-881c2d8e9809/egress-router-binary-copy/0.log"
Apr 16 14:22:52.022016 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:52.021992 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g8chm_f6c2db70-a008-4b8a-b25c-881c2d8e9809/cni-plugins/0.log"
Apr 16 14:22:52.042343 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:52.042279 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g8chm_f6c2db70-a008-4b8a-b25c-881c2d8e9809/bond-cni-plugin/0.log"
Apr 16 14:22:52.062173 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:52.062147 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g8chm_f6c2db70-a008-4b8a-b25c-881c2d8e9809/routeoverride-cni/0.log"
Apr 16 14:22:52.082588 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:52.082566 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g8chm_f6c2db70-a008-4b8a-b25c-881c2d8e9809/whereabouts-cni-bincopy/0.log"
Apr 16 14:22:52.105419 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:52.105389 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g8chm_f6c2db70-a008-4b8a-b25c-881c2d8e9809/whereabouts-cni/0.log"
Apr 16 14:22:52.316989 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:52.316911 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rts4v_553f6f6d-061c-4a9d-9ef4-1bbfedfede51/kube-multus/0.log"
Apr 16 14:22:52.454970 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:52.454940 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ckxrz_0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e/network-metrics-daemon/0.log"
Apr 16 14:22:52.476786 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:52.476762 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ckxrz_0b32b297-b5cc-4ce8-8fba-75b7d4a02b3e/kube-rbac-proxy/0.log"
Apr 16 14:22:53.265755 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:53.265730 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cdqzf_a2aa2b87-716c-4b0a-abde-a69d0c373e83/ovn-controller/0.log"
Apr 16 14:22:53.281874 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:53.281843 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cdqzf_a2aa2b87-716c-4b0a-abde-a69d0c373e83/ovn-acl-logging/0.log"
Apr 16 14:22:53.294727 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:53.294707 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cdqzf_a2aa2b87-716c-4b0a-abde-a69d0c373e83/ovn-acl-logging/1.log"
Apr 16 14:22:53.316458 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:53.316433 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cdqzf_a2aa2b87-716c-4b0a-abde-a69d0c373e83/kube-rbac-proxy-node/0.log"
Apr 16 14:22:53.341438 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:53.341417 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cdqzf_a2aa2b87-716c-4b0a-abde-a69d0c373e83/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 14:22:53.358834 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:53.358813 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cdqzf_a2aa2b87-716c-4b0a-abde-a69d0c373e83/northd/0.log"
Apr 16 14:22:53.377071 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:53.377049 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cdqzf_a2aa2b87-716c-4b0a-abde-a69d0c373e83/nbdb/0.log"
Apr 16 14:22:53.397669 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:53.397649 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cdqzf_a2aa2b87-716c-4b0a-abde-a69d0c373e83/sbdb/0.log"
Apr 16 14:22:53.557223 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:53.557124 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cdqzf_a2aa2b87-716c-4b0a-abde-a69d0c373e83/ovnkube-controller/0.log"
Apr 16 14:22:55.140515 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:55.140472 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-xtl4r_008eaa56-e238-402c-a6f7-ded4fd1a1572/network-check-target-container/0.log"
Apr 16 14:22:56.069782 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:56.069757 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-r8xw2_60a323f7-d2f9-401b-bb9d-cde57adebc7d/iptables-alerter/0.log"
Apr 16 14:22:56.694078 ip-10-0-130-98 kubenswrapper[2575]: I0416 14:22:56.694048 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-8wwv4_a9cea359-15c5-46e5-af4f-1f2198fc5b08/tuned/0.log"