Apr 17 17:26:22.155835 ip-10-0-139-96 systemd[1]: Starting Kubernetes Kubelet... Apr 17 17:26:22.602645 ip-10-0-139-96 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 17 17:26:22.602645 ip-10-0-139-96 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Apr 17 17:26:22.602645 ip-10-0-139-96 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 17 17:26:22.602645 ip-10-0-139-96 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Apr 17 17:26:22.602645 ip-10-0-139-96 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 17 17:26:22.604498 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.604402 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 17 17:26:22.608530 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608513 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 17:26:22.608530 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608530 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 17:26:22.608595 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608534 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 17:26:22.608595 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608537 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 17:26:22.608595 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608540 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:26:22.608595 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608544 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 17:26:22.608595 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608546 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 17:26:22.608595 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608551 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 17:26:22.608595 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608553 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 17:26:22.608595 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608556 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 17:26:22.608595 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608558 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 17:26:22.608595 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608561 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 17:26:22.608595 
ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608565 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 17:26:22.608595 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608568 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 17:26:22.608595 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608570 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 17:26:22.608595 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608573 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 17:26:22.608595 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608575 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:26:22.608595 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608578 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 17:26:22.608595 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608580 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 17:26:22.608595 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608583 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 17:26:22.608595 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608586 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 17:26:22.609105 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608588 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:26:22.609105 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608591 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 17:26:22.609105 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608594 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 17:26:22.609105 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608596 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:26:22.609105 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608599 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:26:22.609105 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608602 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 17:26:22.609105 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608605 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 17:26:22.609105 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608608 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 17:26:22.609105 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608611 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:26:22.609105 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608613 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 17:26:22.609105 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608616 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 17:26:22.609105 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608618 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 17:26:22.609105 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608621 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 17:26:22.609105 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608623 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 17 
17:26:22.609105 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608626 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 17:26:22.609105 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608628 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:26:22.609105 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608631 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:26:22.609105 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608634 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 17:26:22.609105 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608636 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 17:26:22.609105 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608638 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 17:26:22.609105 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608641 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 17:26:22.609655 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608643 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 17:26:22.609655 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608646 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 17:26:22.609655 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608648 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 17:26:22.609655 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608651 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 17:26:22.609655 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608653 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 17:26:22.609655 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608657 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 17:26:22.609655 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608659 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 17:26:22.609655 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608661 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 17:26:22.609655 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608664 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:26:22.609655 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608666 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 17:26:22.609655 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608669 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 17:26:22.609655 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608671 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 17:26:22.609655 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608674 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 17:26:22.609655 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608677 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 17:26:22.609655 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608680 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 17:26:22.609655 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608682 2577 
feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 17:26:22.609655 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608685 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:26:22.609655 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608687 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 17:26:22.609655 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608689 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 17:26:22.610099 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608692 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 17:26:22.610099 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608694 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 17:26:22.610099 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608697 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:26:22.610099 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608700 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 17:26:22.610099 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608702 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 17:26:22.610099 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608704 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 17:26:22.610099 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608707 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 17:26:22.610099 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608710 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 17:26:22.610099 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608713 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 17:26:22.610099 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608715 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:26:22.610099 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608719 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 17:26:22.610099 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608723 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 17:26:22.610099 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608726 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 17:26:22.610099 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608729 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 17:26:22.610099 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608731 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 17:26:22.610099 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608734 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 17:26:22.610099 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608739 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 17:26:22.610099 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608744 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 17:26:22.610099 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608747 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 17:26:22.610561 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608750 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 17:26:22.610561 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608753 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 17:26:22.610561 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608755 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 17:26:22.610561 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608758 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 17:26:22.610561 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608760 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 17:26:22.610561 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.608763 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 17:26:22.610561 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609141 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 17:26:22.610561 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609145 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 17:26:22.610561 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609148 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 17:26:22.610561 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609150 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 17:26:22.610561 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609153 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 17:26:22.610561 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609156 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 17:26:22.610561 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609158 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 17:26:22.610561 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609161 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 17:26:22.610561 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609163 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 17:26:22.610561 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609166 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 17:26:22.610561 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609168 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 17:26:22.610561 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609171 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 17:26:22.610561 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609174 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 17:26:22.610561 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609176 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 17:26:22.611064 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609178 2577 feature_gate.go:328] unrecognized feature 
gate: Example Apr 17 17:26:22.611064 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609181 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:26:22.611064 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609184 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 17:26:22.611064 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609186 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 17:26:22.611064 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609189 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 17:26:22.611064 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609191 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 17:26:22.611064 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609194 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:26:22.611064 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609196 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 17:26:22.611064 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609198 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:26:22.611064 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609201 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 17:26:22.611064 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609204 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 17:26:22.611064 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609207 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 17:26:22.611064 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609209 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 17:26:22.611064 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609212 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 17:26:22.611064 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609214 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 17:26:22.611064 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609217 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 17:26:22.611064 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609219 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 17:26:22.611064 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609221 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 17:26:22.611064 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609224 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 17:26:22.611064 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609227 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 17:26:22.611763 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609229 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 17:26:22.611763 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609232 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:26:22.611763 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609235 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 17:26:22.611763 
ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609237 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 17:26:22.611763 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609240 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 17:26:22.611763 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609242 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 17:26:22.611763 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609245 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 17:26:22.611763 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609247 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 17:26:22.611763 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609250 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 17:26:22.611763 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609252 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 17:26:22.611763 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609254 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 17:26:22.611763 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609257 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:26:22.611763 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609260 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 17:26:22.611763 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609262 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 17:26:22.611763 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609265 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 17:26:22.611763 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609267 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 17:26:22.611763 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609269 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 17:26:22.611763 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609272 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 17:26:22.611763 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609274 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 17:26:22.612482 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609277 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 17:26:22.612482 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609279 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:26:22.612482 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609281 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:26:22.612482 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609284 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 17:26:22.612482 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609290 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 17:26:22.612482 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609292 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 17:26:22.612482 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609295 2577 feature_gate.go:328] unrecognized feature 
gate: AzureWorkloadIdentity Apr 17 17:26:22.612482 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609297 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 17:26:22.612482 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609299 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 17:26:22.612482 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609302 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 17:26:22.612482 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609304 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 17:26:22.612482 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609307 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 17:26:22.612482 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609310 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 17:26:22.612482 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609312 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 17:26:22.612482 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609315 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 17:26:22.612482 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609317 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 17:26:22.612482 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609319 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 17:26:22.612482 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609322 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 17:26:22.612482 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609324 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:26:22.612482 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609326 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 17:26:22.613060 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609329 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 17:26:22.613060 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609331 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 17:26:22.613060 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609334 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 17:26:22.613060 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609338 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 17:26:22.613060 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609341 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 17:26:22.613060 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609344 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:26:22.613060 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609347 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 17:26:22.613060 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609349 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:26:22.613060 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609352 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 17:26:22.613060 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609354 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:26:22.613060 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609356 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 17:26:22.613060 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609359 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 17:26:22.613060 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.609363 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 17:26:22.613060 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610630 2577 flags.go:64] FLAG: --address="0.0.0.0" Apr 17 17:26:22.613060 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610639 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 17 17:26:22.613060 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610645 2577 flags.go:64] FLAG: --anonymous-auth="true" Apr 17 17:26:22.613060 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610650 2577 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 17 17:26:22.613060 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610654 2577 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 17 17:26:22.613060 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610658 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 17 17:26:22.613060 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610663 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 17 17:26:22.613648 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610667 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 17 17:26:22.613648 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610671 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 17 17:26:22.613648 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610674 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 17 17:26:22.613648 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610679 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 17 17:26:22.613648 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610683 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 17 17:26:22.613648 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610686 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 17 17:26:22.613648 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610689 2577 flags.go:64] FLAG: --cgroup-root="" Apr 17 17:26:22.613648 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610692 2577 
flags.go:64] FLAG: --cgroups-per-qos="true" Apr 17 17:26:22.613648 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610695 2577 flags.go:64] FLAG: --client-ca-file="" Apr 17 17:26:22.613648 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610698 2577 flags.go:64] FLAG: --cloud-config="" Apr 17 17:26:22.613648 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610701 2577 flags.go:64] FLAG: --cloud-provider="external" Apr 17 17:26:22.613648 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610704 2577 flags.go:64] FLAG: --cluster-dns="[]" Apr 17 17:26:22.613648 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610708 2577 flags.go:64] FLAG: --cluster-domain="" Apr 17 17:26:22.613648 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610710 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 17 17:26:22.613648 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610713 2577 flags.go:64] FLAG: --config-dir="" Apr 17 17:26:22.613648 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610716 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 17 17:26:22.613648 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610720 2577 flags.go:64] FLAG: --container-log-max-files="5" Apr 17 17:26:22.613648 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610725 2577 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 17 17:26:22.613648 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610728 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 17 17:26:22.613648 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610731 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 17 17:26:22.613648 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610735 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 17 17:26:22.613648 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610738 2577 flags.go:64] FLAG: --contention-profiling="false" Apr 17 17:26:22.613648 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610741 2577 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 17 17:26:22.613648 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610743 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 17 17:26:22.614217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610747 2577 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 17 17:26:22.614217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610750 2577 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 17 17:26:22.614217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610754 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 17 17:26:22.614217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610757 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 17 17:26:22.614217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610760 2577 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 17 17:26:22.614217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610763 2577 flags.go:64] FLAG: --enable-load-reader="false" Apr 17 17:26:22.614217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610766 2577 flags.go:64] FLAG: --enable-server="true" Apr 17 17:26:22.614217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610769 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 17 17:26:22.614217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610774 2577 flags.go:64] FLAG: --event-burst="100" Apr 17 17:26:22.614217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610777 2577 flags.go:64] FLAG: 
--event-qps="50" Apr 17 17:26:22.614217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610780 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 17 17:26:22.614217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610783 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 17 17:26:22.614217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610786 2577 flags.go:64] FLAG: --eviction-hard="" Apr 17 17:26:22.614217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610790 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 17 17:26:22.614217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610793 2577 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 17 17:26:22.614217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610796 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 17 17:26:22.614217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610800 2577 flags.go:64] FLAG: --eviction-soft="" Apr 17 17:26:22.614217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610811 2577 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 17 17:26:22.614217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610814 2577 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 17 17:26:22.614217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610817 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 17:26:22.614217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610820 2577 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 17:26:22.614217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610824 2577 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 17:26:22.614217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610826 2577 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 17:26:22.614217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610829 2577 flags.go:64] FLAG: --feature-gates="" Apr 17 17:26:22.614217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610834 2577 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 17:26:22.614874 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610837 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 17:26:22.614874 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610858 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 17:26:22.614874 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610862 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 17:26:22.614874 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610865 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 17 17:26:22.614874 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610868 2577 flags.go:64] FLAG: --help="false" Apr 17 17:26:22.614874 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610871 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-139-96.ec2.internal" Apr 17 17:26:22.614874 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610874 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 17:26:22.614874 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610877 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 17:26:22.614874 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610880 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 17:26:22.614874 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610884 2577 flags.go:64] FLAG: 
--image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 17:26:22.614874 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610888 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 17:26:22.614874 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610892 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 17:26:22.614874 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610895 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 17:26:22.614874 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610897 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 17:26:22.614874 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610901 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 17:26:22.614874 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610904 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 17:26:22.614874 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610908 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 17:26:22.614874 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610911 2577 flags.go:64] FLAG: --kube-reserved="" Apr 17 17:26:22.614874 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610914 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 17:26:22.614874 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610917 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 17:26:22.614874 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610920 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 17:26:22.614874 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610923 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 17:26:22.614874 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610927 2577 flags.go:64] FLAG: --lock-file="" Apr 17 17:26:22.614874 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610930 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 17:26:22.615457 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610933 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 17:26:22.615457 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610936 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 17:26:22.615457 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610941 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 17:26:22.615457 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610944 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 17:26:22.615457 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610947 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 17:26:22.615457 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610950 2577 flags.go:64] FLAG: --logging-format="text" Apr 17 17:26:22.615457 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610953 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 17:26:22.615457 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610956 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 17:26:22.615457 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610959 2577 flags.go:64] FLAG: --manifest-url="" Apr 17 17:26:22.615457 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610962 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 17 17:26:22.615457 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610966 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 17:26:22.615457 ip-10-0-139-96 
kubenswrapper[2577]: I0417 17:26:22.610969 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 17:26:22.615457 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610974 2577 flags.go:64] FLAG: --max-pods="110" Apr 17 17:26:22.615457 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610977 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 17:26:22.615457 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610980 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 17:26:22.615457 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610983 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 17:26:22.615457 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610986 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 17:26:22.615457 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610989 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 17:26:22.615457 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610992 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 17:26:22.615457 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.610996 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 17:26:22.615457 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611004 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 17:26:22.615457 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611007 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 17:26:22.615457 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611010 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 17:26:22.615457 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611014 2577 flags.go:64] FLAG: --pod-cidr="" Apr 17 17:26:22.616037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611017 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 17:26:22.616037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611023 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 17:26:22.616037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611026 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 17:26:22.616037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611030 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 17 17:26:22.616037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611033 2577 flags.go:64] FLAG: --port="10250" Apr 17 17:26:22.616037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611036 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 17:26:22.616037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611039 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0390218407ee10ad0" Apr 17 17:26:22.616037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611042 2577 flags.go:64] FLAG: --qos-reserved="" Apr 17 17:26:22.616037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611045 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 17 17:26:22.616037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611048 2577 flags.go:64] FLAG: --register-node="true" Apr 17 17:26:22.616037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611051 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 17 17:26:22.616037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611054 2577 flags.go:64] FLAG: --register-with-taints="" Apr 17 17:26:22.616037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611058 2577 flags.go:64] 
FLAG: --registry-burst="10" Apr 17 17:26:22.616037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611061 2577 flags.go:64] FLAG: --registry-qps="5" Apr 17 17:26:22.616037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611063 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 17 17:26:22.616037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611066 2577 flags.go:64] FLAG: --reserved-memory="" Apr 17 17:26:22.616037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611074 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 17:26:22.616037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611077 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 17:26:22.616037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611080 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 17:26:22.616037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611083 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 17:26:22.616037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611086 2577 flags.go:64] FLAG: --runonce="false" Apr 17 17:26:22.616037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611089 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 17:26:22.616037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611092 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 17:26:22.616037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611095 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 17 17:26:22.616037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611099 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 17:26:22.616681 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611102 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 17:26:22.616681 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611105 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 17:26:22.616681 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611109 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 17:26:22.616681 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611112 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 17:26:22.616681 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611115 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 17:26:22.616681 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611119 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 17:26:22.616681 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611122 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 17:26:22.616681 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611125 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 17:26:22.616681 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611128 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 17:26:22.616681 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611131 2577 flags.go:64] FLAG: --system-cgroups="" Apr 17 17:26:22.616681 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611134 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 17:26:22.616681 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611140 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 17:26:22.616681 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611143 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 17 17:26:22.616681 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611146 2577 flags.go:64] FLAG: 
--tls-cipher-suites="[]" Apr 17 17:26:22.616681 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611149 2577 flags.go:64] FLAG: --tls-min-version="" Apr 17 17:26:22.616681 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611152 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 17:26:22.616681 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611155 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 17:26:22.616681 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611158 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 17:26:22.616681 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611161 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 17:26:22.616681 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611164 2577 flags.go:64] FLAG: --v="2" Apr 17 17:26:22.616681 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611168 2577 flags.go:64] FLAG: --version="false" Apr 17 17:26:22.616681 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611173 2577 flags.go:64] FLAG: --vmodule="" Apr 17 17:26:22.616681 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611177 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 17:26:22.616681 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.611180 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 17:26:22.616681 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613071 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 17:26:22.617309 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613088 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 17:26:22.617309 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613092 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 17:26:22.617309 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613097 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 17:26:22.617309 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613103 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 17:26:22.617309 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613106 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:26:22.617309 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613109 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 17:26:22.617309 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613112 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 17:26:22.617309 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613115 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 17:26:22.617309 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613118 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 17:26:22.617309 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613121 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 17:26:22.617309 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613124 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 17:26:22.617309 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613127 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 17:26:22.617309 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613129 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 17:26:22.617309 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613132 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 17:26:22.617309 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613135 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 17:26:22.617309 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613137 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:26:22.617309 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613141 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:26:22.617309 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613143 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:26:22.617309 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613147 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 17:26:22.617309 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613149 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 17 17:26:22.617818 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613152 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 17:26:22.617818 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613154 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 17:26:22.617818 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613157 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 17:26:22.617818 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613159 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 17:26:22.617818 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613162 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 17:26:22.617818 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613166 2577 
feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:26:22.617818 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613168 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 17:26:22.617818 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613171 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 17:26:22.617818 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613174 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 17:26:22.617818 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613177 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 17:26:22.617818 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613179 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 17:26:22.617818 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613183 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 17:26:22.617818 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613189 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 17:26:22.617818 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613192 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 17:26:22.617818 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613196 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 17:26:22.617818 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613199 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 17:26:22.617818 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613202 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 17:26:22.617818 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613204 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 17:26:22.617818 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613207 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 17:26:22.618274 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613210 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:26:22.618274 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613212 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 17:26:22.618274 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613215 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 17:26:22.618274 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613217 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 17:26:22.618274 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613220 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:26:22.618274 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613223 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 17:26:22.618274 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613226 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 17:26:22.618274 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613228 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 17:26:22.618274 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613231 2577 feature_gate.go:328] unrecognized feature gate: 
GatewayAPIController Apr 17 17:26:22.618274 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613233 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:26:22.618274 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613236 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 17:26:22.618274 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613238 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 17:26:22.618274 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613242 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:26:22.618274 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613245 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 17:26:22.618274 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613247 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 17:26:22.618274 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613250 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 17:26:22.618274 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613253 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 17:26:22.618274 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613255 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 17:26:22.618274 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613258 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:26:22.618274 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613261 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 17:26:22.618762 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613263 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 17:26:22.618762 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613266 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 17:26:22.618762 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613268 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 17:26:22.618762 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613271 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 17:26:22.618762 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613273 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 17:26:22.618762 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613277 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:26:22.618762 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613280 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 17:26:22.618762 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613282 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 17:26:22.618762 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613286 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:26:22.618762 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613288 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 17:26:22.618762 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613291 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 17:26:22.618762 ip-10-0-139-96 kubenswrapper[2577]: W0417 
17:26:22.613294 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 17:26:22.618762 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613296 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 17:26:22.618762 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613299 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 17:26:22.618762 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613301 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 17:26:22.618762 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613303 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 17:26:22.618762 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613306 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 17:26:22.618762 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613309 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 17:26:22.618762 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613312 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 17:26:22.618762 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613315 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 17:26:22.619299 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613318 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 17:26:22.619299 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613320 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 17:26:22.619299 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613323 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 17:26:22.619299 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613325 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 17:26:22.619299 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613328 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 17:26:22.619299 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.613330 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 17:26:22.619299 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.614002 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 17:26:22.620534 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.620513 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 17:26:22.620576 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.620535 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 17:26:22.620607 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620586 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:26:22.620607 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620591 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:26:22.620607 ip-10-0-139-96 
kubenswrapper[2577]: W0417 17:26:22.620595 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 17:26:22.620607 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620598 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 17:26:22.620607 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620602 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 17:26:22.620607 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620607 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 17:26:22.620607 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620611 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 17:26:22.620821 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620614 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 17:26:22.620821 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620617 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 17:26:22.620821 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620620 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 17 17:26:22.620821 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620624 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 17:26:22.620821 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620626 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 17:26:22.620821 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620629 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 17:26:22.620821 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620631 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 17:26:22.620821 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620634 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:26:22.620821 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620637 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 17:26:22.620821 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620639 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 17:26:22.620821 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620642 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 17:26:22.620821 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620644 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 17:26:22.620821 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620647 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 17:26:22.620821 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620649 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 17:26:22.620821 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620653 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 17:26:22.620821 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620656 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 17:26:22.620821 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620659 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 17:26:22.620821 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620661 2577 
feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 17:26:22.620821 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620664 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 17:26:22.620821 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620667 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:26:22.621287 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620669 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 17:26:22.621287 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620671 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 17:26:22.621287 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620674 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 17:26:22.621287 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620676 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 17:26:22.621287 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620679 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 17:26:22.621287 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620682 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 17:26:22.621287 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620684 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:26:22.621287 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620687 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 17:26:22.621287 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620689 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 17:26:22.621287 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620692 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 17:26:22.621287 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620694 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 17:26:22.621287 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620697 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 17:26:22.621287 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620699 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 17:26:22.621287 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620702 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 17:26:22.621287 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620706 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 17:26:22.621287 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620708 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 17:26:22.621287 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620711 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 17:26:22.621287 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620714 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 17:26:22.621287 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620716 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 17:26:22.621287 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620719 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 
17:26:22.621783 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620721 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 17:26:22.621783 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620725 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 17:26:22.621783 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620729 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 17:26:22.621783 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620733 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 17:26:22.621783 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620735 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 17:26:22.621783 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620738 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:26:22.621783 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620741 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 17:26:22.621783 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620744 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 17:26:22.621783 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620747 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 17:26:22.621783 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620749 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 17:26:22.621783 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620752 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 17:26:22.621783 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620754 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 17:26:22.621783 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620757 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:26:22.621783 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620759 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 17:26:22.621783 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620762 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 17:26:22.621783 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620764 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:26:22.621783 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620767 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:26:22.621783 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620770 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 17:26:22.621783 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620773 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 17:26:22.621783 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620775 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 17:26:22.622270 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620778 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 17:26:22.622270 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620780 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 17:26:22.622270 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620783 2577 
feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 17:26:22.622270 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620786 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:26:22.622270 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620789 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 17:26:22.622270 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620791 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 17:26:22.622270 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620794 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 17:26:22.622270 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620796 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 17:26:22.622270 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620799 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:26:22.622270 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620802 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 17:26:22.622270 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620804 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 17:26:22.622270 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620807 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 17:26:22.622270 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620809 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 17:26:22.622270 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620812 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 17:26:22.622270 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620815 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 17:26:22.622270 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620817 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 17:26:22.622270 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620820 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 17:26:22.622270 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620822 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:26:22.622270 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620825 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 17:26:22.622736 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.620830 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 17:26:22.622736 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620939 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 17:26:22.622736 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620943 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 17:26:22.622736 ip-10-0-139-96 kubenswrapper[2577]: W0417 
17:26:22.620947 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 17:26:22.622736 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620950 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 17:26:22.622736 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620953 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 17:26:22.622736 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620956 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 17:26:22.622736 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620959 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 17:26:22.622736 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620961 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 17:26:22.622736 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620964 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 17:26:22.622736 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620967 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 17:26:22.622736 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620970 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 17:26:22.622736 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620973 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 17:26:22.622736 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620975 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 17:26:22.622736 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620978 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 17:26:22.622736 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620981 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 17:26:22.623175 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620983 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 17:26:22.623175 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620986 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 17:26:22.623175 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620989 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 17:26:22.623175 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620992 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:26:22.623175 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620994 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 17:26:22.623175 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.620997 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 17:26:22.623175 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621000 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 17:26:22.623175 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621003 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 17:26:22.623175 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621005 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 17:26:22.623175 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621008 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities 
Apr 17 17:26:22.623175 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621010 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 17:26:22.623175 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621013 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:26:22.623175 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621015 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 17:26:22.623175 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621018 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 17:26:22.623175 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621020 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 17:26:22.623175 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621022 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 17:26:22.623175 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621025 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 17:26:22.623175 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621027 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 17:26:22.623175 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621031 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:26:22.623175 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621035 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 17:26:22.623680 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621038 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:26:22.623680 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621042 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 17:26:22.623680 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621046 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 17:26:22.623680 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621049 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 17:26:22.623680 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621053 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:26:22.623680 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621055 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 17:26:22.623680 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621058 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:26:22.623680 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621060 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:26:22.623680 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621063 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 17:26:22.623680 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621065 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 17:26:22.623680 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621068 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:26:22.623680 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621070 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 17:26:22.623680 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621073 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 17:26:22.623680 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621076 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 17 17:26:22.623680 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621079 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 17:26:22.623680 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621082 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 17:26:22.623680 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621084 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:26:22.623680 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621087 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 17:26:22.623680 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621089 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 17:26:22.623680 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621092 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 17:26:22.624173 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621094 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 17:26:22.624173 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621097 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 17:26:22.624173 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621099 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 17:26:22.624173 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621102 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 17:26:22.624173 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621104 
2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 17:26:22.624173 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621106 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 17:26:22.624173 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621109 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 17:26:22.624173 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621111 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 17:26:22.624173 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621114 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 17:26:22.624173 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621116 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 17:26:22.624173 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621119 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:26:22.624173 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621122 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 17:26:22.624173 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621125 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 17:26:22.624173 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621127 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 17:26:22.624173 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621130 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:26:22.624173 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621132 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 17:26:22.624173 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621135 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 17:26:22.624173 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621137 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 17:26:22.624173 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621139 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 17:26:22.624173 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621142 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 17:26:22.624653 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621145 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 17:26:22.624653 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621147 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 17:26:22.624653 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621149 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 17:26:22.624653 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621152 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 17:26:22.624653 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621154 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 17:26:22.624653 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621157 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 17:26:22.624653 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621160 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode 
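
The repeated feature_gate.go:328 warnings above come from the kubelet's feature-gate parser reporting names it has no entry for: the unrecognized names are OpenShift platform gates (AlibabaPlatform, PinnedImages, GatewayAPIController, and so on) handed to the kubelet in its rendered config, while the "feature gates: {map[...]}" summary lines list only the upstream Kubernetes gates it actually applies. A minimal Go sketch of that pattern, purely illustrative — the gate names and defaults below are examples, not the kubelet's real table:

    package main

    import "log"

    func main() {
        // Defaults for the gates this binary understands (illustrative subset).
        known := map[string]bool{
            "DynamicResourceAllocation": false,
            "ImageVolume":               false,
            "NodeSwap":                  false,
        }
        // Requested gates, e.g. merged from a rendered kubelet config.
        requested := map[string]bool{
            "ImageVolume":     true,
            "AlibabaPlatform": true, // platform-level gate, unknown to this table
        }
        for name, enabled := range requested {
            if _, ok := known[name]; !ok {
                log.Printf("W unrecognized feature gate: %s", name)
                continue
            }
            known[name] = enabled
        }
        log.Printf("I feature gates: %v", known)
    }

Running the sketch prints one warning per unknown name followed by the merged map, which is the same W/I pairing seen in the records above and below.
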
Apr 17 17:26:22.624653 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621162 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 17:26:22.624653 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621165 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:26:22.624653 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621168 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 17:26:22.624653 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:22.621170 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 17:26:22.624653 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.621175 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 17:26:22.624653 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.621777 2577 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 17:26:22.624653 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.623709 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 17:26:22.624989 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.624755 2577 server.go:1019] "Starting client certificate rotation" Apr 17 17:26:22.624989 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.624857 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 17:26:22.624989 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.624901 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 17:26:22.653351 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.653325 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 17:26:22.657511 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.657481 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 17:26:22.674637 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.674611 2577 log.go:25] "Validated CRI v1 runtime API" Apr 17 17:26:22.681013 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.680991 2577 log.go:25] "Validated CRI v1 image API" Apr 17 17:26:22.681453 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.681433 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 17:26:22.682216 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.682194 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 17:26:22.686360 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.686333 2577 fs.go:135] Filesystem UUIDs: map[65f7a035-4917-4621-90e7-c7febacc270a:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 b461784c-7a6c-4c2a-aee2-1a83a8dbcc69:/dev/nvme0n1p4] Apr 17 17:26:22.686443 ip-10-0-139-96 
kubenswrapper[2577]: I0417 17:26:22.686358 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 17 17:26:22.692504 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.692359 2577 manager.go:217] Machine: {Timestamp:2026-04-17 17:26:22.690193676 +0000 UTC m=+0.412136666 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3202390 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2c05b7fbb4f51c3b32467161973ce9 SystemUUID:ec2c05b7-fbb4-f51c-3b32-467161973ce9 BootID:9c49ccbf-82e8-40f1-9b19-c56da2a7b1d9 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:9b:02:b8:ba:83 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:9b:02:b8:ba:83 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:fa:9e:ec:a4:79:33 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 17 17:26:22.692504 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.692484 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
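
The manager.go:217 machine inventory above reports every size as a raw byte count (MemoryCapacity:33164492800, the nvme0n1 disk at 128849018880, the /var filesystem at 128243970048). A small throwaway Go helper for turning those figures into GiB when reading a capture like this — the numbers are simply copied from the record above:

    package main

    import "fmt"

    // gib converts a raw byte count, as logged by cAdvisor, into GiB.
    func gib(b uint64) float64 { return float64(b) / (1 << 30) }

    func main() {
        // Values copied from the manager.go:217 record above.
        fields := []struct {
            name  string
            bytes uint64
        }{
            {"MemoryCapacity", 33164492800},
            {"nvme0n1 disk", 128849018880},
            {"/var (nvme0n1p4)", 128243970048},
        }
        for _, f := range fields {
            fmt.Printf("%-18s %d bytes ≈ %.1f GiB\n", f.name, f.bytes, gib(f.bytes))
        }
    }

With these values it prints roughly 30.9 GiB of memory, a 120.0 GiB disk, and a 119.4 GiB /var filesystem, which matches the eviction-threshold math the container manager sets up in the records that follow.
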
Apr 17 17:26:22.692688 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.692611 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 17:26:22.693839 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.693804 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 17:26:22.694016 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.693840 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-96.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 17:26:22.694095 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.694030 2577 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 17:26:22.694095 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.694044 2577 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 17:26:22.694095 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.694063 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 17:26:22.694095 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.694084 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 17:26:22.695335 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.695319 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 17 17:26:22.695458 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.695447 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 17:26:22.698284 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.698272 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 17 17:26:22.698320 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.698295 2577 kubelet.go:386] "Adding static pod path" 
path="/etc/kubernetes/manifests" Apr 17 17:26:22.698320 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.698309 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 17:26:22.698320 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.698319 2577 kubelet.go:397] "Adding apiserver pod source" Apr 17 17:26:22.698402 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.698329 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 17:26:22.699586 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.699572 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 17:26:22.699632 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.699592 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 17:26:22.702137 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.702116 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-j9qgf" Apr 17 17:26:22.702717 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.702701 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 17:26:22.703958 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.703943 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 17:26:22.705621 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.705608 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 17:26:22.705667 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.705626 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 17:26:22.705667 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.705635 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 17:26:22.705667 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.705645 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 17:26:22.705667 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.705653 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 17:26:22.705667 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.705659 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 17:26:22.705667 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.705665 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 17:26:22.705823 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.705671 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 17:26:22.705823 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.705689 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 17:26:22.705823 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.705696 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 17:26:22.705823 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.705713 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 17:26:22.705823 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.705722 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 17:26:22.706660 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.706649 
2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 17:26:22.706696 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.706662 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 17:26:22.709587 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.709567 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-j9qgf" Apr 17 17:26:22.710711 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.710690 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-139-96.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 17:26:22.710815 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.710714 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 17:26:22.710815 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.710753 2577 server.go:1295] "Started kubelet" Apr 17 17:26:22.711017 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.710989 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 17:26:22.711299 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.711249 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 17:26:22.711403 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.711317 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 17:26:22.711745 ip-10-0-139-96 systemd[1]: Started Kubernetes Kubelet. Apr 17 17:26:22.712118 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:22.712060 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 17:26:22.712644 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:22.712563 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-96.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 17:26:22.713133 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.713115 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 17:26:22.714996 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.714980 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 17 17:26:22.718257 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.718233 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 17:26:22.718257 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.718245 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 17:26:22.719788 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.719768 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 17:26:22.719904 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.719785 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 17:26:22.719997 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.719988 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 17:26:22.720120 
ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.720004 2577 factory.go:153] Registering CRI-O factory Apr 17 17:26:22.720226 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.720125 2577 factory.go:223] Registration of the crio container factory successfully Apr 17 17:26:22.720226 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.720166 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 17 17:26:22.720226 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.720176 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 17 17:26:22.720226 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.720194 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 17:26:22.720226 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.720205 2577 factory.go:55] Registering systemd factory Apr 17 17:26:22.720226 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.720211 2577 factory.go:223] Registration of the systemd container factory successfully Apr 17 17:26:22.720226 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.720230 2577 factory.go:103] Registering Raw factory Apr 17 17:26:22.720565 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.720244 2577 manager.go:1196] Started watching for new ooms in manager Apr 17 17:26:22.720565 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:22.720547 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-96.ec2.internal\" not found" Apr 17 17:26:22.721000 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.720980 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:26:22.721078 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.721008 2577 manager.go:319] Starting recovery of all containers Apr 17 17:26:22.721956 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:22.721930 2577 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 17:26:22.723840 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:22.723814 2577 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-139-96.ec2.internal\" not found" node="ip-10-0-139-96.ec2.internal" Apr 17 17:26:22.730744 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.730574 2577 manager.go:324] Recovery completed Apr 17 17:26:22.735064 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.735052 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:26:22.737314 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.737298 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:26:22.737380 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.737331 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:26:22.737380 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.737345 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:26:22.737827 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.737809 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 17:26:22.737827 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.737825 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 17:26:22.737926 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.737842 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 17 17:26:22.740876 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.740864 2577 policy_none.go:49] "None policy: Start" Apr 17 17:26:22.740912 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.740883 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 17:26:22.740912 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.740893 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 17 17:26:22.783214 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.783196 2577 manager.go:341] "Starting Device Plugin manager" Apr 17 17:26:22.801961 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:22.783238 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 17:26:22.801961 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.783248 2577 server.go:85] "Starting device plugin registration server" Apr 17 17:26:22.801961 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.783544 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 17:26:22.801961 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.783556 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 17:26:22.801961 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.783637 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 17:26:22.801961 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.783720 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 17:26:22.801961 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.783744 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 17:26:22.801961 ip-10-0-139-96 kubenswrapper[2577]: 
E0417 17:26:22.784355 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 17:26:22.801961 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:22.784388 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-96.ec2.internal\" not found" Apr 17 17:26:22.849030 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.848993 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 17:26:22.850151 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.850137 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 17:26:22.850227 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.850167 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 17:26:22.850227 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.850188 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 17:26:22.850227 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.850195 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 17:26:22.850227 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:22.850224 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 17:26:22.852523 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.852497 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:26:22.884301 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.884217 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:26:22.885118 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.885098 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:26:22.885222 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.885132 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:26:22.885222 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.885142 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:26:22.886946 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.885498 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-96.ec2.internal" Apr 17 17:26:22.891888 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.891872 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-96.ec2.internal" Apr 17 17:26:22.891932 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:22.891897 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-96.ec2.internal\": node \"ip-10-0-139-96.ec2.internal\" not found" Apr 17 17:26:22.902938 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:22.902918 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-96.ec2.internal\" not found" Apr 17 17:26:22.951144 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.951093 2577 kubelet.go:2537] "SyncLoop ADD" source="file" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-96.ec2.internal"] Apr 17 17:26:22.951224 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.951197 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:26:22.952700 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.952684 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:26:22.952765 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.952716 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:26:22.952765 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.952726 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:26:22.955107 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.955093 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:26:22.955238 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.955223 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal" Apr 17 17:26:22.955271 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.955258 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:26:22.955935 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.955921 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:26:22.955935 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.955926 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:26:22.956059 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.955945 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:26:22.956059 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.955951 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:26:22.956059 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.955955 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:26:22.956059 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.955965 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:26:22.958154 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.958140 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-96.ec2.internal" Apr 17 17:26:22.958228 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.958164 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:26:22.958950 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.958933 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:26:22.959041 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.958962 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:26:22.959041 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:22.958972 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:26:22.981306 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:22.981276 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-96.ec2.internal\" not found" node="ip-10-0-139-96.ec2.internal" Apr 17 17:26:22.985791 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:22.985770 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-96.ec2.internal\" not found" node="ip-10-0-139-96.ec2.internal" Apr 17 17:26:23.003835 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:23.003795 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-96.ec2.internal\" not found" Apr 17 17:26:23.021660 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.021628 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6318a6bfadd7dc1dc0c05b611179f194-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal\" (UID: \"6318a6bfadd7dc1dc0c05b611179f194\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal" Apr 17 17:26:23.021660 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.021668 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6318a6bfadd7dc1dc0c05b611179f194-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal\" (UID: \"6318a6bfadd7dc1dc0c05b611179f194\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal" Apr 17 17:26:23.021872 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.021694 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/581fa9c88cb33b3a66c7bdd6f4dd1862-config\") pod \"kube-apiserver-proxy-ip-10-0-139-96.ec2.internal\" (UID: \"581fa9c88cb33b3a66c7bdd6f4dd1862\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-96.ec2.internal" Apr 17 17:26:23.104455 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:23.104415 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-96.ec2.internal\" not found" Apr 17 17:26:23.121887 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.121864 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6318a6bfadd7dc1dc0c05b611179f194-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal\" (UID: \"6318a6bfadd7dc1dc0c05b611179f194\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal" Apr 17 17:26:23.121965 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.121890 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6318a6bfadd7dc1dc0c05b611179f194-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal\" (UID: \"6318a6bfadd7dc1dc0c05b611179f194\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal" Apr 17 17:26:23.121965 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.121910 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/581fa9c88cb33b3a66c7bdd6f4dd1862-config\") pod \"kube-apiserver-proxy-ip-10-0-139-96.ec2.internal\" (UID: \"581fa9c88cb33b3a66c7bdd6f4dd1862\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-96.ec2.internal" Apr 17 17:26:23.121965 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.121948 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/581fa9c88cb33b3a66c7bdd6f4dd1862-config\") pod \"kube-apiserver-proxy-ip-10-0-139-96.ec2.internal\" (UID: \"581fa9c88cb33b3a66c7bdd6f4dd1862\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-96.ec2.internal" Apr 17 17:26:23.122082 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.121972 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6318a6bfadd7dc1dc0c05b611179f194-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal\" (UID: \"6318a6bfadd7dc1dc0c05b611179f194\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal" Apr 17 17:26:23.122082 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.121977 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6318a6bfadd7dc1dc0c05b611179f194-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal\" (UID: \"6318a6bfadd7dc1dc0c05b611179f194\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal" Apr 17 17:26:23.205393 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:23.205314 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-96.ec2.internal\" not found" Apr 17 17:26:23.282946 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.282905 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal" Apr 17 17:26:23.288569 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.288541 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-96.ec2.internal" Apr 17 17:26:23.306086 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:23.306057 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-96.ec2.internal\" not found" Apr 17 17:26:23.406649 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:23.406603 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-96.ec2.internal\" not found" Apr 17 17:26:23.507300 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:23.507221 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-96.ec2.internal\" not found" Apr 17 17:26:23.607812 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:23.607775 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-96.ec2.internal\" not found" Apr 17 17:26:23.624216 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.624193 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 17:26:23.624359 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.624333 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 17:26:23.624402 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.624363 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 17:26:23.665480 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.665438 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:26:23.699295 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.699262 2577 apiserver.go:52] "Watching apiserver" Apr 17 17:26:23.707364 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.707337 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 17:26:23.709643 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.709619 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sjxlk","openshift-dns/node-resolver-bjxzb","openshift-image-registry/node-ca-gb2kr","openshift-multus/multus-rks6d","openshift-network-diagnostics/network-check-target-d4gnk","kube-system/konnectivity-agent-sn554","openshift-cluster-node-tuning-operator/tuned-k2ngp","openshift-multus/multus-additional-cni-plugins-pdpfh","openshift-multus/network-metrics-daemon-p9f9z","openshift-network-operator/iptables-alerter-4p4f7","openshift-ovn-kubernetes/ovnkube-node-8rjgx"] Apr 17 17:26:23.711041 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.711010 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 17:21:22 +0000 UTC" deadline="2027-09-18 22:57:48.545551651 +0000 UTC" Apr 17 17:26:23.711041 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.711039 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" 
sleep="12461h31m24.834514612s" Apr 17 17:26:23.712815 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.712797 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sjxlk" Apr 17 17:26:23.715180 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.715159 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 17:26:23.715180 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.715170 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 17:26:23.715355 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.715173 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-5p7zn\"" Apr 17 17:26:23.715355 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.715265 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bjxzb" Apr 17 17:26:23.715458 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.715447 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 17:26:23.717292 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.717235 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 17:26:23.718906 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.717429 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 17:26:23.718906 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.717505 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-862sf\"" Apr 17 17:26:23.718906 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.718728 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal" Apr 17 17:26:23.718906 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.718902 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 17:26:23.720978 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.720955 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-gb2kr" Apr 17 17:26:23.723026 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.723008 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 17:26:23.723222 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.723205 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-kltqm\"" Apr 17 17:26:23.723353 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.723338 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 17:26:23.723434 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.723420 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 17:26:23.724091 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.724075 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.725834 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.725820 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-sn554" Apr 17 17:26:23.725922 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.725852 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af0d80e7-5925-429c-8bd3-f0235981720a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sjxlk\" (UID: \"af0d80e7-5925-429c-8bd3-f0235981720a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sjxlk" Apr 17 17:26:23.725922 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.725880 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/af0d80e7-5925-429c-8bd3-f0235981720a-device-dir\") pod \"aws-ebs-csi-driver-node-sjxlk\" (UID: \"af0d80e7-5925-429c-8bd3-f0235981720a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sjxlk" Apr 17 17:26:23.725922 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.725906 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ca904b14-b665-4107-bf21-c1783df952e4-hosts-file\") pod \"node-resolver-bjxzb\" (UID: \"ca904b14-b665-4107-bf21-c1783df952e4\") " pod="openshift-dns/node-resolver-bjxzb" Apr 17 17:26:23.726067 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.725938 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/255274fc-6f71-45da-a2f9-c715044eee61-serviceca\") pod \"node-ca-gb2kr\" (UID: \"255274fc-6f71-45da-a2f9-c715044eee61\") " pod="openshift-image-registry/node-ca-gb2kr" Apr 17 17:26:23.726067 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.725954 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.726067 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.725992 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-system-cni-dir\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.726067 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726029 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/af0d80e7-5925-429c-8bd3-f0235981720a-etc-selinux\") pod \"aws-ebs-csi-driver-node-sjxlk\" (UID: \"af0d80e7-5925-429c-8bd3-f0235981720a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sjxlk" Apr 17 17:26:23.726067 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726052 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c57n8\" (UniqueName: \"kubernetes.io/projected/ca904b14-b665-4107-bf21-c1783df952e4-kube-api-access-c57n8\") pod \"node-resolver-bjxzb\" (UID: \"ca904b14-b665-4107-bf21-c1783df952e4\") " pod="openshift-dns/node-resolver-bjxzb" Apr 17 17:26:23.726348 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726112 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-host-var-lib-cni-bin\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.726348 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726146 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c447c5b4-4c37-4d50-8c77-2633c36d977d-multus-daemon-config\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.726348 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726186 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 17:26:23.726348 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726189 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/af0d80e7-5925-429c-8bd3-f0235981720a-registration-dir\") pod \"aws-ebs-csi-driver-node-sjxlk\" (UID: \"af0d80e7-5925-429c-8bd3-f0235981720a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sjxlk" Apr 17 17:26:23.726348 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726225 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/255274fc-6f71-45da-a2f9-c715044eee61-host\") pod \"node-ca-gb2kr\" (UID: \"255274fc-6f71-45da-a2f9-c715044eee61\") " pod="openshift-image-registry/node-ca-gb2kr" Apr 17 17:26:23.726348 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726240 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 17:26:23.726348 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726240 2577 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 17:26:23.726348 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726255 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-os-release\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.726348 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726278 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c447c5b4-4c37-4d50-8c77-2633c36d977d-cni-binary-copy\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.726348 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726285 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 17:26:23.726348 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726320 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-host-run-k8s-cni-cncf-io\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.726348 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726353 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-host-var-lib-cni-multus\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.726796 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726383 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nv64\" (UniqueName: \"kubernetes.io/projected/af0d80e7-5925-429c-8bd3-f0235981720a-kube-api-access-4nv64\") pod \"aws-ebs-csi-driver-node-sjxlk\" (UID: \"af0d80e7-5925-429c-8bd3-f0235981720a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sjxlk" Apr 17 17:26:23.726796 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726419 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-multus-cni-dir\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.726796 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726434 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-host-run-multus-certs\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.726796 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726435 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-d6cjm\"" Apr 17 17:26:23.726796 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726455 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/af0d80e7-5925-429c-8bd3-f0235981720a-socket-dir\") pod \"aws-ebs-csi-driver-node-sjxlk\" (UID: \"af0d80e7-5925-429c-8bd3-f0235981720a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sjxlk" Apr 17 17:26:23.726796 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726505 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/af0d80e7-5925-429c-8bd3-f0235981720a-sys-fs\") pod \"aws-ebs-csi-driver-node-sjxlk\" (UID: \"af0d80e7-5925-429c-8bd3-f0235981720a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sjxlk" Apr 17 17:26:23.726796 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726545 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ca904b14-b665-4107-bf21-c1783df952e4-tmp-dir\") pod \"node-resolver-bjxzb\" (UID: \"ca904b14-b665-4107-bf21-c1783df952e4\") " pod="openshift-dns/node-resolver-bjxzb" Apr 17 17:26:23.726796 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726572 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-multus-socket-dir-parent\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.726796 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726602 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-multus-conf-dir\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.726796 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726636 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-etc-kubernetes\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.726796 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726662 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-cnibin\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.726796 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726684 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77kc5\" (UniqueName: \"kubernetes.io/projected/c447c5b4-4c37-4d50-8c77-2633c36d977d-kube-api-access-77kc5\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.726796 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726719 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b22vb\" (UniqueName: \"kubernetes.io/projected/255274fc-6f71-45da-a2f9-c715044eee61-kube-api-access-b22vb\") pod \"node-ca-gb2kr\" (UID: 
\"255274fc-6f71-45da-a2f9-c715044eee61\") " pod="openshift-image-registry/node-ca-gb2kr" Apr 17 17:26:23.726796 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726747 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-host-var-lib-kubelet\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.726796 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726773 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-host-run-netns\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.726796 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.726794 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-hostroot\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.727871 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.727855 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pdpfh" Apr 17 17:26:23.728181 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.727973 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-7s5z8\"" Apr 17 17:26:23.728181 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.728034 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 17:26:23.728353 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.728330 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 17:26:23.728353 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.728317 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-np6vd\"" Apr 17 17:26:23.728453 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.728330 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 17:26:23.728453 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.728430 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:26:23.729839 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.729775 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 17:26:23.730050 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.730037 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:26:23.730119 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:23.730096 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p9f9z" podUID="bcb4d874-10b6-4167-b452-800ed19b3f79" Apr 17 17:26:23.730166 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.730130 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 17:26:23.730223 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.730189 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-xk74t\"" Apr 17 17:26:23.730895 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.730881 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 17:26:23.730959 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.730942 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-96.ec2.internal" Apr 17 17:26:23.732561 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.732543 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4p4f7" Apr 17 17:26:23.734916 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.734896 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.735018 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.734960 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-ljwph\"" Apr 17 17:26:23.735018 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.734974 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 17:26:23.735122 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.734964 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:26:23.735361 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.735341 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 17:26:23.736886 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.736863 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal"] Apr 17 17:26:23.736978 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.736950 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4gnk" Apr 17 17:26:23.737039 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:23.737001 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4gnk" podUID="f5fc0e79-0e28-4fcc-891b-18afdb313f11" Apr 17 17:26:23.737101 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.737044 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 17:26:23.737149 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.737090 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 17:26:23.737238 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.737223 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 17:26:23.737294 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.737236 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 17:26:23.737294 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.737274 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 17:26:23.737504 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.737492 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 17:26:23.737565 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.737546 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 17:26:23.737670 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.737549 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-phpj9\"" Apr 17 17:26:23.739508 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.739491 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 17:26:23.739672 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.739654 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-139-96.ec2.internal"] Apr 17 17:26:23.758993 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.758925 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-q6lvh" Apr 17 17:26:23.768703 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.768682 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-q6lvh" Apr 17 17:26:23.821530 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.821513 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 17:26:23.827424 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.827399 2577 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-host-var-lib-cni-bin\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.827522 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.827432 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-host\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.827522 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.827452 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-systemd-units\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.827522 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.827490 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/255274fc-6f71-45da-a2f9-c715044eee61-host\") pod \"node-ca-gb2kr\" (UID: \"255274fc-6f71-45da-a2f9-c715044eee61\") " pod="openshift-image-registry/node-ca-gb2kr" Apr 17 17:26:23.827522 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.827508 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-host-var-lib-cni-bin\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.827522 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.827518 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c447c5b4-4c37-4d50-8c77-2633c36d977d-cni-binary-copy\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.827713 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.827535 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/255274fc-6f71-45da-a2f9-c715044eee61-host\") pod \"node-ca-gb2kr\" (UID: \"255274fc-6f71-45da-a2f9-c715044eee61\") " pod="openshift-image-registry/node-ca-gb2kr" Apr 17 17:26:23.827713 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.827552 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-host-var-lib-cni-multus\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.827713 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.827580 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-host-var-lib-cni-multus\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.827713 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.827580 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-vs4pm\" (UniqueName: \"kubernetes.io/projected/f5fc0e79-0e28-4fcc-891b-18afdb313f11-kube-api-access-vs4pm\") pod \"network-check-target-d4gnk\" (UID: \"f5fc0e79-0e28-4fcc-891b-18afdb313f11\") " pod="openshift-network-diagnostics/network-check-target-d4gnk" Apr 17 17:26:23.827713 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.827607 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-etc-sysctl-d\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.827713 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.827631 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/355b3a4d-4123-4e80-a76f-e42bcfb92020-system-cni-dir\") pod \"multus-additional-cni-plugins-pdpfh\" (UID: \"355b3a4d-4123-4e80-a76f-e42bcfb92020\") " pod="openshift-multus/multus-additional-cni-plugins-pdpfh" Apr 17 17:26:23.827713 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.827653 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/355b3a4d-4123-4e80-a76f-e42bcfb92020-cnibin\") pod \"multus-additional-cni-plugins-pdpfh\" (UID: \"355b3a4d-4123-4e80-a76f-e42bcfb92020\") " pod="openshift-multus/multus-additional-cni-plugins-pdpfh" Apr 17 17:26:23.827713 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.827691 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-host-cni-netd\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.827940 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.827720 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-host-run-multus-certs\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.827940 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.827757 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/14b15c97-09e7-4ee9-87b2-c15fc332baf7-host-slash\") pod \"iptables-alerter-4p4f7\" (UID: \"14b15c97-09e7-4ee9-87b2-c15fc332baf7\") " pod="openshift-network-operator/iptables-alerter-4p4f7" Apr 17 17:26:23.827940 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.827781 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-host-cni-bin\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.827940 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.827804 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-host-run-multus-certs\") pod \"multus-rks6d\" (UID: 
\"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.827940 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.827809 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/af0d80e7-5925-429c-8bd3-f0235981720a-socket-dir\") pod \"aws-ebs-csi-driver-node-sjxlk\" (UID: \"af0d80e7-5925-429c-8bd3-f0235981720a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sjxlk" Apr 17 17:26:23.827940 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.827849 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-multus-conf-dir\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.827940 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.827875 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-etc-modprobe-d\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.827940 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.827899 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-etc-sysconfig\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.827940 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.827913 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/af0d80e7-5925-429c-8bd3-f0235981720a-socket-dir\") pod \"aws-ebs-csi-driver-node-sjxlk\" (UID: \"af0d80e7-5925-429c-8bd3-f0235981720a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sjxlk" Apr 17 17:26:23.827940 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.827940 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-etc-kubernetes\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.828256 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.827951 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-multus-conf-dir\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.828256 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.827965 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7f3920b2-d64c-4102-8d86-665e6ea60718-etc-tuned\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.828256 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.827990 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" 
(UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-run-systemd\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.828256 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828015 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-var-lib-openvswitch\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.828256 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828055 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-cnibin\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.828256 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828079 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-etc-sysctl-conf\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.828256 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828103 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-cnibin\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.828256 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828105 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c447c5b4-4c37-4d50-8c77-2633c36d977d-cni-binary-copy\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.828256 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828105 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-host-slash\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.828256 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828149 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-log-socket\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.828256 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828171 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-host-run-ovn-kubernetes\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.828256 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828204 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/49a176d5-a780-4a38-b16f-90dc62742d5d-env-overrides\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.828256 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828231 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-host-var-lib-kubelet\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.828256 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828246 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92tt8\" (UniqueName: \"kubernetes.io/projected/7f3920b2-d64c-4102-8d86-665e6ea60718-kube-api-access-92tt8\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.828661 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828263 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/355b3a4d-4123-4e80-a76f-e42bcfb92020-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pdpfh\" (UID: \"355b3a4d-4123-4e80-a76f-e42bcfb92020\") " pod="openshift-multus/multus-additional-cni-plugins-pdpfh" Apr 17 17:26:23.828661 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828266 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-host-var-lib-kubelet\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.828661 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828279 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29wzd\" (UniqueName: \"kubernetes.io/projected/49a176d5-a780-4a38-b16f-90dc62742d5d-kube-api-access-29wzd\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.828661 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828294 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs\") pod \"network-metrics-daemon-p9f9z\" (UID: \"bcb4d874-10b6-4167-b452-800ed19b3f79\") " pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:26:23.828661 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828308 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/14b15c97-09e7-4ee9-87b2-c15fc332baf7-iptables-alerter-script\") pod \"iptables-alerter-4p4f7\" (UID: \"14b15c97-09e7-4ee9-87b2-c15fc332baf7\") " pod="openshift-network-operator/iptables-alerter-4p4f7" Apr 17 17:26:23.828661 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828324 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/af0d80e7-5925-429c-8bd3-f0235981720a-device-dir\") pod 
\"aws-ebs-csi-driver-node-sjxlk\" (UID: \"af0d80e7-5925-429c-8bd3-f0235981720a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sjxlk" Apr 17 17:26:23.828661 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828340 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ca904b14-b665-4107-bf21-c1783df952e4-hosts-file\") pod \"node-resolver-bjxzb\" (UID: \"ca904b14-b665-4107-bf21-c1783df952e4\") " pod="openshift-dns/node-resolver-bjxzb" Apr 17 17:26:23.828661 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828355 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/255274fc-6f71-45da-a2f9-c715044eee61-serviceca\") pod \"node-ca-gb2kr\" (UID: \"255274fc-6f71-45da-a2f9-c715044eee61\") " pod="openshift-image-registry/node-ca-gb2kr" Apr 17 17:26:23.828661 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828364 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/af0d80e7-5925-429c-8bd3-f0235981720a-device-dir\") pod \"aws-ebs-csi-driver-node-sjxlk\" (UID: \"af0d80e7-5925-429c-8bd3-f0235981720a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sjxlk" Apr 17 17:26:23.828661 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828369 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-system-cni-dir\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.828661 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828384 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-var-lib-kubelet\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.828661 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828399 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqwfv\" (UniqueName: \"kubernetes.io/projected/bcb4d874-10b6-4167-b452-800ed19b3f79-kube-api-access-gqwfv\") pod \"network-metrics-daemon-p9f9z\" (UID: \"bcb4d874-10b6-4167-b452-800ed19b3f79\") " pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:26:23.828661 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828401 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ca904b14-b665-4107-bf21-c1783df952e4-hosts-file\") pod \"node-resolver-bjxzb\" (UID: \"ca904b14-b665-4107-bf21-c1783df952e4\") " pod="openshift-dns/node-resolver-bjxzb" Apr 17 17:26:23.828661 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828428 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-host-run-netns\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.828661 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828458 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/49a176d5-a780-4a38-b16f-90dc62742d5d-ovn-node-metrics-cert\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.828661 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828500 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/af0d80e7-5925-429c-8bd3-f0235981720a-etc-selinux\") pod \"aws-ebs-csi-driver-node-sjxlk\" (UID: \"af0d80e7-5925-429c-8bd3-f0235981720a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sjxlk" Apr 17 17:26:23.828661 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828462 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-system-cni-dir\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.829293 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828525 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c57n8\" (UniqueName: \"kubernetes.io/projected/ca904b14-b665-4107-bf21-c1783df952e4-kube-api-access-c57n8\") pod \"node-resolver-bjxzb\" (UID: \"ca904b14-b665-4107-bf21-c1783df952e4\") " pod="openshift-dns/node-resolver-bjxzb" Apr 17 17:26:23.829293 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828545 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c447c5b4-4c37-4d50-8c77-2633c36d977d-multus-daemon-config\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.829293 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828572 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-sys\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.829293 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828597 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/355b3a4d-4123-4e80-a76f-e42bcfb92020-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-pdpfh\" (UID: \"355b3a4d-4123-4e80-a76f-e42bcfb92020\") " pod="openshift-multus/multus-additional-cni-plugins-pdpfh" Apr 17 17:26:23.829293 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828608 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/af0d80e7-5925-429c-8bd3-f0235981720a-etc-selinux\") pod \"aws-ebs-csi-driver-node-sjxlk\" (UID: \"af0d80e7-5925-429c-8bd3-f0235981720a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sjxlk" Apr 17 17:26:23.829293 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828623 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-etc-openvswitch\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.829293 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828657 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/af0d80e7-5925-429c-8bd3-f0235981720a-registration-dir\") pod \"aws-ebs-csi-driver-node-sjxlk\" (UID: \"af0d80e7-5925-429c-8bd3-f0235981720a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sjxlk" Apr 17 17:26:23.829293 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828688 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/af0d80e7-5925-429c-8bd3-f0235981720a-registration-dir\") pod \"aws-ebs-csi-driver-node-sjxlk\" (UID: \"af0d80e7-5925-429c-8bd3-f0235981720a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sjxlk" Apr 17 17:26:23.829293 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828712 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-os-release\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.829293 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828728 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-host-run-k8s-cni-cncf-io\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.829293 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828743 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4nv64\" (UniqueName: \"kubernetes.io/projected/af0d80e7-5925-429c-8bd3-f0235981720a-kube-api-access-4nv64\") pod \"aws-ebs-csi-driver-node-sjxlk\" (UID: \"af0d80e7-5925-429c-8bd3-f0235981720a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sjxlk" Apr 17 17:26:23.829293 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828769 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/255274fc-6f71-45da-a2f9-c715044eee61-serviceca\") pod \"node-ca-gb2kr\" (UID: \"255274fc-6f71-45da-a2f9-c715044eee61\") " pod="openshift-image-registry/node-ca-gb2kr" Apr 17 17:26:23.829293 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828790 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-multus-cni-dir\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.829293 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828810 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-os-release\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.829293 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828816 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/355b3a4d-4123-4e80-a76f-e42bcfb92020-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-pdpfh\" (UID: \"355b3a4d-4123-4e80-a76f-e42bcfb92020\") " pod="openshift-multus/multus-additional-cni-plugins-pdpfh" Apr 17 17:26:23.829293 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828791 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-host-run-k8s-cni-cncf-io\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.829293 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828828 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-multus-cni-dir\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.829927 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828846 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-node-log\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.829927 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828864 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/af0d80e7-5925-429c-8bd3-f0235981720a-sys-fs\") pod \"aws-ebs-csi-driver-node-sjxlk\" (UID: \"af0d80e7-5925-429c-8bd3-f0235981720a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sjxlk" Apr 17 17:26:23.829927 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828879 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ca904b14-b665-4107-bf21-c1783df952e4-tmp-dir\") pod \"node-resolver-bjxzb\" (UID: \"ca904b14-b665-4107-bf21-c1783df952e4\") " pod="openshift-dns/node-resolver-bjxzb" Apr 17 17:26:23.829927 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828899 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-multus-socket-dir-parent\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.829927 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828924 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-etc-kubernetes\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.829927 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828933 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/af0d80e7-5925-429c-8bd3-f0235981720a-sys-fs\") pod \"aws-ebs-csi-driver-node-sjxlk\" (UID: \"af0d80e7-5925-429c-8bd3-f0235981720a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sjxlk" Apr 17 17:26:23.829927 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828949 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/c447c5b4-4c37-4d50-8c77-2633c36d977d-multus-daemon-config\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.829927 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828959 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/355b3a4d-4123-4e80-a76f-e42bcfb92020-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pdpfh\" (UID: \"355b3a4d-4123-4e80-a76f-e42bcfb92020\") " pod="openshift-multus/multus-additional-cni-plugins-pdpfh" Apr 17 17:26:23.829927 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.828984 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-multus-socket-dir-parent\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.829927 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.829000 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-etc-kubernetes\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.829927 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.829000 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn7s8\" (UniqueName: \"kubernetes.io/projected/355b3a4d-4123-4e80-a76f-e42bcfb92020-kube-api-access-dn7s8\") pod \"multus-additional-cni-plugins-pdpfh\" (UID: \"355b3a4d-4123-4e80-a76f-e42bcfb92020\") " pod="openshift-multus/multus-additional-cni-plugins-pdpfh" Apr 17 17:26:23.829927 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.829039 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-run-ovn\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.829927 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.829067 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.829927 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.829088 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77kc5\" (UniqueName: \"kubernetes.io/projected/c447c5b4-4c37-4d50-8c77-2633c36d977d-kube-api-access-77kc5\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.829927 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.829139 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7f3920b2-d64c-4102-8d86-665e6ea60718-tmp\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " 
pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.829927 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.829171 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-host-kubelet\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.829927 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.829179 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ca904b14-b665-4107-bf21-c1783df952e4-tmp-dir\") pod \"node-resolver-bjxzb\" (UID: \"ca904b14-b665-4107-bf21-c1783df952e4\") " pod="openshift-dns/node-resolver-bjxzb" Apr 17 17:26:23.830368 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.829198 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-run-openvswitch\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.830368 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.829250 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b22vb\" (UniqueName: \"kubernetes.io/projected/255274fc-6f71-45da-a2f9-c715044eee61-kube-api-access-b22vb\") pod \"node-ca-gb2kr\" (UID: \"255274fc-6f71-45da-a2f9-c715044eee61\") " pod="openshift-image-registry/node-ca-gb2kr" Apr 17 17:26:23.830368 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.829283 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/51db2efc-3b63-4a21-bb04-99caab75c450-konnectivity-ca\") pod \"konnectivity-agent-sn554\" (UID: \"51db2efc-3b63-4a21-bb04-99caab75c450\") " pod="kube-system/konnectivity-agent-sn554" Apr 17 17:26:23.830368 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.829308 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-run\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.830368 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.829341 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/355b3a4d-4123-4e80-a76f-e42bcfb92020-os-release\") pod \"multus-additional-cni-plugins-pdpfh\" (UID: \"355b3a4d-4123-4e80-a76f-e42bcfb92020\") " pod="openshift-multus/multus-additional-cni-plugins-pdpfh" Apr 17 17:26:23.830368 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.829365 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t7cb\" (UniqueName: \"kubernetes.io/projected/14b15c97-09e7-4ee9-87b2-c15fc332baf7-kube-api-access-6t7cb\") pod \"iptables-alerter-4p4f7\" (UID: \"14b15c97-09e7-4ee9-87b2-c15fc332baf7\") " pod="openshift-network-operator/iptables-alerter-4p4f7" Apr 17 17:26:23.830368 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.829390 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/49a176d5-a780-4a38-b16f-90dc62742d5d-ovnkube-config\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.830368 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.829418 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-host-run-netns\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.830368 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.829443 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-hostroot\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.830368 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.829488 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/51db2efc-3b63-4a21-bb04-99caab75c450-agent-certs\") pod \"konnectivity-agent-sn554\" (UID: \"51db2efc-3b63-4a21-bb04-99caab75c450\") " pod="kube-system/konnectivity-agent-sn554" Apr 17 17:26:23.830368 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.829497 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-host-run-netns\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.830368 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.829513 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-etc-systemd\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.830368 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.829538 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-lib-modules\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.830368 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.829512 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c447c5b4-4c37-4d50-8c77-2633c36d977d-hostroot\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.830368 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.829564 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/49a176d5-a780-4a38-b16f-90dc62742d5d-ovnkube-script-lib\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.830368 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.829591 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af0d80e7-5925-429c-8bd3-f0235981720a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sjxlk\" (UID: \"af0d80e7-5925-429c-8bd3-f0235981720a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sjxlk" Apr 17 17:26:23.830368 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.829688 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af0d80e7-5925-429c-8bd3-f0235981720a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sjxlk\" (UID: \"af0d80e7-5925-429c-8bd3-f0235981720a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sjxlk" Apr 17 17:26:23.838338 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.838231 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 17:26:23.842289 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.842233 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nv64\" (UniqueName: \"kubernetes.io/projected/af0d80e7-5925-429c-8bd3-f0235981720a-kube-api-access-4nv64\") pod \"aws-ebs-csi-driver-node-sjxlk\" (UID: \"af0d80e7-5925-429c-8bd3-f0235981720a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sjxlk" Apr 17 17:26:23.842422 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.842235 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b22vb\" (UniqueName: \"kubernetes.io/projected/255274fc-6f71-45da-a2f9-c715044eee61-kube-api-access-b22vb\") pod \"node-ca-gb2kr\" (UID: \"255274fc-6f71-45da-a2f9-c715044eee61\") " pod="openshift-image-registry/node-ca-gb2kr" Apr 17 17:26:23.843899 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.843686 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77kc5\" (UniqueName: \"kubernetes.io/projected/c447c5b4-4c37-4d50-8c77-2633c36d977d-kube-api-access-77kc5\") pod \"multus-rks6d\" (UID: \"c447c5b4-4c37-4d50-8c77-2633c36d977d\") " pod="openshift-multus/multus-rks6d" Apr 17 17:26:23.843899 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.843885 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c57n8\" (UniqueName: \"kubernetes.io/projected/ca904b14-b665-4107-bf21-c1783df952e4-kube-api-access-c57n8\") pod \"node-resolver-bjxzb\" (UID: \"ca904b14-b665-4107-bf21-c1783df952e4\") " pod="openshift-dns/node-resolver-bjxzb" Apr 17 17:26:23.854915 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:23.854877 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6318a6bfadd7dc1dc0c05b611179f194.slice/crio-1e0415c7ab4a28a5b31126d623ea7491557d0ce87592d384ce794adf75856e5a WatchSource:0}: Error finding container 1e0415c7ab4a28a5b31126d623ea7491557d0ce87592d384ce794adf75856e5a: Status 404 returned error can't find the container with id 1e0415c7ab4a28a5b31126d623ea7491557d0ce87592d384ce794adf75856e5a Apr 17 17:26:23.855203 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:23.855178 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod581fa9c88cb33b3a66c7bdd6f4dd1862.slice/crio-86979263ca58b93a88db714c52ebd85bb2793d6813211e07eb5da8e4a665c792 WatchSource:0}: Error finding container 
86979263ca58b93a88db714c52ebd85bb2793d6813211e07eb5da8e4a665c792: Status 404 returned error can't find the container with id 86979263ca58b93a88db714c52ebd85bb2793d6813211e07eb5da8e4a665c792 Apr 17 17:26:23.858947 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.858926 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:26:23.930592 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.930552 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/355b3a4d-4123-4e80-a76f-e42bcfb92020-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pdpfh\" (UID: \"355b3a4d-4123-4e80-a76f-e42bcfb92020\") " pod="openshift-multus/multus-additional-cni-plugins-pdpfh" Apr 17 17:26:23.930592 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.930592 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dn7s8\" (UniqueName: \"kubernetes.io/projected/355b3a4d-4123-4e80-a76f-e42bcfb92020-kube-api-access-dn7s8\") pod \"multus-additional-cni-plugins-pdpfh\" (UID: \"355b3a4d-4123-4e80-a76f-e42bcfb92020\") " pod="openshift-multus/multus-additional-cni-plugins-pdpfh" Apr 17 17:26:23.930877 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.930614 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-run-ovn\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.930877 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.930638 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.930877 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.930667 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7f3920b2-d64c-4102-8d86-665e6ea60718-tmp\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.930877 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.930686 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-host-kubelet\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.930877 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.930709 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-run-openvswitch\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.930877 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.930711 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-run-ovn\") pod 
\"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.930877 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.930733 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/51db2efc-3b63-4a21-bb04-99caab75c450-konnectivity-ca\") pod \"konnectivity-agent-sn554\" (UID: \"51db2efc-3b63-4a21-bb04-99caab75c450\") " pod="kube-system/konnectivity-agent-sn554" Apr 17 17:26:23.930877 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.930713 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.930877 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.930771 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-run\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.930877 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.930817 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-run\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.930877 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.930807 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-host-kubelet\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.930877 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.930854 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/355b3a4d-4123-4e80-a76f-e42bcfb92020-os-release\") pod \"multus-additional-cni-plugins-pdpfh\" (UID: \"355b3a4d-4123-4e80-a76f-e42bcfb92020\") " pod="openshift-multus/multus-additional-cni-plugins-pdpfh" Apr 17 17:26:23.930877 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.930878 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-run-openvswitch\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.931393 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.930930 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6t7cb\" (UniqueName: \"kubernetes.io/projected/14b15c97-09e7-4ee9-87b2-c15fc332baf7-kube-api-access-6t7cb\") pod \"iptables-alerter-4p4f7\" (UID: \"14b15c97-09e7-4ee9-87b2-c15fc332baf7\") " pod="openshift-network-operator/iptables-alerter-4p4f7" Apr 17 17:26:23.931393 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.930959 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/49a176d5-a780-4a38-b16f-90dc62742d5d-ovnkube-config\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.931393 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.930975 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/355b3a4d-4123-4e80-a76f-e42bcfb92020-os-release\") pod \"multus-additional-cni-plugins-pdpfh\" (UID: \"355b3a4d-4123-4e80-a76f-e42bcfb92020\") " pod="openshift-multus/multus-additional-cni-plugins-pdpfh" Apr 17 17:26:23.931393 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931081 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/51db2efc-3b63-4a21-bb04-99caab75c450-agent-certs\") pod \"konnectivity-agent-sn554\" (UID: \"51db2efc-3b63-4a21-bb04-99caab75c450\") " pod="kube-system/konnectivity-agent-sn554" Apr 17 17:26:23.931393 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931110 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-etc-systemd\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.931393 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931134 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-lib-modules\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.931393 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931157 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/49a176d5-a780-4a38-b16f-90dc62742d5d-ovnkube-script-lib\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.931393 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931185 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-host\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.931393 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931209 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-systemd-units\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.931393 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931239 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vs4pm\" (UniqueName: \"kubernetes.io/projected/f5fc0e79-0e28-4fcc-891b-18afdb313f11-kube-api-access-vs4pm\") pod \"network-check-target-d4gnk\" (UID: \"f5fc0e79-0e28-4fcc-891b-18afdb313f11\") " pod="openshift-network-diagnostics/network-check-target-d4gnk" Apr 17 17:26:23.931393 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931262 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/355b3a4d-4123-4e80-a76f-e42bcfb92020-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pdpfh\" (UID: \"355b3a4d-4123-4e80-a76f-e42bcfb92020\") " pod="openshift-multus/multus-additional-cni-plugins-pdpfh" Apr 17 17:26:23.931393 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931264 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-etc-sysctl-d\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.931393 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931310 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/355b3a4d-4123-4e80-a76f-e42bcfb92020-system-cni-dir\") pod \"multus-additional-cni-plugins-pdpfh\" (UID: \"355b3a4d-4123-4e80-a76f-e42bcfb92020\") " pod="openshift-multus/multus-additional-cni-plugins-pdpfh" Apr 17 17:26:23.931393 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931333 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/355b3a4d-4123-4e80-a76f-e42bcfb92020-cnibin\") pod \"multus-additional-cni-plugins-pdpfh\" (UID: \"355b3a4d-4123-4e80-a76f-e42bcfb92020\") " pod="openshift-multus/multus-additional-cni-plugins-pdpfh" Apr 17 17:26:23.931393 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931336 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-systemd-units\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.931393 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931353 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-etc-systemd\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.931393 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931366 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-etc-sysctl-d\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.932242 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931359 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-host-cni-netd\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.932242 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931397 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/14b15c97-09e7-4ee9-87b2-c15fc332baf7-host-slash\") pod \"iptables-alerter-4p4f7\" (UID: \"14b15c97-09e7-4ee9-87b2-c15fc332baf7\") " pod="openshift-network-operator/iptables-alerter-4p4f7" Apr 17 17:26:23.932242 
ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931408 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-host\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.932242 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931422 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-host-cni-bin\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.932242 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931448 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-etc-modprobe-d\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.932242 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931497 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-etc-sysconfig\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.932242 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931526 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/355b3a4d-4123-4e80-a76f-e42bcfb92020-cnibin\") pod \"multus-additional-cni-plugins-pdpfh\" (UID: \"355b3a4d-4123-4e80-a76f-e42bcfb92020\") " pod="openshift-multus/multus-additional-cni-plugins-pdpfh" Apr 17 17:26:23.932242 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931563 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/14b15c97-09e7-4ee9-87b2-c15fc332baf7-host-slash\") pod \"iptables-alerter-4p4f7\" (UID: \"14b15c97-09e7-4ee9-87b2-c15fc332baf7\") " pod="openshift-network-operator/iptables-alerter-4p4f7" Apr 17 17:26:23.932242 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931567 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-host-cni-bin\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.932242 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931449 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-host-cni-netd\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.932242 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931498 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/355b3a4d-4123-4e80-a76f-e42bcfb92020-system-cni-dir\") pod \"multus-additional-cni-plugins-pdpfh\" (UID: \"355b3a4d-4123-4e80-a76f-e42bcfb92020\") " pod="openshift-multus/multus-additional-cni-plugins-pdpfh" 
Apr 17 17:26:23.932242 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931526 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-etc-kubernetes\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.932242 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931607 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-etc-kubernetes\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.932242 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931613 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7f3920b2-d64c-4102-8d86-665e6ea60718-etc-tuned\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.932242 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931638 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-run-systemd\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.932242 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931626 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-etc-sysconfig\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.932242 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931661 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-var-lib-openvswitch\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.932242 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931695 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-etc-sysctl-conf\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.932873 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931700 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-run-systemd\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.932873 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931696 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-etc-modprobe-d\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " 
pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.932873 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931745 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-host-slash\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.932873 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931776 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-log-socket\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.932873 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931834 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-host-slash\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.932873 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931859 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/51db2efc-3b63-4a21-bb04-99caab75c450-konnectivity-ca\") pod \"konnectivity-agent-sn554\" (UID: \"51db2efc-3b63-4a21-bb04-99caab75c450\") " pod="kube-system/konnectivity-agent-sn554" Apr 17 17:26:23.932873 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931885 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-var-lib-openvswitch\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.932873 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931900 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-host-run-ovn-kubernetes\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.932873 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931911 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-etc-sysctl-conf\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.932873 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931927 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-log-socket\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.932873 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.931984 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/49a176d5-a780-4a38-b16f-90dc62742d5d-env-overrides\") pod \"ovnkube-node-8rjgx\" (UID: 
\"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.932873 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.932009 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-lib-modules\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.932873 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.932019 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/49a176d5-a780-4a38-b16f-90dc62742d5d-ovnkube-config\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.932873 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.932054 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92tt8\" (UniqueName: \"kubernetes.io/projected/7f3920b2-d64c-4102-8d86-665e6ea60718-kube-api-access-92tt8\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.932873 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.932102 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/355b3a4d-4123-4e80-a76f-e42bcfb92020-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pdpfh\" (UID: \"355b3a4d-4123-4e80-a76f-e42bcfb92020\") " pod="openshift-multus/multus-additional-cni-plugins-pdpfh" Apr 17 17:26:23.932873 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.932149 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/49a176d5-a780-4a38-b16f-90dc62742d5d-ovnkube-script-lib\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.932873 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.932198 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29wzd\" (UniqueName: \"kubernetes.io/projected/49a176d5-a780-4a38-b16f-90dc62742d5d-kube-api-access-29wzd\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.933548 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.932224 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs\") pod \"network-metrics-daemon-p9f9z\" (UID: \"bcb4d874-10b6-4167-b452-800ed19b3f79\") " pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:26:23.933548 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.932249 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/14b15c97-09e7-4ee9-87b2-c15fc332baf7-iptables-alerter-script\") pod \"iptables-alerter-4p4f7\" (UID: \"14b15c97-09e7-4ee9-87b2-c15fc332baf7\") " pod="openshift-network-operator/iptables-alerter-4p4f7" Apr 17 17:26:23.933548 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.932272 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/355b3a4d-4123-4e80-a76f-e42bcfb92020-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pdpfh\" (UID: \"355b3a4d-4123-4e80-a76f-e42bcfb92020\") " pod="openshift-multus/multus-additional-cni-plugins-pdpfh" Apr 17 17:26:23.933548 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.932320 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-host-run-ovn-kubernetes\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.933548 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.932333 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/49a176d5-a780-4a38-b16f-90dc62742d5d-env-overrides\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.933548 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:23.932388 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:26:23.933548 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:23.932737 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs podName:bcb4d874-10b6-4167-b452-800ed19b3f79 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:24.432698185 +0000 UTC m=+2.154641166 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs") pod "network-metrics-daemon-p9f9z" (UID: "bcb4d874-10b6-4167-b452-800ed19b3f79") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:26:23.933548 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.932768 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-var-lib-kubelet\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.933548 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.932800 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqwfv\" (UniqueName: \"kubernetes.io/projected/bcb4d874-10b6-4167-b452-800ed19b3f79-kube-api-access-gqwfv\") pod \"network-metrics-daemon-p9f9z\" (UID: \"bcb4d874-10b6-4167-b452-800ed19b3f79\") " pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:26:23.933548 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.932827 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-host-run-netns\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.933548 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.932853 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/49a176d5-a780-4a38-b16f-90dc62742d5d-ovn-node-metrics-cert\") pod \"ovnkube-node-8rjgx\" (UID: 
\"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.933548 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.932872 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-host-run-netns\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.933548 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.932871 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/14b15c97-09e7-4ee9-87b2-c15fc332baf7-iptables-alerter-script\") pod \"iptables-alerter-4p4f7\" (UID: \"14b15c97-09e7-4ee9-87b2-c15fc332baf7\") " pod="openshift-network-operator/iptables-alerter-4p4f7" Apr 17 17:26:23.933548 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.932872 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-var-lib-kubelet\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.933548 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.932884 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-sys\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.933548 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.932934 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7f3920b2-d64c-4102-8d86-665e6ea60718-sys\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.933548 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.932951 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/355b3a4d-4123-4e80-a76f-e42bcfb92020-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-pdpfh\" (UID: \"355b3a4d-4123-4e80-a76f-e42bcfb92020\") " pod="openshift-multus/multus-additional-cni-plugins-pdpfh" Apr 17 17:26:23.933986 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.932982 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-etc-openvswitch\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.933986 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.933018 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/355b3a4d-4123-4e80-a76f-e42bcfb92020-cni-binary-copy\") pod \"multus-additional-cni-plugins-pdpfh\" (UID: \"355b3a4d-4123-4e80-a76f-e42bcfb92020\") " pod="openshift-multus/multus-additional-cni-plugins-pdpfh" Apr 17 17:26:23.933986 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.933045 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-etc-openvswitch\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.933986 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.933050 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-node-log\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.933986 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.933086 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/49a176d5-a780-4a38-b16f-90dc62742d5d-node-log\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.933986 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.933513 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/355b3a4d-4123-4e80-a76f-e42bcfb92020-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-pdpfh\" (UID: \"355b3a4d-4123-4e80-a76f-e42bcfb92020\") " pod="openshift-multus/multus-additional-cni-plugins-pdpfh" Apr 17 17:26:23.933986 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.933618 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7f3920b2-d64c-4102-8d86-665e6ea60718-tmp\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.933986 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.933819 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/51db2efc-3b63-4a21-bb04-99caab75c450-agent-certs\") pod \"konnectivity-agent-sn554\" (UID: \"51db2efc-3b63-4a21-bb04-99caab75c450\") " pod="kube-system/konnectivity-agent-sn554" Apr 17 17:26:23.934214 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.934114 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/355b3a4d-4123-4e80-a76f-e42bcfb92020-cni-binary-copy\") pod \"multus-additional-cni-plugins-pdpfh\" (UID: \"355b3a4d-4123-4e80-a76f-e42bcfb92020\") " pod="openshift-multus/multus-additional-cni-plugins-pdpfh" Apr 17 17:26:23.934214 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.934172 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7f3920b2-d64c-4102-8d86-665e6ea60718-etc-tuned\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.934809 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.934787 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/49a176d5-a780-4a38-b16f-90dc62742d5d-ovn-node-metrics-cert\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.940555 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:23.940522 2577 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:26:23.940555 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:23.940542 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:26:23.940555 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:23.940555 2577 projected.go:194] Error preparing data for projected volume kube-api-access-vs4pm for pod openshift-network-diagnostics/network-check-target-d4gnk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:23.940736 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:23.940628 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5fc0e79-0e28-4fcc-891b-18afdb313f11-kube-api-access-vs4pm podName:f5fc0e79-0e28-4fcc-891b-18afdb313f11 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:24.440609529 +0000 UTC m=+2.162552504 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-vs4pm" (UniqueName: "kubernetes.io/projected/f5fc0e79-0e28-4fcc-891b-18afdb313f11-kube-api-access-vs4pm") pod "network-check-target-d4gnk" (UID: "f5fc0e79-0e28-4fcc-891b-18afdb313f11") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:23.942654 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.942629 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t7cb\" (UniqueName: \"kubernetes.io/projected/14b15c97-09e7-4ee9-87b2-c15fc332baf7-kube-api-access-6t7cb\") pod \"iptables-alerter-4p4f7\" (UID: \"14b15c97-09e7-4ee9-87b2-c15fc332baf7\") " pod="openshift-network-operator/iptables-alerter-4p4f7" Apr 17 17:26:23.942790 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.942770 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqwfv\" (UniqueName: \"kubernetes.io/projected/bcb4d874-10b6-4167-b452-800ed19b3f79-kube-api-access-gqwfv\") pod \"network-metrics-daemon-p9f9z\" (UID: \"bcb4d874-10b6-4167-b452-800ed19b3f79\") " pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:26:23.942964 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.942948 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-29wzd\" (UniqueName: \"kubernetes.io/projected/49a176d5-a780-4a38-b16f-90dc62742d5d-kube-api-access-29wzd\") pod \"ovnkube-node-8rjgx\" (UID: \"49a176d5-a780-4a38-b16f-90dc62742d5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:23.942999 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.942939 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-92tt8\" (UniqueName: \"kubernetes.io/projected/7f3920b2-d64c-4102-8d86-665e6ea60718-kube-api-access-92tt8\") pod \"tuned-k2ngp\" (UID: \"7f3920b2-d64c-4102-8d86-665e6ea60718\") " pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:23.943129 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:23.943113 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn7s8\" (UniqueName: \"kubernetes.io/projected/355b3a4d-4123-4e80-a76f-e42bcfb92020-kube-api-access-dn7s8\") pod 
\"multus-additional-cni-plugins-pdpfh\" (UID: \"355b3a4d-4123-4e80-a76f-e42bcfb92020\") " pod="openshift-multus/multus-additional-cni-plugins-pdpfh" Apr 17 17:26:24.037706 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.037614 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sjxlk" Apr 17 17:26:24.044204 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:24.044173 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf0d80e7_5925_429c_8bd3_f0235981720a.slice/crio-1af2d42617e7cc5d7920d94b26addd932e660fbd8dbfea30f6d6d569d578bb80 WatchSource:0}: Error finding container 1af2d42617e7cc5d7920d94b26addd932e660fbd8dbfea30f6d6d569d578bb80: Status 404 returned error can't find the container with id 1af2d42617e7cc5d7920d94b26addd932e660fbd8dbfea30f6d6d569d578bb80 Apr 17 17:26:24.068615 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.068581 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bjxzb" Apr 17 17:26:24.074711 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:24.074683 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca904b14_b665_4107_bf21_c1783df952e4.slice/crio-f3e69bceb77f47e29ef6349c2bc863ed134c6005cb4ed9a15fd9ab58ebdedf77 WatchSource:0}: Error finding container f3e69bceb77f47e29ef6349c2bc863ed134c6005cb4ed9a15fd9ab58ebdedf77: Status 404 returned error can't find the container with id f3e69bceb77f47e29ef6349c2bc863ed134c6005cb4ed9a15fd9ab58ebdedf77 Apr 17 17:26:24.084300 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.084277 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gb2kr" Apr 17 17:26:24.088131 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.088109 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-6mktk"] Apr 17 17:26:24.089914 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:24.089889 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod255274fc_6f71_45da_a2f9_c715044eee61.slice/crio-e463013021f03f2d29909df112c6d2b96040062978557957399410326da76e11 WatchSource:0}: Error finding container e463013021f03f2d29909df112c6d2b96040062978557957399410326da76e11: Status 404 returned error can't find the container with id e463013021f03f2d29909df112c6d2b96040062978557957399410326da76e11 Apr 17 17:26:24.092576 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.092555 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:24.092680 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:24.092640 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6mktk" podUID="74142d91-eb23-411d-8c68-16c329d30680" Apr 17 17:26:24.097875 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.097859 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-rks6d" Apr 17 17:26:24.103954 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:24.103932 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc447c5b4_4c37_4d50_8c77_2633c36d977d.slice/crio-4bb985845f1466be89a5d882bd80af7b4fedf157780e5bed9aea1be380745f35 WatchSource:0}: Error finding container 4bb985845f1466be89a5d882bd80af7b4fedf157780e5bed9aea1be380745f35: Status 404 returned error can't find the container with id 4bb985845f1466be89a5d882bd80af7b4fedf157780e5bed9aea1be380745f35 Apr 17 17:26:24.112554 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.112534 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-sn554" Apr 17 17:26:24.118867 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:24.118828 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51db2efc_3b63_4a21_bb04_99caab75c450.slice/crio-f06071de53ff6cd46dfe3fac787bc0a796d00dda02a2c46d568f7b2b284bb3f5 WatchSource:0}: Error finding container f06071de53ff6cd46dfe3fac787bc0a796d00dda02a2c46d568f7b2b284bb3f5: Status 404 returned error can't find the container with id f06071de53ff6cd46dfe3fac787bc0a796d00dda02a2c46d568f7b2b284bb3f5 Apr 17 17:26:24.133450 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.133420 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" Apr 17 17:26:24.134533 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.134449 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/74142d91-eb23-411d-8c68-16c329d30680-kubelet-config\") pod \"global-pull-secret-syncer-6mktk\" (UID: \"74142d91-eb23-411d-8c68-16c329d30680\") " pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:24.134533 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.134492 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/74142d91-eb23-411d-8c68-16c329d30680-dbus\") pod \"global-pull-secret-syncer-6mktk\" (UID: \"74142d91-eb23-411d-8c68-16c329d30680\") " pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:24.134600 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.134528 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/74142d91-eb23-411d-8c68-16c329d30680-original-pull-secret\") pod \"global-pull-secret-syncer-6mktk\" (UID: \"74142d91-eb23-411d-8c68-16c329d30680\") " pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:24.140461 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:24.140391 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f3920b2_d64c_4102_8d86_665e6ea60718.slice/crio-d8fdcda375c766a8c0c6b49cae987dd972b110aa6599bad10a46ae0a500f1a6f WatchSource:0}: Error finding container d8fdcda375c766a8c0c6b49cae987dd972b110aa6599bad10a46ae0a500f1a6f: Status 404 returned error can't find the container with id d8fdcda375c766a8c0c6b49cae987dd972b110aa6599bad10a46ae0a500f1a6f Apr 17 17:26:24.154147 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.154127 2577 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pdpfh" Apr 17 17:26:24.160505 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:24.160482 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod355b3a4d_4123_4e80_a76f_e42bcfb92020.slice/crio-dc00dee704a32a1ba341ec1a297e24bd45ea0f943d1e187e549376bc8a071033 WatchSource:0}: Error finding container dc00dee704a32a1ba341ec1a297e24bd45ea0f943d1e187e549376bc8a071033: Status 404 returned error can't find the container with id dc00dee704a32a1ba341ec1a297e24bd45ea0f943d1e187e549376bc8a071033 Apr 17 17:26:24.160597 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.160520 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4p4f7" Apr 17 17:26:24.165960 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.165940 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:24.166148 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:24.166125 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14b15c97_09e7_4ee9_87b2_c15fc332baf7.slice/crio-9b015a0aac2fe29feb82b2d63514ec30a6c385ab033ee808332fcd7efa1f1328 WatchSource:0}: Error finding container 9b015a0aac2fe29feb82b2d63514ec30a6c385ab033ee808332fcd7efa1f1328: Status 404 returned error can't find the container with id 9b015a0aac2fe29feb82b2d63514ec30a6c385ab033ee808332fcd7efa1f1328 Apr 17 17:26:24.171306 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:24.171281 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49a176d5_a780_4a38_b16f_90dc62742d5d.slice/crio-abca2042cf885f1bb13aa508cf67f763614ca30f027b1c26e47feaf7491d5aeb WatchSource:0}: Error finding container abca2042cf885f1bb13aa508cf67f763614ca30f027b1c26e47feaf7491d5aeb: Status 404 returned error can't find the container with id abca2042cf885f1bb13aa508cf67f763614ca30f027b1c26e47feaf7491d5aeb Apr 17 17:26:24.235709 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.235675 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/74142d91-eb23-411d-8c68-16c329d30680-kubelet-config\") pod \"global-pull-secret-syncer-6mktk\" (UID: \"74142d91-eb23-411d-8c68-16c329d30680\") " pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:24.235709 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.235712 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/74142d91-eb23-411d-8c68-16c329d30680-dbus\") pod \"global-pull-secret-syncer-6mktk\" (UID: \"74142d91-eb23-411d-8c68-16c329d30680\") " pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:24.235925 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.235732 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/74142d91-eb23-411d-8c68-16c329d30680-original-pull-secret\") pod \"global-pull-secret-syncer-6mktk\" (UID: \"74142d91-eb23-411d-8c68-16c329d30680\") " pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:24.235925 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.235831 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/74142d91-eb23-411d-8c68-16c329d30680-kubelet-config\") pod \"global-pull-secret-syncer-6mktk\" (UID: \"74142d91-eb23-411d-8c68-16c329d30680\") " pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:24.235925 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:24.235863 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:26:24.235925 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.235874 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/74142d91-eb23-411d-8c68-16c329d30680-dbus\") pod \"global-pull-secret-syncer-6mktk\" (UID: \"74142d91-eb23-411d-8c68-16c329d30680\") " pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:24.236063 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:24.235948 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74142d91-eb23-411d-8c68-16c329d30680-original-pull-secret podName:74142d91-eb23-411d-8c68-16c329d30680 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:24.735924241 +0000 UTC m=+2.457867237 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/74142d91-eb23-411d-8c68-16c329d30680-original-pull-secret") pod "global-pull-secret-syncer-6mktk" (UID: "74142d91-eb23-411d-8c68-16c329d30680") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:26:24.245163 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.245142 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:26:24.437808 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.437725 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs\") pod \"network-metrics-daemon-p9f9z\" (UID: \"bcb4d874-10b6-4167-b452-800ed19b3f79\") " pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:26:24.437981 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:24.437917 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:26:24.438041 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:24.438017 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs podName:bcb4d874-10b6-4167-b452-800ed19b3f79 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:25.437968306 +0000 UTC m=+3.159911301 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs") pod "network-metrics-daemon-p9f9z" (UID: "bcb4d874-10b6-4167-b452-800ed19b3f79") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:26:24.543054 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.538243 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vs4pm\" (UniqueName: \"kubernetes.io/projected/f5fc0e79-0e28-4fcc-891b-18afdb313f11-kube-api-access-vs4pm\") pod \"network-check-target-d4gnk\" (UID: \"f5fc0e79-0e28-4fcc-891b-18afdb313f11\") " pod="openshift-network-diagnostics/network-check-target-d4gnk" Apr 17 17:26:24.543054 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:24.538464 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:26:24.543054 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:24.538501 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:26:24.543054 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:24.538514 2577 projected.go:194] Error preparing data for projected volume kube-api-access-vs4pm for pod openshift-network-diagnostics/network-check-target-d4gnk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:24.543054 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:24.538581 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5fc0e79-0e28-4fcc-891b-18afdb313f11-kube-api-access-vs4pm podName:f5fc0e79-0e28-4fcc-891b-18afdb313f11 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:25.538552995 +0000 UTC m=+3.260495975 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-vs4pm" (UniqueName: "kubernetes.io/projected/f5fc0e79-0e28-4fcc-891b-18afdb313f11-kube-api-access-vs4pm") pod "network-check-target-d4gnk" (UID: "f5fc0e79-0e28-4fcc-891b-18afdb313f11") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:24.742533 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.740183 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/74142d91-eb23-411d-8c68-16c329d30680-original-pull-secret\") pod \"global-pull-secret-syncer-6mktk\" (UID: \"74142d91-eb23-411d-8c68-16c329d30680\") " pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:24.742533 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:24.740410 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:26:24.742533 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:24.740520 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74142d91-eb23-411d-8c68-16c329d30680-original-pull-secret podName:74142d91-eb23-411d-8c68-16c329d30680 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:25.740501263 +0000 UTC m=+3.462444240 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/74142d91-eb23-411d-8c68-16c329d30680-original-pull-secret") pod "global-pull-secret-syncer-6mktk" (UID: "74142d91-eb23-411d-8c68-16c329d30680") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:26:24.770293 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.770244 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:21:23 +0000 UTC" deadline="2027-10-20 20:54:55.70695691 +0000 UTC" Apr 17 17:26:24.770293 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.770289 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13227h28m30.936671789s" Apr 17 17:26:24.872329 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.872267 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gb2kr" event={"ID":"255274fc-6f71-45da-a2f9-c715044eee61","Type":"ContainerStarted","Data":"e463013021f03f2d29909df112c6d2b96040062978557957399410326da76e11"} Apr 17 17:26:24.891754 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.891707 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal" event={"ID":"6318a6bfadd7dc1dc0c05b611179f194","Type":"ContainerStarted","Data":"1e0415c7ab4a28a5b31126d623ea7491557d0ce87592d384ce794adf75856e5a"} Apr 17 17:26:24.917985 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.917921 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" event={"ID":"49a176d5-a780-4a38-b16f-90dc62742d5d","Type":"ContainerStarted","Data":"abca2042cf885f1bb13aa508cf67f763614ca30f027b1c26e47feaf7491d5aeb"} Apr 17 17:26:24.943927 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.943887 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pdpfh" event={"ID":"355b3a4d-4123-4e80-a76f-e42bcfb92020","Type":"ContainerStarted","Data":"dc00dee704a32a1ba341ec1a297e24bd45ea0f943d1e187e549376bc8a071033"} Apr 17 17:26:24.958163 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.958046 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bjxzb" event={"ID":"ca904b14-b665-4107-bf21-c1783df952e4","Type":"ContainerStarted","Data":"f3e69bceb77f47e29ef6349c2bc863ed134c6005cb4ed9a15fd9ab58ebdedf77"} Apr 17 17:26:24.973505 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.973449 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sjxlk" event={"ID":"af0d80e7-5925-429c-8bd3-f0235981720a","Type":"ContainerStarted","Data":"1af2d42617e7cc5d7920d94b26addd932e660fbd8dbfea30f6d6d569d578bb80"} Apr 17 17:26:24.978224 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.978190 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-96.ec2.internal" event={"ID":"581fa9c88cb33b3a66c7bdd6f4dd1862","Type":"ContainerStarted","Data":"86979263ca58b93a88db714c52ebd85bb2793d6813211e07eb5da8e4a665c792"} Apr 17 17:26:24.987797 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:24.987754 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4p4f7" 
event={"ID":"14b15c97-09e7-4ee9-87b2-c15fc332baf7","Type":"ContainerStarted","Data":"9b015a0aac2fe29feb82b2d63514ec30a6c385ab033ee808332fcd7efa1f1328"} Apr 17 17:26:25.006943 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:25.006855 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" event={"ID":"7f3920b2-d64c-4102-8d86-665e6ea60718","Type":"ContainerStarted","Data":"d8fdcda375c766a8c0c6b49cae987dd972b110aa6599bad10a46ae0a500f1a6f"} Apr 17 17:26:25.023885 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:25.023788 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-sn554" event={"ID":"51db2efc-3b63-4a21-bb04-99caab75c450","Type":"ContainerStarted","Data":"f06071de53ff6cd46dfe3fac787bc0a796d00dda02a2c46d568f7b2b284bb3f5"} Apr 17 17:26:25.051225 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:25.051183 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rks6d" event={"ID":"c447c5b4-4c37-4d50-8c77-2633c36d977d","Type":"ContainerStarted","Data":"4bb985845f1466be89a5d882bd80af7b4fedf157780e5bed9aea1be380745f35"} Apr 17 17:26:25.058980 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:25.058947 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:26:25.135049 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:25.135019 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:26:25.445737 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:25.445702 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs\") pod \"network-metrics-daemon-p9f9z\" (UID: \"bcb4d874-10b6-4167-b452-800ed19b3f79\") " pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:26:25.445934 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:25.445837 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:26:25.445934 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:25.445899 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs podName:bcb4d874-10b6-4167-b452-800ed19b3f79 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:27.445882624 +0000 UTC m=+5.167825599 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs") pod "network-metrics-daemon-p9f9z" (UID: "bcb4d874-10b6-4167-b452-800ed19b3f79") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:26:25.546816 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:25.546770 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vs4pm\" (UniqueName: \"kubernetes.io/projected/f5fc0e79-0e28-4fcc-891b-18afdb313f11-kube-api-access-vs4pm\") pod \"network-check-target-d4gnk\" (UID: \"f5fc0e79-0e28-4fcc-891b-18afdb313f11\") " pod="openshift-network-diagnostics/network-check-target-d4gnk" Apr 17 17:26:25.547006 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:25.546980 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:26:25.547006 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:25.546998 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:26:25.547113 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:25.547011 2577 projected.go:194] Error preparing data for projected volume kube-api-access-vs4pm for pod openshift-network-diagnostics/network-check-target-d4gnk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:25.547113 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:25.547070 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5fc0e79-0e28-4fcc-891b-18afdb313f11-kube-api-access-vs4pm podName:f5fc0e79-0e28-4fcc-891b-18afdb313f11 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:27.547050425 +0000 UTC m=+5.268993419 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-vs4pm" (UniqueName: "kubernetes.io/projected/f5fc0e79-0e28-4fcc-891b-18afdb313f11-kube-api-access-vs4pm") pod "network-check-target-d4gnk" (UID: "f5fc0e79-0e28-4fcc-891b-18afdb313f11") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:25.749017 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:25.748881 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/74142d91-eb23-411d-8c68-16c329d30680-original-pull-secret\") pod \"global-pull-secret-syncer-6mktk\" (UID: \"74142d91-eb23-411d-8c68-16c329d30680\") " pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:25.749450 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:25.749062 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:26:25.749450 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:25.749156 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74142d91-eb23-411d-8c68-16c329d30680-original-pull-secret podName:74142d91-eb23-411d-8c68-16c329d30680 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:27.749136067 +0000 UTC m=+5.471079057 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/74142d91-eb23-411d-8c68-16c329d30680-original-pull-secret") pod "global-pull-secret-syncer-6mktk" (UID: "74142d91-eb23-411d-8c68-16c329d30680") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:26:25.770914 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:25.770867 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:21:23 +0000 UTC" deadline="2027-12-11 20:41:17.394627381 +0000 UTC" Apr 17 17:26:25.770914 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:25.770912 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14475h14m51.623722665s" Apr 17 17:26:25.850651 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:25.850619 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:25.850651 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:25.850639 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:26:25.850921 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:25.850758 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6mktk" podUID="74142d91-eb23-411d-8c68-16c329d30680" Apr 17 17:26:25.850921 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:25.850827 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p9f9z" podUID="bcb4d874-10b6-4167-b452-800ed19b3f79" Apr 17 17:26:25.850921 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:25.850891 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4gnk" Apr 17 17:26:25.851091 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:25.850958 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-d4gnk" podUID="f5fc0e79-0e28-4fcc-891b-18afdb313f11" Apr 17 17:26:27.464157 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:27.463532 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs\") pod \"network-metrics-daemon-p9f9z\" (UID: \"bcb4d874-10b6-4167-b452-800ed19b3f79\") " pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:26:27.464157 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:27.463718 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:26:27.464157 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:27.463786 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs podName:bcb4d874-10b6-4167-b452-800ed19b3f79 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:31.4637676 +0000 UTC m=+9.185710583 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs") pod "network-metrics-daemon-p9f9z" (UID: "bcb4d874-10b6-4167-b452-800ed19b3f79") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:26:27.564312 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:27.564237 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vs4pm\" (UniqueName: \"kubernetes.io/projected/f5fc0e79-0e28-4fcc-891b-18afdb313f11-kube-api-access-vs4pm\") pod \"network-check-target-d4gnk\" (UID: \"f5fc0e79-0e28-4fcc-891b-18afdb313f11\") " pod="openshift-network-diagnostics/network-check-target-d4gnk" Apr 17 17:26:27.564563 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:27.564418 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:26:27.564563 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:27.564440 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:26:27.564563 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:27.564453 2577 projected.go:194] Error preparing data for projected volume kube-api-access-vs4pm for pod openshift-network-diagnostics/network-check-target-d4gnk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:27.564563 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:27.564534 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5fc0e79-0e28-4fcc-891b-18afdb313f11-kube-api-access-vs4pm podName:f5fc0e79-0e28-4fcc-891b-18afdb313f11 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:31.564513853 +0000 UTC m=+9.286456830 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-vs4pm" (UniqueName: "kubernetes.io/projected/f5fc0e79-0e28-4fcc-891b-18afdb313f11-kube-api-access-vs4pm") pod "network-check-target-d4gnk" (UID: "f5fc0e79-0e28-4fcc-891b-18afdb313f11") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:27.765616 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:27.765528 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/74142d91-eb23-411d-8c68-16c329d30680-original-pull-secret\") pod \"global-pull-secret-syncer-6mktk\" (UID: \"74142d91-eb23-411d-8c68-16c329d30680\") " pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:27.765772 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:27.765671 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:26:27.765772 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:27.765727 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74142d91-eb23-411d-8c68-16c329d30680-original-pull-secret podName:74142d91-eb23-411d-8c68-16c329d30680 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:31.765707791 +0000 UTC m=+9.487650768 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/74142d91-eb23-411d-8c68-16c329d30680-original-pull-secret") pod "global-pull-secret-syncer-6mktk" (UID: "74142d91-eb23-411d-8c68-16c329d30680") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:26:27.850608 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:27.850570 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:27.850790 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:27.850753 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6mktk" podUID="74142d91-eb23-411d-8c68-16c329d30680" Apr 17 17:26:27.851097 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:27.851073 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:26:27.851202 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:27.851153 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p9f9z" podUID="bcb4d874-10b6-4167-b452-800ed19b3f79" Apr 17 17:26:27.851202 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:27.851190 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4gnk" Apr 17 17:26:27.851292 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:27.851237 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4gnk" podUID="f5fc0e79-0e28-4fcc-891b-18afdb313f11" Apr 17 17:26:29.851002 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:29.850960 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:29.851531 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:29.851009 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:26:29.851531 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:29.850970 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4gnk" Apr 17 17:26:29.851531 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:29.851092 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6mktk" podUID="74142d91-eb23-411d-8c68-16c329d30680" Apr 17 17:26:29.851531 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:29.851238 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4gnk" podUID="f5fc0e79-0e28-4fcc-891b-18afdb313f11" Apr 17 17:26:29.851531 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:29.851332 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p9f9z" podUID="bcb4d874-10b6-4167-b452-800ed19b3f79" Apr 17 17:26:31.497017 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:31.496868 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs\") pod \"network-metrics-daemon-p9f9z\" (UID: \"bcb4d874-10b6-4167-b452-800ed19b3f79\") " pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:26:31.497438 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:31.497028 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:26:31.497438 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:31.497093 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs podName:bcb4d874-10b6-4167-b452-800ed19b3f79 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:39.497074431 +0000 UTC m=+17.219017411 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs") pod "network-metrics-daemon-p9f9z" (UID: "bcb4d874-10b6-4167-b452-800ed19b3f79") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:26:31.598106 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:31.598043 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vs4pm\" (UniqueName: \"kubernetes.io/projected/f5fc0e79-0e28-4fcc-891b-18afdb313f11-kube-api-access-vs4pm\") pod \"network-check-target-d4gnk\" (UID: \"f5fc0e79-0e28-4fcc-891b-18afdb313f11\") " pod="openshift-network-diagnostics/network-check-target-d4gnk" Apr 17 17:26:31.598254 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:31.598196 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:26:31.598254 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:31.598224 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:26:31.598254 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:31.598238 2577 projected.go:194] Error preparing data for projected volume kube-api-access-vs4pm for pod openshift-network-diagnostics/network-check-target-d4gnk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:31.598392 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:31.598305 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5fc0e79-0e28-4fcc-891b-18afdb313f11-kube-api-access-vs4pm podName:f5fc0e79-0e28-4fcc-891b-18afdb313f11 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:39.598284032 +0000 UTC m=+17.320227011 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-vs4pm" (UniqueName: "kubernetes.io/projected/f5fc0e79-0e28-4fcc-891b-18afdb313f11-kube-api-access-vs4pm") pod "network-check-target-d4gnk" (UID: "f5fc0e79-0e28-4fcc-891b-18afdb313f11") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:31.800424 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:31.800329 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/74142d91-eb23-411d-8c68-16c329d30680-original-pull-secret\") pod \"global-pull-secret-syncer-6mktk\" (UID: \"74142d91-eb23-411d-8c68-16c329d30680\") " pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:31.800609 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:31.800541 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:26:31.800609 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:31.800606 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74142d91-eb23-411d-8c68-16c329d30680-original-pull-secret podName:74142d91-eb23-411d-8c68-16c329d30680 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:39.800588179 +0000 UTC m=+17.522531161 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/74142d91-eb23-411d-8c68-16c329d30680-original-pull-secret") pod "global-pull-secret-syncer-6mktk" (UID: "74142d91-eb23-411d-8c68-16c329d30680") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:26:31.850621 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:31.850540 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:31.850621 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:31.850582 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:26:31.850621 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:31.850582 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4gnk" Apr 17 17:26:31.850932 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:31.850673 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6mktk" podUID="74142d91-eb23-411d-8c68-16c329d30680" Apr 17 17:26:31.850932 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:31.850766 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p9f9z" podUID="bcb4d874-10b6-4167-b452-800ed19b3f79" Apr 17 17:26:31.850932 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:31.850876 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4gnk" podUID="f5fc0e79-0e28-4fcc-891b-18afdb313f11" Apr 17 17:26:33.850732 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:33.850691 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4gnk" Apr 17 17:26:33.851232 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:33.850692 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:26:33.851232 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:33.850691 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:33.851232 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:33.850979 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p9f9z" podUID="bcb4d874-10b6-4167-b452-800ed19b3f79" Apr 17 17:26:33.851232 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:33.850825 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4gnk" podUID="f5fc0e79-0e28-4fcc-891b-18afdb313f11" Apr 17 17:26:33.851232 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:33.851077 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6mktk" podUID="74142d91-eb23-411d-8c68-16c329d30680" Apr 17 17:26:35.851274 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:35.851231 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:26:35.851274 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:35.851285 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:35.851851 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:35.851396 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p9f9z" podUID="bcb4d874-10b6-4167-b452-800ed19b3f79" Apr 17 17:26:35.851851 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:35.851436 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4gnk" Apr 17 17:26:35.851851 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:35.851540 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6mktk" podUID="74142d91-eb23-411d-8c68-16c329d30680" Apr 17 17:26:35.851851 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:35.851607 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4gnk" podUID="f5fc0e79-0e28-4fcc-891b-18afdb313f11" Apr 17 17:26:37.851059 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:37.851019 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:37.851527 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:37.851072 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4gnk" Apr 17 17:26:37.851527 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:37.851158 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6mktk" podUID="74142d91-eb23-411d-8c68-16c329d30680" Apr 17 17:26:37.851527 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:37.851213 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4gnk" podUID="f5fc0e79-0e28-4fcc-891b-18afdb313f11" Apr 17 17:26:37.851527 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:37.851259 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:26:37.851527 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:37.851399 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p9f9z" podUID="bcb4d874-10b6-4167-b452-800ed19b3f79" Apr 17 17:26:39.558025 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:39.557987 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs\") pod \"network-metrics-daemon-p9f9z\" (UID: \"bcb4d874-10b6-4167-b452-800ed19b3f79\") " pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:26:39.558490 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:39.558110 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:26:39.558490 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:39.558172 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs podName:bcb4d874-10b6-4167-b452-800ed19b3f79 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:55.558152592 +0000 UTC m=+33.280095568 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs") pod "network-metrics-daemon-p9f9z" (UID: "bcb4d874-10b6-4167-b452-800ed19b3f79") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:26:39.659286 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:39.659246 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vs4pm\" (UniqueName: \"kubernetes.io/projected/f5fc0e79-0e28-4fcc-891b-18afdb313f11-kube-api-access-vs4pm\") pod \"network-check-target-d4gnk\" (UID: \"f5fc0e79-0e28-4fcc-891b-18afdb313f11\") " pod="openshift-network-diagnostics/network-check-target-d4gnk" Apr 17 17:26:39.659456 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:39.659420 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:26:39.659456 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:39.659448 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:26:39.659561 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:39.659462 2577 projected.go:194] Error preparing data for projected volume kube-api-access-vs4pm for pod openshift-network-diagnostics/network-check-target-d4gnk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:39.659561 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:39.659540 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5fc0e79-0e28-4fcc-891b-18afdb313f11-kube-api-access-vs4pm podName:f5fc0e79-0e28-4fcc-891b-18afdb313f11 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:55.659515724 +0000 UTC m=+33.381458711 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-vs4pm" (UniqueName: "kubernetes.io/projected/f5fc0e79-0e28-4fcc-891b-18afdb313f11-kube-api-access-vs4pm") pod "network-check-target-d4gnk" (UID: "f5fc0e79-0e28-4fcc-891b-18afdb313f11") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:39.850437 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:39.850343 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4gnk" Apr 17 17:26:39.850618 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:39.850463 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:26:39.850618 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:39.850486 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4gnk" podUID="f5fc0e79-0e28-4fcc-891b-18afdb313f11" Apr 17 17:26:39.850618 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:39.850601 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p9f9z" podUID="bcb4d874-10b6-4167-b452-800ed19b3f79" Apr 17 17:26:39.850789 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:39.850654 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:39.850789 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:39.850738 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6mktk" podUID="74142d91-eb23-411d-8c68-16c329d30680" Apr 17 17:26:39.861053 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:39.861026 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/74142d91-eb23-411d-8c68-16c329d30680-original-pull-secret\") pod \"global-pull-secret-syncer-6mktk\" (UID: \"74142d91-eb23-411d-8c68-16c329d30680\") " pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:39.861226 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:39.861207 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:26:39.861298 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:39.861287 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74142d91-eb23-411d-8c68-16c329d30680-original-pull-secret podName:74142d91-eb23-411d-8c68-16c329d30680 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:55.861267118 +0000 UTC m=+33.583210098 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/74142d91-eb23-411d-8c68-16c329d30680-original-pull-secret") pod "global-pull-secret-syncer-6mktk" (UID: "74142d91-eb23-411d-8c68-16c329d30680") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:26:41.850732 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:41.850709 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4gnk" Apr 17 17:26:41.851163 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:41.850758 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:26:41.851163 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:41.850776 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:41.851163 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:41.850849 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p9f9z" podUID="bcb4d874-10b6-4167-b452-800ed19b3f79" Apr 17 17:26:41.851163 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:41.850933 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4gnk" podUID="f5fc0e79-0e28-4fcc-891b-18afdb313f11" Apr 17 17:26:41.851163 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:41.850992 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-6mktk" podUID="74142d91-eb23-411d-8c68-16c329d30680" Apr 17 17:26:42.086558 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:42.086286 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" event={"ID":"7f3920b2-d64c-4102-8d86-665e6ea60718","Type":"ContainerStarted","Data":"f75e13719b63420e12e230c464a122adea975d65b27042e341d688bc38404751"} Apr 17 17:26:42.088658 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:42.088627 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rks6d" event={"ID":"c447c5b4-4c37-4d50-8c77-2633c36d977d","Type":"ContainerStarted","Data":"292b02349424d692cb966937f3c50b47eb2f25201e3ee6bccc3e7ed306872f14"} Apr 17 17:26:42.091743 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:42.091721 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rjgx_49a176d5-a780-4a38-b16f-90dc62742d5d/ovn-acl-logging/0.log" Apr 17 17:26:42.092092 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:42.092053 2577 generic.go:358] "Generic (PLEG): container finished" podID="49a176d5-a780-4a38-b16f-90dc62742d5d" containerID="77a976c8bb126dfb712a5606166dac28eadaccec2b5e147e2940dedc8b7b4685" exitCode=1 Apr 17 17:26:42.092164 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:42.092149 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" event={"ID":"49a176d5-a780-4a38-b16f-90dc62742d5d","Type":"ContainerStarted","Data":"95e692ba85395bba3f65a296620c4bbfd32b1900a43005d36716ac7e0aa5cdf2"} Apr 17 17:26:42.092213 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:42.092170 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" event={"ID":"49a176d5-a780-4a38-b16f-90dc62742d5d","Type":"ContainerStarted","Data":"e2e71c620ddd6ed7913013e133624c7bbd8ffb09cf921aa83f95dcfe3b0a56c8"} Apr 17 17:26:42.092213 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:42.092184 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" event={"ID":"49a176d5-a780-4a38-b16f-90dc62742d5d","Type":"ContainerDied","Data":"77a976c8bb126dfb712a5606166dac28eadaccec2b5e147e2940dedc8b7b4685"} Apr 17 17:26:42.092213 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:42.092199 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" event={"ID":"49a176d5-a780-4a38-b16f-90dc62742d5d","Type":"ContainerStarted","Data":"f4b2df5a785d9a2f3704b1a673f7f1fe821fd307910a9a790eb579a21733c230"} Apr 17 17:26:42.094077 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:42.094053 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-96.ec2.internal" event={"ID":"581fa9c88cb33b3a66c7bdd6f4dd1862","Type":"ContainerStarted","Data":"a2a3ef32ac99cd641fea2fd1cc00e1bbde0e902d28c7d14e44266bd1c754b5d1"} Apr 17 17:26:42.131634 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:42.131548 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-k2ngp" podStartSLOduration=2.671062078 podStartE2EDuration="20.131526324s" podCreationTimestamp="2026-04-17 17:26:22 +0000 UTC" firstStartedPulling="2026-04-17 17:26:24.141790917 +0000 UTC m=+1.863733893" lastFinishedPulling="2026-04-17 17:26:41.602255163 +0000 UTC m=+19.324198139" observedRunningTime="2026-04-17 17:26:42.111863735 +0000 UTC m=+19.833806761" 
watchObservedRunningTime="2026-04-17 17:26:42.131526324 +0000 UTC m=+19.853469324" Apr 17 17:26:42.158211 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:42.158156 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-96.ec2.internal" podStartSLOduration=19.158139532 podStartE2EDuration="19.158139532s" podCreationTimestamp="2026-04-17 17:26:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:26:42.131422775 +0000 UTC m=+19.853365785" watchObservedRunningTime="2026-04-17 17:26:42.158139532 +0000 UTC m=+19.880082531" Apr 17 17:26:42.158396 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:42.158367 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-rks6d" podStartSLOduration=2.660480443 podStartE2EDuration="20.158360528s" podCreationTimestamp="2026-04-17 17:26:22 +0000 UTC" firstStartedPulling="2026-04-17 17:26:24.105326288 +0000 UTC m=+1.827269263" lastFinishedPulling="2026-04-17 17:26:41.603206369 +0000 UTC m=+19.325149348" observedRunningTime="2026-04-17 17:26:42.157498023 +0000 UTC m=+19.879441020" watchObservedRunningTime="2026-04-17 17:26:42.158360528 +0000 UTC m=+19.880303523" Apr 17 17:26:43.099835 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:43.099643 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rjgx_49a176d5-a780-4a38-b16f-90dc62742d5d/ovn-acl-logging/0.log" Apr 17 17:26:43.100716 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:43.100687 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" event={"ID":"49a176d5-a780-4a38-b16f-90dc62742d5d","Type":"ContainerStarted","Data":"184d871ae960aaa7ff664ca8b761489dcfd41828a67b6321455d71f725e2d5ee"} Apr 17 17:26:43.100817 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:43.100728 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" event={"ID":"49a176d5-a780-4a38-b16f-90dc62742d5d","Type":"ContainerStarted","Data":"41fb4f25ca5780ba2cbaee947921fed418690a70d33e4da8e799c2432ada3549"} Apr 17 17:26:43.102058 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:43.102028 2577 generic.go:358] "Generic (PLEG): container finished" podID="355b3a4d-4123-4e80-a76f-e42bcfb92020" containerID="614bd4e4195ee5097765bb263076cab553e3e4439e965f71ace4ea5042b0eb1a" exitCode=0 Apr 17 17:26:43.102158 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:43.102115 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pdpfh" event={"ID":"355b3a4d-4123-4e80-a76f-e42bcfb92020","Type":"ContainerDied","Data":"614bd4e4195ee5097765bb263076cab553e3e4439e965f71ace4ea5042b0eb1a"} Apr 17 17:26:43.103588 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:43.103551 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bjxzb" event={"ID":"ca904b14-b665-4107-bf21-c1783df952e4","Type":"ContainerStarted","Data":"1d068af7d335a9de485259f3a492e48255e8274518184e519896fdff9a228664"} Apr 17 17:26:43.104939 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:43.104903 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sjxlk" event={"ID":"af0d80e7-5925-429c-8bd3-f0235981720a","Type":"ContainerStarted","Data":"610b1d42e7a826e60c451572a13f4b3a819d2451aee0964a4aec51ba054c393d"} Apr 17 
17:26:43.106268 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:43.106246 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4p4f7" event={"ID":"14b15c97-09e7-4ee9-87b2-c15fc332baf7","Type":"ContainerStarted","Data":"cad55c27d423993b5d7cbe9f60bc381616203ded6f4f80935484a3c677912b36"} Apr 17 17:26:43.107588 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:43.107558 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-sn554" event={"ID":"51db2efc-3b63-4a21-bb04-99caab75c450","Type":"ContainerStarted","Data":"62f917634dc228c2ff4462da41128ce007afe4c7854b5a15b8281a07bef408a1"} Apr 17 17:26:43.109058 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:43.109034 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gb2kr" event={"ID":"255274fc-6f71-45da-a2f9-c715044eee61","Type":"ContainerStarted","Data":"a0f8979d169b7cfb14ce0054ef581c626f71716256099e08eba09d0c2ecafd91"} Apr 17 17:26:43.110527 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:43.110502 2577 generic.go:358] "Generic (PLEG): container finished" podID="6318a6bfadd7dc1dc0c05b611179f194" containerID="eb844e43f851ea28d9fae4a2aa10bc0e8a86f08fb29aa99776c7c067555c95f2" exitCode=0 Apr 17 17:26:43.110623 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:43.110607 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal" event={"ID":"6318a6bfadd7dc1dc0c05b611179f194","Type":"ContainerDied","Data":"eb844e43f851ea28d9fae4a2aa10bc0e8a86f08fb29aa99776c7c067555c95f2"} Apr 17 17:26:43.177444 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:43.177387 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-sn554" podStartSLOduration=3.73096907 podStartE2EDuration="21.177369681s" podCreationTimestamp="2026-04-17 17:26:22 +0000 UTC" firstStartedPulling="2026-04-17 17:26:24.120249216 +0000 UTC m=+1.842192192" lastFinishedPulling="2026-04-17 17:26:41.56664982 +0000 UTC m=+19.288592803" observedRunningTime="2026-04-17 17:26:43.159847682 +0000 UTC m=+20.881790681" watchObservedRunningTime="2026-04-17 17:26:43.177369681 +0000 UTC m=+20.899312680" Apr 17 17:26:43.177739 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:43.177703 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-4p4f7" podStartSLOduration=3.761392665 podStartE2EDuration="21.177692268s" podCreationTimestamp="2026-04-17 17:26:22 +0000 UTC" firstStartedPulling="2026-04-17 17:26:24.168483314 +0000 UTC m=+1.890426289" lastFinishedPulling="2026-04-17 17:26:41.584782903 +0000 UTC m=+19.306725892" observedRunningTime="2026-04-17 17:26:43.176813406 +0000 UTC m=+20.898756405" watchObservedRunningTime="2026-04-17 17:26:43.177692268 +0000 UTC m=+20.899635264" Apr 17 17:26:43.209390 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:43.209330 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-bjxzb" podStartSLOduration=3.6838515640000002 podStartE2EDuration="21.209313093s" podCreationTimestamp="2026-04-17 17:26:22 +0000 UTC" firstStartedPulling="2026-04-17 17:26:24.076670769 +0000 UTC m=+1.798613744" lastFinishedPulling="2026-04-17 17:26:41.602132298 +0000 UTC m=+19.324075273" observedRunningTime="2026-04-17 17:26:43.208851521 +0000 UTC m=+20.930794517" watchObservedRunningTime="2026-04-17 17:26:43.209313093 +0000 UTC 
m=+20.931256089" Apr 17 17:26:43.216169 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:43.216062 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 17:26:43.225863 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:43.225816 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-gb2kr" podStartSLOduration=3.767274745 podStartE2EDuration="21.225801823s" podCreationTimestamp="2026-04-17 17:26:22 +0000 UTC" firstStartedPulling="2026-04-17 17:26:24.091430345 +0000 UTC m=+1.813373324" lastFinishedPulling="2026-04-17 17:26:41.549957411 +0000 UTC m=+19.271900402" observedRunningTime="2026-04-17 17:26:43.225634874 +0000 UTC m=+20.947577870" watchObservedRunningTime="2026-04-17 17:26:43.225801823 +0000 UTC m=+20.947744818" Apr 17 17:26:43.795850 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:43.795732 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T17:26:43.216081031Z","UUID":"00b129a8-00a6-4ff9-a2e9-d7cbde243fca","Handler":null,"Name":"","Endpoint":""} Apr 17 17:26:43.798563 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:43.798535 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 17:26:43.798563 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:43.798571 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 17:26:43.850751 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:43.850712 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:26:43.850974 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:43.850712 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:43.850974 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:43.850902 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p9f9z" podUID="bcb4d874-10b6-4167-b452-800ed19b3f79" Apr 17 17:26:43.850974 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:43.850737 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4gnk" Apr 17 17:26:43.851132 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:43.850983 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-6mktk" podUID="74142d91-eb23-411d-8c68-16c329d30680" Apr 17 17:26:43.851132 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:43.851046 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4gnk" podUID="f5fc0e79-0e28-4fcc-891b-18afdb313f11" Apr 17 17:26:44.114990 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:44.114946 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sjxlk" event={"ID":"af0d80e7-5925-429c-8bd3-f0235981720a","Type":"ContainerStarted","Data":"d178d04695b0e2bcd3c48fc6fb945825f7a6d2508d084a16667c643d9367b268"} Apr 17 17:26:44.114990 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:44.114996 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sjxlk" event={"ID":"af0d80e7-5925-429c-8bd3-f0235981720a","Type":"ContainerStarted","Data":"29e322a2d132588cfaafa38c30c8ce36bf35aea0bc3e79a303449934875d9b3e"} Apr 17 17:26:44.116873 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:44.116820 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal" event={"ID":"6318a6bfadd7dc1dc0c05b611179f194","Type":"ContainerStarted","Data":"de2c9bd768ce34266d32880b025abe19bf964e39a18811b3c8606d254851a8c7"} Apr 17 17:26:44.131988 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:44.131929 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sjxlk" podStartSLOduration=2.207562566 podStartE2EDuration="22.131911654s" podCreationTimestamp="2026-04-17 17:26:22 +0000 UTC" firstStartedPulling="2026-04-17 17:26:24.045562294 +0000 UTC m=+1.767505270" lastFinishedPulling="2026-04-17 17:26:43.96991137 +0000 UTC m=+21.691854358" observedRunningTime="2026-04-17 17:26:44.131174664 +0000 UTC m=+21.853117665" watchObservedRunningTime="2026-04-17 17:26:44.131911654 +0000 UTC m=+21.853854718" Apr 17 17:26:44.141934 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:44.141894 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-sn554" Apr 17 17:26:44.145738 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:44.145698 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-96.ec2.internal" podStartSLOduration=21.14568282 podStartE2EDuration="21.14568282s" podCreationTimestamp="2026-04-17 17:26:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:26:44.145177182 +0000 UTC m=+21.867120179" watchObservedRunningTime="2026-04-17 17:26:44.14568282 +0000 UTC m=+21.867625816" Apr 17 17:26:45.121214 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:45.121186 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rjgx_49a176d5-a780-4a38-b16f-90dc62742d5d/ovn-acl-logging/0.log" Apr 17 17:26:45.121678 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:45.121464 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" event={"ID":"49a176d5-a780-4a38-b16f-90dc62742d5d","Type":"ContainerStarted","Data":"f1359bf5dc1faf49c93714096461d4e60b7de4d84a26ce759bb4af512e2174fb"} Apr 17 17:26:45.851434 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:45.851399 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4gnk" Apr 17 17:26:45.851651 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:45.851445 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:45.851651 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:45.851508 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:26:45.851651 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:45.851631 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6mktk" podUID="74142d91-eb23-411d-8c68-16c329d30680" Apr 17 17:26:45.851817 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:45.851697 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4gnk" podUID="f5fc0e79-0e28-4fcc-891b-18afdb313f11" Apr 17 17:26:45.851817 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:45.851774 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p9f9z" podUID="bcb4d874-10b6-4167-b452-800ed19b3f79" Apr 17 17:26:46.022753 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:46.022690 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-sn554" Apr 17 17:26:46.023328 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:46.023301 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-sn554" Apr 17 17:26:46.124031 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:46.123945 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-sn554" Apr 17 17:26:47.128262 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:47.128233 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rjgx_49a176d5-a780-4a38-b16f-90dc62742d5d/ovn-acl-logging/0.log" Apr 17 17:26:47.128945 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:47.128596 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" event={"ID":"49a176d5-a780-4a38-b16f-90dc62742d5d","Type":"ContainerStarted","Data":"552d871da45e38b495e989f383a6ff928f16b9f13d6f07e684ca0de6441b2d2f"} Apr 17 17:26:47.129155 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:47.129137 2577 scope.go:117] "RemoveContainer" containerID="77a976c8bb126dfb712a5606166dac28eadaccec2b5e147e2940dedc8b7b4685" Apr 17 17:26:47.851219 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:47.851184 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:26:47.851219 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:47.851208 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4gnk" Apr 17 17:26:47.851433 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:47.851369 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p9f9z" podUID="bcb4d874-10b6-4167-b452-800ed19b3f79" Apr 17 17:26:47.851433 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:47.851421 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4gnk" podUID="f5fc0e79-0e28-4fcc-891b-18afdb313f11" Apr 17 17:26:47.851551 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:47.851486 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:47.851588 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:47.851560 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-6mktk" podUID="74142d91-eb23-411d-8c68-16c329d30680" Apr 17 17:26:48.639910 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:48.639651 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-d4gnk"] Apr 17 17:26:48.640416 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:48.640061 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4gnk" Apr 17 17:26:48.640416 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:48.640183 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4gnk" podUID="f5fc0e79-0e28-4fcc-891b-18afdb313f11" Apr 17 17:26:48.640589 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:48.640570 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-p9f9z"] Apr 17 17:26:48.640701 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:48.640688 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:26:48.640787 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:48.640770 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p9f9z" podUID="bcb4d874-10b6-4167-b452-800ed19b3f79" Apr 17 17:26:48.642818 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:48.642685 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6mktk"] Apr 17 17:26:48.642818 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:48.642798 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:48.642968 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:48.642895 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-6mktk" podUID="74142d91-eb23-411d-8c68-16c329d30680" Apr 17 17:26:49.136406 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:49.136331 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rjgx_49a176d5-a780-4a38-b16f-90dc62742d5d/ovn-acl-logging/0.log" Apr 17 17:26:49.136674 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:49.136651 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" event={"ID":"49a176d5-a780-4a38-b16f-90dc62742d5d","Type":"ContainerStarted","Data":"8968939ed3d03ca96b6c15643e486c4dee6c003155612ee1c3ba53f75776ab88"} Apr 17 17:26:49.136994 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:49.136928 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:49.136994 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:49.136962 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:49.136994 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:49.136975 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:49.138438 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:49.138411 2577 generic.go:358] "Generic (PLEG): container finished" podID="355b3a4d-4123-4e80-a76f-e42bcfb92020" containerID="5988c8c0974ac22def640bf47a6001b4c848eaf1e30491a9b5d6a016a6283001" exitCode=0 Apr 17 17:26:49.138587 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:49.138463 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pdpfh" event={"ID":"355b3a4d-4123-4e80-a76f-e42bcfb92020","Type":"ContainerDied","Data":"5988c8c0974ac22def640bf47a6001b4c848eaf1e30491a9b5d6a016a6283001"} Apr 17 17:26:49.151418 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:49.151395 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:49.151571 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:49.151556 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:26:49.162658 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:49.162619 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" podStartSLOduration=9.555995719 podStartE2EDuration="27.162605616s" podCreationTimestamp="2026-04-17 17:26:22 +0000 UTC" firstStartedPulling="2026-04-17 17:26:24.172650624 +0000 UTC m=+1.894593600" lastFinishedPulling="2026-04-17 17:26:41.779260521 +0000 UTC m=+19.501203497" observedRunningTime="2026-04-17 17:26:49.161884267 +0000 UTC m=+26.883827264" watchObservedRunningTime="2026-04-17 17:26:49.162605616 +0000 UTC m=+26.884548613" Apr 17 17:26:50.850681 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:50.850647 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:50.851161 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:50.850652 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:26:50.851161 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:50.850757 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6mktk" podUID="74142d91-eb23-411d-8c68-16c329d30680" Apr 17 17:26:50.851161 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:50.850648 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4gnk" Apr 17 17:26:50.851161 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:50.850885 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p9f9z" podUID="bcb4d874-10b6-4167-b452-800ed19b3f79" Apr 17 17:26:50.851161 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:50.851006 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4gnk" podUID="f5fc0e79-0e28-4fcc-891b-18afdb313f11" Apr 17 17:26:51.144137 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:51.144048 2577 generic.go:358] "Generic (PLEG): container finished" podID="355b3a4d-4123-4e80-a76f-e42bcfb92020" containerID="81bf8ac2b8df0df75c2400a0ff20bc8ad06c03b18cfaf0c0b6728484ac72d4b6" exitCode=0 Apr 17 17:26:51.144309 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:51.144131 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pdpfh" event={"ID":"355b3a4d-4123-4e80-a76f-e42bcfb92020","Type":"ContainerDied","Data":"81bf8ac2b8df0df75c2400a0ff20bc8ad06c03b18cfaf0c0b6728484ac72d4b6"} Apr 17 17:26:51.158219 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:51.158156 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" podUID="49a176d5-a780-4a38-b16f-90dc62742d5d" containerName="ovnkube-controller" probeResult="failure" output="" Apr 17 17:26:52.851593 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:52.851417 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:26:52.852022 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:52.851504 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4gnk" Apr 17 17:26:52.852022 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:52.851756 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-d4gnk" podUID="f5fc0e79-0e28-4fcc-891b-18afdb313f11" Apr 17 17:26:52.852022 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:52.851652 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p9f9z" podUID="bcb4d874-10b6-4167-b452-800ed19b3f79" Apr 17 17:26:52.852022 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:52.851522 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:52.852022 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:52.851862 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6mktk" podUID="74142d91-eb23-411d-8c68-16c329d30680" Apr 17 17:26:53.150118 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:53.150036 2577 generic.go:358] "Generic (PLEG): container finished" podID="355b3a4d-4123-4e80-a76f-e42bcfb92020" containerID="89cba605719549fd28326f00220a4283a39e53fd2d056eab76ea786317166f85" exitCode=0 Apr 17 17:26:53.150118 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:53.150091 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pdpfh" event={"ID":"355b3a4d-4123-4e80-a76f-e42bcfb92020","Type":"ContainerDied","Data":"89cba605719549fd28326f00220a4283a39e53fd2d056eab76ea786317166f85"} Apr 17 17:26:54.593959 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.593916 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-96.ec2.internal" event="NodeReady" Apr 17 17:26:54.594406 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.594068 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 17:26:54.657188 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.657113 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-55f58"] Apr 17 17:26:54.678374 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.678349 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9lmqr"] Apr 17 17:26:54.678529 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.678512 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-55f58" Apr 17 17:26:54.682007 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.681980 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 17:26:54.682007 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.681985 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 17:26:54.682172 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.681986 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dxpw7\"" Apr 17 17:26:54.693506 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.693460 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-55f58"] Apr 17 17:26:54.693506 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.693506 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9lmqr"] Apr 17 17:26:54.693645 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.693579 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9lmqr" Apr 17 17:26:54.695924 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.695905 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 17:26:54.696049 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.695927 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 17:26:54.696049 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.695909 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-29fk4\"" Apr 17 17:26:54.696156 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.696139 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 17:26:54.776703 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.776660 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/802564e4-cdb1-4a5c-80f9-814bd584caa0-cert\") pod \"ingress-canary-9lmqr\" (UID: \"802564e4-cdb1-4a5c-80f9-814bd584caa0\") " pod="openshift-ingress-canary/ingress-canary-9lmqr" Apr 17 17:26:54.776903 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.776711 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9bd2cdcc-c4b5-446f-8f64-6c123730399d-config-volume\") pod \"dns-default-55f58\" (UID: \"9bd2cdcc-c4b5-446f-8f64-6c123730399d\") " pod="openshift-dns/dns-default-55f58" Apr 17 17:26:54.776903 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.776760 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9bd2cdcc-c4b5-446f-8f64-6c123730399d-metrics-tls\") pod \"dns-default-55f58\" (UID: \"9bd2cdcc-c4b5-446f-8f64-6c123730399d\") " pod="openshift-dns/dns-default-55f58" Apr 17 17:26:54.776903 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.776779 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/9bd2cdcc-c4b5-446f-8f64-6c123730399d-tmp-dir\") pod \"dns-default-55f58\" (UID: \"9bd2cdcc-c4b5-446f-8f64-6c123730399d\") " pod="openshift-dns/dns-default-55f58" Apr 17 17:26:54.776903 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.776807 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6csr\" (UniqueName: \"kubernetes.io/projected/9bd2cdcc-c4b5-446f-8f64-6c123730399d-kube-api-access-l6csr\") pod \"dns-default-55f58\" (UID: \"9bd2cdcc-c4b5-446f-8f64-6c123730399d\") " pod="openshift-dns/dns-default-55f58" Apr 17 17:26:54.776903 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.776830 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz7lc\" (UniqueName: \"kubernetes.io/projected/802564e4-cdb1-4a5c-80f9-814bd584caa0-kube-api-access-mz7lc\") pod \"ingress-canary-9lmqr\" (UID: \"802564e4-cdb1-4a5c-80f9-814bd584caa0\") " pod="openshift-ingress-canary/ingress-canary-9lmqr" Apr 17 17:26:54.851192 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.851161 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4gnk" Apr 17 17:26:54.851321 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.851168 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:54.851377 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.851168 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:26:54.854281 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.854258 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 17:26:54.854281 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.854276 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 17:26:54.854461 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.854424 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-q7kb6\"" Apr 17 17:26:54.855141 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.855124 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 17:26:54.855237 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.855147 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 17:26:54.855365 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.855352 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-r46n5\"" Apr 17 17:26:54.878156 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.878124 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9bd2cdcc-c4b5-446f-8f64-6c123730399d-metrics-tls\") pod \"dns-default-55f58\" (UID: \"9bd2cdcc-c4b5-446f-8f64-6c123730399d\") " pod="openshift-dns/dns-default-55f58" Apr 17 17:26:54.878156 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.878159 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/9bd2cdcc-c4b5-446f-8f64-6c123730399d-tmp-dir\") pod \"dns-default-55f58\" (UID: \"9bd2cdcc-c4b5-446f-8f64-6c123730399d\") " pod="openshift-dns/dns-default-55f58" Apr 17 17:26:54.878345 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.878180 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l6csr\" (UniqueName: \"kubernetes.io/projected/9bd2cdcc-c4b5-446f-8f64-6c123730399d-kube-api-access-l6csr\") pod \"dns-default-55f58\" (UID: \"9bd2cdcc-c4b5-446f-8f64-6c123730399d\") " pod="openshift-dns/dns-default-55f58" Apr 17 17:26:54.878345 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:54.878277 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:26:54.878345 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.878314 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mz7lc\" (UniqueName: \"kubernetes.io/projected/802564e4-cdb1-4a5c-80f9-814bd584caa0-kube-api-access-mz7lc\") pod \"ingress-canary-9lmqr\" (UID: \"802564e4-cdb1-4a5c-80f9-814bd584caa0\") " pod="openshift-ingress-canary/ingress-canary-9lmqr" Apr 17 17:26:54.878488 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:54.878351 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9bd2cdcc-c4b5-446f-8f64-6c123730399d-metrics-tls podName:9bd2cdcc-c4b5-446f-8f64-6c123730399d nodeName:}" failed. No retries permitted until 2026-04-17 17:26:55.378328715 +0000 UTC m=+33.100271691 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9bd2cdcc-c4b5-446f-8f64-6c123730399d-metrics-tls") pod "dns-default-55f58" (UID: "9bd2cdcc-c4b5-446f-8f64-6c123730399d") : secret "dns-default-metrics-tls" not found Apr 17 17:26:54.878488 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.878450 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/802564e4-cdb1-4a5c-80f9-814bd584caa0-cert\") pod \"ingress-canary-9lmqr\" (UID: \"802564e4-cdb1-4a5c-80f9-814bd584caa0\") " pod="openshift-ingress-canary/ingress-canary-9lmqr" Apr 17 17:26:54.878609 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.878513 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9bd2cdcc-c4b5-446f-8f64-6c123730399d-config-volume\") pod \"dns-default-55f58\" (UID: \"9bd2cdcc-c4b5-446f-8f64-6c123730399d\") " pod="openshift-dns/dns-default-55f58" Apr 17 17:26:54.878609 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:54.878535 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:26:54.878609 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.878547 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9bd2cdcc-c4b5-446f-8f64-6c123730399d-tmp-dir\") pod \"dns-default-55f58\" (UID: \"9bd2cdcc-c4b5-446f-8f64-6c123730399d\") " pod="openshift-dns/dns-default-55f58" Apr 17 17:26:54.878609 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:54.878577 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/802564e4-cdb1-4a5c-80f9-814bd584caa0-cert podName:802564e4-cdb1-4a5c-80f9-814bd584caa0 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:26:55.378562223 +0000 UTC m=+33.100505216 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/802564e4-cdb1-4a5c-80f9-814bd584caa0-cert") pod "ingress-canary-9lmqr" (UID: "802564e4-cdb1-4a5c-80f9-814bd584caa0") : secret "canary-serving-cert" not found Apr 17 17:26:54.879547 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.879524 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9bd2cdcc-c4b5-446f-8f64-6c123730399d-config-volume\") pod \"dns-default-55f58\" (UID: \"9bd2cdcc-c4b5-446f-8f64-6c123730399d\") " pod="openshift-dns/dns-default-55f58" Apr 17 17:26:54.890882 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.890855 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6csr\" (UniqueName: \"kubernetes.io/projected/9bd2cdcc-c4b5-446f-8f64-6c123730399d-kube-api-access-l6csr\") pod \"dns-default-55f58\" (UID: \"9bd2cdcc-c4b5-446f-8f64-6c123730399d\") " pod="openshift-dns/dns-default-55f58" Apr 17 17:26:54.891012 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:54.890884 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz7lc\" (UniqueName: \"kubernetes.io/projected/802564e4-cdb1-4a5c-80f9-814bd584caa0-kube-api-access-mz7lc\") pod \"ingress-canary-9lmqr\" (UID: \"802564e4-cdb1-4a5c-80f9-814bd584caa0\") " pod="openshift-ingress-canary/ingress-canary-9lmqr" Apr 17 17:26:55.382442 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:55.382392 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9bd2cdcc-c4b5-446f-8f64-6c123730399d-metrics-tls\") pod \"dns-default-55f58\" (UID: \"9bd2cdcc-c4b5-446f-8f64-6c123730399d\") " pod="openshift-dns/dns-default-55f58" Apr 17 17:26:55.382668 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:55.382515 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/802564e4-cdb1-4a5c-80f9-814bd584caa0-cert\") pod \"ingress-canary-9lmqr\" (UID: \"802564e4-cdb1-4a5c-80f9-814bd584caa0\") " pod="openshift-ingress-canary/ingress-canary-9lmqr" Apr 17 17:26:55.382668 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:55.382566 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:26:55.382668 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:55.382622 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:26:55.382668 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:55.382636 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9bd2cdcc-c4b5-446f-8f64-6c123730399d-metrics-tls podName:9bd2cdcc-c4b5-446f-8f64-6c123730399d nodeName:}" failed. No retries permitted until 2026-04-17 17:26:56.382620236 +0000 UTC m=+34.104563216 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9bd2cdcc-c4b5-446f-8f64-6c123730399d-metrics-tls") pod "dns-default-55f58" (UID: "9bd2cdcc-c4b5-446f-8f64-6c123730399d") : secret "dns-default-metrics-tls" not found Apr 17 17:26:55.382668 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:55.382669 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/802564e4-cdb1-4a5c-80f9-814bd584caa0-cert podName:802564e4-cdb1-4a5c-80f9-814bd584caa0 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:56.382653643 +0000 UTC m=+34.104596631 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/802564e4-cdb1-4a5c-80f9-814bd584caa0-cert") pod "ingress-canary-9lmqr" (UID: "802564e4-cdb1-4a5c-80f9-814bd584caa0") : secret "canary-serving-cert" not found Apr 17 17:26:55.584513 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:55.584457 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs\") pod \"network-metrics-daemon-p9f9z\" (UID: \"bcb4d874-10b6-4167-b452-800ed19b3f79\") " pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:26:55.584640 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:55.584620 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 17:26:55.584708 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:55.584697 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs podName:bcb4d874-10b6-4167-b452-800ed19b3f79 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:27.584677499 +0000 UTC m=+65.306620487 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs") pod "network-metrics-daemon-p9f9z" (UID: "bcb4d874-10b6-4167-b452-800ed19b3f79") : secret "metrics-daemon-secret" not found Apr 17 17:26:55.685393 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:55.685299 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vs4pm\" (UniqueName: \"kubernetes.io/projected/f5fc0e79-0e28-4fcc-891b-18afdb313f11-kube-api-access-vs4pm\") pod \"network-check-target-d4gnk\" (UID: \"f5fc0e79-0e28-4fcc-891b-18afdb313f11\") " pod="openshift-network-diagnostics/network-check-target-d4gnk" Apr 17 17:26:55.687780 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:55.687753 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs4pm\" (UniqueName: \"kubernetes.io/projected/f5fc0e79-0e28-4fcc-891b-18afdb313f11-kube-api-access-vs4pm\") pod \"network-check-target-d4gnk\" (UID: \"f5fc0e79-0e28-4fcc-891b-18afdb313f11\") " pod="openshift-network-diagnostics/network-check-target-d4gnk" Apr 17 17:26:55.763888 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:55.763850 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4gnk" Apr 17 17:26:55.886975 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:55.886939 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/74142d91-eb23-411d-8c68-16c329d30680-original-pull-secret\") pod \"global-pull-secret-syncer-6mktk\" (UID: \"74142d91-eb23-411d-8c68-16c329d30680\") " pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:55.889319 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:55.889299 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/74142d91-eb23-411d-8c68-16c329d30680-original-pull-secret\") pod \"global-pull-secret-syncer-6mktk\" (UID: \"74142d91-eb23-411d-8c68-16c329d30680\") " pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:55.901936 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:55.901724 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-d4gnk"] Apr 17 17:26:55.907059 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:55.907032 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5fc0e79_0e28_4fcc_891b_18afdb313f11.slice/crio-bbe44a99f8b1e81c53054c7154d1372c819fc1a051fb48036b16656c624a69db WatchSource:0}: Error finding container bbe44a99f8b1e81c53054c7154d1372c819fc1a051fb48036b16656c624a69db: Status 404 returned error can't find the container with id bbe44a99f8b1e81c53054c7154d1372c819fc1a051fb48036b16656c624a69db Apr 17 17:26:56.068801 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:56.068765 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6mktk" Apr 17 17:26:56.157788 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:56.157755 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-d4gnk" event={"ID":"f5fc0e79-0e28-4fcc-891b-18afdb313f11","Type":"ContainerStarted","Data":"bbe44a99f8b1e81c53054c7154d1372c819fc1a051fb48036b16656c624a69db"} Apr 17 17:26:56.185581 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:56.185550 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6mktk"] Apr 17 17:26:56.188608 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:26:56.188581 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74142d91_eb23_411d_8c68_16c329d30680.slice/crio-565f9cbde53c29fda5b265773238eb5181fa9a072bec41a6e8707631df65ffd6 WatchSource:0}: Error finding container 565f9cbde53c29fda5b265773238eb5181fa9a072bec41a6e8707631df65ffd6: Status 404 returned error can't find the container with id 565f9cbde53c29fda5b265773238eb5181fa9a072bec41a6e8707631df65ffd6 Apr 17 17:26:56.389829 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:56.389718 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/802564e4-cdb1-4a5c-80f9-814bd584caa0-cert\") pod \"ingress-canary-9lmqr\" (UID: \"802564e4-cdb1-4a5c-80f9-814bd584caa0\") " pod="openshift-ingress-canary/ingress-canary-9lmqr" Apr 17 17:26:56.389829 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:56.389788 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9bd2cdcc-c4b5-446f-8f64-6c123730399d-metrics-tls\") pod \"dns-default-55f58\" (UID: \"9bd2cdcc-c4b5-446f-8f64-6c123730399d\") " pod="openshift-dns/dns-default-55f58" Apr 17 17:26:56.390044 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:56.389880 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:26:56.390044 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:56.389887 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:26:56.390044 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:56.389963 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9bd2cdcc-c4b5-446f-8f64-6c123730399d-metrics-tls podName:9bd2cdcc-c4b5-446f-8f64-6c123730399d nodeName:}" failed. No retries permitted until 2026-04-17 17:26:58.389928626 +0000 UTC m=+36.111871602 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9bd2cdcc-c4b5-446f-8f64-6c123730399d-metrics-tls") pod "dns-default-55f58" (UID: "9bd2cdcc-c4b5-446f-8f64-6c123730399d") : secret "dns-default-metrics-tls" not found Apr 17 17:26:56.390044 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:56.389994 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/802564e4-cdb1-4a5c-80f9-814bd584caa0-cert podName:802564e4-cdb1-4a5c-80f9-814bd584caa0 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:58.389984783 +0000 UTC m=+36.111927759 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/802564e4-cdb1-4a5c-80f9-814bd584caa0-cert") pod "ingress-canary-9lmqr" (UID: "802564e4-cdb1-4a5c-80f9-814bd584caa0") : secret "canary-serving-cert" not found Apr 17 17:26:57.161844 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:57.161803 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6mktk" event={"ID":"74142d91-eb23-411d-8c68-16c329d30680","Type":"ContainerStarted","Data":"565f9cbde53c29fda5b265773238eb5181fa9a072bec41a6e8707631df65ffd6"} Apr 17 17:26:58.405228 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:58.405070 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/802564e4-cdb1-4a5c-80f9-814bd584caa0-cert\") pod \"ingress-canary-9lmqr\" (UID: \"802564e4-cdb1-4a5c-80f9-814bd584caa0\") " pod="openshift-ingress-canary/ingress-canary-9lmqr" Apr 17 17:26:58.405748 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:58.405237 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:26:58.405748 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:58.405307 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/802564e4-cdb1-4a5c-80f9-814bd584caa0-cert podName:802564e4-cdb1-4a5c-80f9-814bd584caa0 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:02.405286211 +0000 UTC m=+40.127229189 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/802564e4-cdb1-4a5c-80f9-814bd584caa0-cert") pod "ingress-canary-9lmqr" (UID: "802564e4-cdb1-4a5c-80f9-814bd584caa0") : secret "canary-serving-cert" not found Apr 17 17:26:58.405748 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:26:58.405329 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9bd2cdcc-c4b5-446f-8f64-6c123730399d-metrics-tls\") pod \"dns-default-55f58\" (UID: \"9bd2cdcc-c4b5-446f-8f64-6c123730399d\") " pod="openshift-dns/dns-default-55f58" Apr 17 17:26:58.405748 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:58.405456 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:26:58.405748 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:26:58.405514 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9bd2cdcc-c4b5-446f-8f64-6c123730399d-metrics-tls podName:9bd2cdcc-c4b5-446f-8f64-6c123730399d nodeName:}" failed. No retries permitted until 2026-04-17 17:27:02.405502507 +0000 UTC m=+40.127445482 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9bd2cdcc-c4b5-446f-8f64-6c123730399d-metrics-tls") pod "dns-default-55f58" (UID: "9bd2cdcc-c4b5-446f-8f64-6c123730399d") : secret "dns-default-metrics-tls" not found Apr 17 17:27:01.170983 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:01.170936 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6mktk" event={"ID":"74142d91-eb23-411d-8c68-16c329d30680","Type":"ContainerStarted","Data":"3dc6125b0a31fa60c1f78cbb220ab80831c152eb83ca497f1782b06b1d501503"} Apr 17 17:27:01.172259 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:01.172235 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-d4gnk" event={"ID":"f5fc0e79-0e28-4fcc-891b-18afdb313f11","Type":"ContainerStarted","Data":"712c49a5ea8126be8d7696eeb4172c5e2138038b5fd31009fd310705be8bb23d"} Apr 17 17:27:01.172377 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:01.172367 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-d4gnk" Apr 17 17:27:01.186110 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:01.186058 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-6mktk" podStartSLOduration=32.727073382 podStartE2EDuration="37.186046253s" podCreationTimestamp="2026-04-17 17:26:24 +0000 UTC" firstStartedPulling="2026-04-17 17:26:56.190297489 +0000 UTC m=+33.912240465" lastFinishedPulling="2026-04-17 17:27:00.649270352 +0000 UTC m=+38.371213336" observedRunningTime="2026-04-17 17:27:01.185096023 +0000 UTC m=+38.907039021" watchObservedRunningTime="2026-04-17 17:27:01.186046253 +0000 UTC m=+38.907989296" Apr 17 17:27:01.201436 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:01.201393 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-d4gnk" podStartSLOduration=33.471093342 podStartE2EDuration="38.201381035s" podCreationTimestamp="2026-04-17 17:26:23 +0000 UTC" firstStartedPulling="2026-04-17 17:26:55.908449083 +0000 UTC m=+33.630392059" lastFinishedPulling="2026-04-17 17:27:00.638736768 +0000 UTC m=+38.360679752" observedRunningTime="2026-04-17 17:27:01.200714114 +0000 UTC m=+38.922657135" watchObservedRunningTime="2026-04-17 17:27:01.201381035 +0000 UTC m=+38.923324029" Apr 17 17:27:02.431322 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:02.431277 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9bd2cdcc-c4b5-446f-8f64-6c123730399d-metrics-tls\") pod \"dns-default-55f58\" (UID: \"9bd2cdcc-c4b5-446f-8f64-6c123730399d\") " pod="openshift-dns/dns-default-55f58" Apr 17 17:27:02.431749 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:02.431341 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/802564e4-cdb1-4a5c-80f9-814bd584caa0-cert\") pod \"ingress-canary-9lmqr\" (UID: \"802564e4-cdb1-4a5c-80f9-814bd584caa0\") " pod="openshift-ingress-canary/ingress-canary-9lmqr" Apr 17 17:27:02.431749 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:27:02.431445 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:27:02.431749 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:27:02.431444 2577 secret.go:189] Couldn't get 
secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:27:02.431749 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:27:02.431523 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/802564e4-cdb1-4a5c-80f9-814bd584caa0-cert podName:802564e4-cdb1-4a5c-80f9-814bd584caa0 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:10.431509306 +0000 UTC m=+48.153452282 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/802564e4-cdb1-4a5c-80f9-814bd584caa0-cert") pod "ingress-canary-9lmqr" (UID: "802564e4-cdb1-4a5c-80f9-814bd584caa0") : secret "canary-serving-cert" not found Apr 17 17:27:02.431749 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:27:02.431537 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9bd2cdcc-c4b5-446f-8f64-6c123730399d-metrics-tls podName:9bd2cdcc-c4b5-446f-8f64-6c123730399d nodeName:}" failed. No retries permitted until 2026-04-17 17:27:10.431530989 +0000 UTC m=+48.153473965 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9bd2cdcc-c4b5-446f-8f64-6c123730399d-metrics-tls") pod "dns-default-55f58" (UID: "9bd2cdcc-c4b5-446f-8f64-6c123730399d") : secret "dns-default-metrics-tls" not found Apr 17 17:27:03.485274 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:27:03.485216 2577 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c4d9f8fd250636b548d533d8f0af8bd5494ad6e5026569cefc634f0283d50df: reading manifest sha256:5c4d9f8fd250636b548d533d8f0af8bd5494ad6e5026569cefc634f0283d50df in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c4d9f8fd250636b548d533d8f0af8bd5494ad6e5026569cefc634f0283d50df" Apr 17 17:27:03.485671 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:27:03.485415 2577 kuberuntime_manager.go:1358] "Unhandled Error" err="init container &Container{Name:whereabouts-cni-bincopy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c4d9f8fd250636b548d533d8f0af8bd5494ad6e5026569cefc634f0283d50df,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/whereabouts/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/whereabouts/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/whereabouts/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dn7s8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-pdpfh_openshift-multus(355b3a4d-4123-4e80-a76f-e42bcfb92020): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c4d9f8fd250636b548d533d8f0af8bd5494ad6e5026569cefc634f0283d50df: reading manifest sha256:5c4d9f8fd250636b548d533d8f0af8bd5494ad6e5026569cefc634f0283d50df in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" logger="UnhandledError" Apr 17 17:27:03.486580 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:27:03.486551 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"whereabouts-cni-bincopy\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c4d9f8fd250636b548d533d8f0af8bd5494ad6e5026569cefc634f0283d50df: reading manifest sha256:5c4d9f8fd250636b548d533d8f0af8bd5494ad6e5026569cefc634f0283d50df in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-multus/multus-additional-cni-plugins-pdpfh" podUID="355b3a4d-4123-4e80-a76f-e42bcfb92020" Apr 17 17:27:04.179482 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:27:04.179435 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"whereabouts-cni-bincopy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c4d9f8fd250636b548d533d8f0af8bd5494ad6e5026569cefc634f0283d50df\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c4d9f8fd250636b548d533d8f0af8bd5494ad6e5026569cefc634f0283d50df: reading manifest sha256:5c4d9f8fd250636b548d533d8f0af8bd5494ad6e5026569cefc634f0283d50df in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-multus/multus-additional-cni-plugins-pdpfh" podUID="355b3a4d-4123-4e80-a76f-e42bcfb92020" Apr 17 17:27:04.809795 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:04.809755 2577 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7754c6b749-v428h"] Apr 17 17:27:04.812672 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:04.812650 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7754c6b749-v428h" Apr 17 17:27:04.814966 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:04.814946 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 17:27:04.815069 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:04.815051 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-c7vgl\"" Apr 17 17:27:04.815114 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:04.815070 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 17:27:04.815555 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:04.815542 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 17 17:27:04.815611 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:04.815569 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 17:27:04.822133 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:04.822109 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7754c6b749-v428h"] Apr 17 17:27:04.841627 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:04.841595 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f45b57878-7zrzx"] Apr 17 17:27:04.844332 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:04.844313 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f45b57878-7zrzx" Apr 17 17:27:04.846337 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:04.846316 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 17 17:27:04.853758 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:04.853734 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f45b57878-7zrzx"] Apr 17 17:27:04.949250 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:04.949215 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ecce913b-c258-42db-8465-b935c4a7b028-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7754c6b749-v428h\" (UID: \"ecce913b-c258-42db-8465-b935c4a7b028\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7754c6b749-v428h" Apr 17 17:27:04.949250 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:04.949259 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s42zf\" (UniqueName: \"kubernetes.io/projected/ecce913b-c258-42db-8465-b935c4a7b028-kube-api-access-s42zf\") pod \"managed-serviceaccount-addon-agent-7754c6b749-v428h\" (UID: \"ecce913b-c258-42db-8465-b935c4a7b028\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7754c6b749-v428h" Apr 17 17:27:04.949511 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:04.949346 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/6e9408b6-5cf2-4fc7-bb52-f5f36b2c35de-klusterlet-config\") pod \"klusterlet-addon-workmgr-7f45b57878-7zrzx\" (UID: \"6e9408b6-5cf2-4fc7-bb52-f5f36b2c35de\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f45b57878-7zrzx" Apr 17 17:27:04.949511 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:04.949406 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6e9408b6-5cf2-4fc7-bb52-f5f36b2c35de-tmp\") pod \"klusterlet-addon-workmgr-7f45b57878-7zrzx\" (UID: \"6e9408b6-5cf2-4fc7-bb52-f5f36b2c35de\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f45b57878-7zrzx" Apr 17 17:27:04.949632 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:04.949511 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw4bw\" (UniqueName: \"kubernetes.io/projected/6e9408b6-5cf2-4fc7-bb52-f5f36b2c35de-kube-api-access-qw4bw\") pod \"klusterlet-addon-workmgr-7f45b57878-7zrzx\" (UID: \"6e9408b6-5cf2-4fc7-bb52-f5f36b2c35de\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f45b57878-7zrzx" Apr 17 17:27:05.050623 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:05.050572 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s42zf\" (UniqueName: \"kubernetes.io/projected/ecce913b-c258-42db-8465-b935c4a7b028-kube-api-access-s42zf\") pod \"managed-serviceaccount-addon-agent-7754c6b749-v428h\" (UID: \"ecce913b-c258-42db-8465-b935c4a7b028\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7754c6b749-v428h" Apr 17 17:27:05.050730 ip-10-0-139-96 kubenswrapper[2577]: 
I0417 17:27:05.050645 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/6e9408b6-5cf2-4fc7-bb52-f5f36b2c35de-klusterlet-config\") pod \"klusterlet-addon-workmgr-7f45b57878-7zrzx\" (UID: \"6e9408b6-5cf2-4fc7-bb52-f5f36b2c35de\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f45b57878-7zrzx" Apr 17 17:27:05.050730 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:05.050677 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6e9408b6-5cf2-4fc7-bb52-f5f36b2c35de-tmp\") pod \"klusterlet-addon-workmgr-7f45b57878-7zrzx\" (UID: \"6e9408b6-5cf2-4fc7-bb52-f5f36b2c35de\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f45b57878-7zrzx" Apr 17 17:27:05.050887 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:05.050731 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qw4bw\" (UniqueName: \"kubernetes.io/projected/6e9408b6-5cf2-4fc7-bb52-f5f36b2c35de-kube-api-access-qw4bw\") pod \"klusterlet-addon-workmgr-7f45b57878-7zrzx\" (UID: \"6e9408b6-5cf2-4fc7-bb52-f5f36b2c35de\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f45b57878-7zrzx" Apr 17 17:27:05.050887 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:05.050785 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ecce913b-c258-42db-8465-b935c4a7b028-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7754c6b749-v428h\" (UID: \"ecce913b-c258-42db-8465-b935c4a7b028\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7754c6b749-v428h" Apr 17 17:27:05.051276 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:05.051252 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6e9408b6-5cf2-4fc7-bb52-f5f36b2c35de-tmp\") pod \"klusterlet-addon-workmgr-7f45b57878-7zrzx\" (UID: \"6e9408b6-5cf2-4fc7-bb52-f5f36b2c35de\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f45b57878-7zrzx" Apr 17 17:27:05.053990 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:05.053966 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ecce913b-c258-42db-8465-b935c4a7b028-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7754c6b749-v428h\" (UID: \"ecce913b-c258-42db-8465-b935c4a7b028\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7754c6b749-v428h" Apr 17 17:27:05.054087 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:05.054066 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/6e9408b6-5cf2-4fc7-bb52-f5f36b2c35de-klusterlet-config\") pod \"klusterlet-addon-workmgr-7f45b57878-7zrzx\" (UID: \"6e9408b6-5cf2-4fc7-bb52-f5f36b2c35de\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f45b57878-7zrzx" Apr 17 17:27:05.058501 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:05.058461 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s42zf\" (UniqueName: \"kubernetes.io/projected/ecce913b-c258-42db-8465-b935c4a7b028-kube-api-access-s42zf\") pod \"managed-serviceaccount-addon-agent-7754c6b749-v428h\" (UID: \"ecce913b-c258-42db-8465-b935c4a7b028\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7754c6b749-v428h" Apr 17 17:27:05.058601 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:05.058524 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw4bw\" (UniqueName: \"kubernetes.io/projected/6e9408b6-5cf2-4fc7-bb52-f5f36b2c35de-kube-api-access-qw4bw\") pod \"klusterlet-addon-workmgr-7f45b57878-7zrzx\" (UID: \"6e9408b6-5cf2-4fc7-bb52-f5f36b2c35de\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f45b57878-7zrzx" Apr 17 17:27:05.134179 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:05.134084 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7754c6b749-v428h" Apr 17 17:27:05.152967 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:05.152936 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f45b57878-7zrzx" Apr 17 17:27:05.251834 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:05.251804 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7754c6b749-v428h"] Apr 17 17:27:05.254951 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:27:05.254924 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecce913b_c258_42db_8465_b935c4a7b028.slice/crio-eaa72ece5c4498ac0cb2fa20e64d797f7c589ff62c2bd13bdcad4c56d4650784 WatchSource:0}: Error finding container eaa72ece5c4498ac0cb2fa20e64d797f7c589ff62c2bd13bdcad4c56d4650784: Status 404 returned error can't find the container with id eaa72ece5c4498ac0cb2fa20e64d797f7c589ff62c2bd13bdcad4c56d4650784 Apr 17 17:27:05.270890 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:05.270868 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f45b57878-7zrzx"] Apr 17 17:27:05.273779 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:27:05.273758 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e9408b6_5cf2_4fc7_bb52_f5f36b2c35de.slice/crio-db4bf8b122a18e2c7af1847cd6a48ae5a6935ee9f7058883a9ac507f1ab4d870 WatchSource:0}: Error finding container db4bf8b122a18e2c7af1847cd6a48ae5a6935ee9f7058883a9ac507f1ab4d870: Status 404 returned error can't find the container with id db4bf8b122a18e2c7af1847cd6a48ae5a6935ee9f7058883a9ac507f1ab4d870 Apr 17 17:27:06.184321 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:06.184269 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f45b57878-7zrzx" event={"ID":"6e9408b6-5cf2-4fc7-bb52-f5f36b2c35de","Type":"ContainerStarted","Data":"db4bf8b122a18e2c7af1847cd6a48ae5a6935ee9f7058883a9ac507f1ab4d870"} Apr 17 17:27:06.185547 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:06.185495 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7754c6b749-v428h" event={"ID":"ecce913b-c258-42db-8465-b935c4a7b028","Type":"ContainerStarted","Data":"eaa72ece5c4498ac0cb2fa20e64d797f7c589ff62c2bd13bdcad4c56d4650784"} Apr 17 17:27:09.192039 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:09.191984 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7754c6b749-v428h" event={"ID":"ecce913b-c258-42db-8465-b935c4a7b028","Type":"ContainerStarted","Data":"91f13ce438ce59ab8733a3bd137c2a85c56eb9b610b8633dc7a0830a58df4c64"} Apr 17 17:27:09.206737 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:09.206683 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7754c6b749-v428h" podStartSLOduration=1.8588218909999998 podStartE2EDuration="5.206669525s" podCreationTimestamp="2026-04-17 17:27:04 +0000 UTC" firstStartedPulling="2026-04-17 17:27:05.256730775 +0000 UTC m=+42.978673763" lastFinishedPulling="2026-04-17 17:27:08.604578411 +0000 UTC m=+46.326521397" observedRunningTime="2026-04-17 17:27:09.205564665 +0000 UTC m=+46.927507663" watchObservedRunningTime="2026-04-17 17:27:09.206669525 +0000 UTC m=+46.928612521" Apr 17 17:27:10.494641 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:10.494600 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/802564e4-cdb1-4a5c-80f9-814bd584caa0-cert\") pod \"ingress-canary-9lmqr\" (UID: \"802564e4-cdb1-4a5c-80f9-814bd584caa0\") " pod="openshift-ingress-canary/ingress-canary-9lmqr" Apr 17 17:27:10.495027 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:10.494672 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9bd2cdcc-c4b5-446f-8f64-6c123730399d-metrics-tls\") pod \"dns-default-55f58\" (UID: \"9bd2cdcc-c4b5-446f-8f64-6c123730399d\") " pod="openshift-dns/dns-default-55f58" Apr 17 17:27:10.495027 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:27:10.494774 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:27:10.495027 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:27:10.494801 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:27:10.495027 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:27:10.494887 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/802564e4-cdb1-4a5c-80f9-814bd584caa0-cert podName:802564e4-cdb1-4a5c-80f9-814bd584caa0 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:26.494865007 +0000 UTC m=+64.216807997 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/802564e4-cdb1-4a5c-80f9-814bd584caa0-cert") pod "ingress-canary-9lmqr" (UID: "802564e4-cdb1-4a5c-80f9-814bd584caa0") : secret "canary-serving-cert" not found Apr 17 17:27:10.495027 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:27:10.494910 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9bd2cdcc-c4b5-446f-8f64-6c123730399d-metrics-tls podName:9bd2cdcc-c4b5-446f-8f64-6c123730399d nodeName:}" failed. No retries permitted until 2026-04-17 17:27:26.494898657 +0000 UTC m=+64.216841637 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9bd2cdcc-c4b5-446f-8f64-6c123730399d-metrics-tls") pod "dns-default-55f58" (UID: "9bd2cdcc-c4b5-446f-8f64-6c123730399d") : secret "dns-default-metrics-tls" not found Apr 17 17:27:12.198942 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:12.198902 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f45b57878-7zrzx" event={"ID":"6e9408b6-5cf2-4fc7-bb52-f5f36b2c35de","Type":"ContainerStarted","Data":"c67782ef64fb8c9755b995511389496873d7c143753d9a671d26bdf9f011013c"} Apr 17 17:27:12.199376 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:12.199099 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f45b57878-7zrzx" Apr 17 17:27:12.200950 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:12.200929 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f45b57878-7zrzx" Apr 17 17:27:12.214259 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:12.214217 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f45b57878-7zrzx" podStartSLOduration=2.010544453 podStartE2EDuration="8.214205034s" podCreationTimestamp="2026-04-17 17:27:04 +0000 UTC" firstStartedPulling="2026-04-17 17:27:05.275663181 +0000 UTC m=+42.997606159" lastFinishedPulling="2026-04-17 17:27:11.47932375 +0000 UTC m=+49.201266740" observedRunningTime="2026-04-17 17:27:12.214166024 +0000 UTC m=+49.936109020" watchObservedRunningTime="2026-04-17 17:27:12.214205034 +0000 UTC m=+49.936148031" Apr 17 17:27:21.155583 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:21.155551 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8rjgx" Apr 17 17:27:22.220264 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:22.220230 2577 generic.go:358] "Generic (PLEG): container finished" podID="355b3a4d-4123-4e80-a76f-e42bcfb92020" containerID="be002b6792ca6dec48e0ad1fbef61985daf94db84b06fd00a0c9eacbbc988a09" exitCode=0 Apr 17 17:27:22.220718 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:22.220304 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pdpfh" event={"ID":"355b3a4d-4123-4e80-a76f-e42bcfb92020","Type":"ContainerDied","Data":"be002b6792ca6dec48e0ad1fbef61985daf94db84b06fd00a0c9eacbbc988a09"} Apr 17 17:27:23.224312 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:23.224279 2577 generic.go:358] "Generic (PLEG): container finished" podID="355b3a4d-4123-4e80-a76f-e42bcfb92020" containerID="7a9ca2774ffe3cc03a4e180236be77375f9279dec4d5cf9320314dc4ef74d358" exitCode=0 Apr 17 17:27:23.224312 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:23.224324 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pdpfh" event={"ID":"355b3a4d-4123-4e80-a76f-e42bcfb92020","Type":"ContainerDied","Data":"7a9ca2774ffe3cc03a4e180236be77375f9279dec4d5cf9320314dc4ef74d358"} Apr 17 17:27:24.229362 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:24.229329 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pdpfh" 
event={"ID":"355b3a4d-4123-4e80-a76f-e42bcfb92020","Type":"ContainerStarted","Data":"594de955a450b14d17e9aaf8bc453778ba4e914bbfe3b610e95173f2efc8366b"} Apr 17 17:27:24.251730 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:24.251676 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-pdpfh" podStartSLOduration=4.625125277 podStartE2EDuration="1m2.251661047s" podCreationTimestamp="2026-04-17 17:26:22 +0000 UTC" firstStartedPulling="2026-04-17 17:26:24.161891551 +0000 UTC m=+1.883834540" lastFinishedPulling="2026-04-17 17:27:21.788427332 +0000 UTC m=+59.510370310" observedRunningTime="2026-04-17 17:27:24.250310699 +0000 UTC m=+61.972253695" watchObservedRunningTime="2026-04-17 17:27:24.251661047 +0000 UTC m=+61.973604073" Apr 17 17:27:26.505702 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:26.505643 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9bd2cdcc-c4b5-446f-8f64-6c123730399d-metrics-tls\") pod \"dns-default-55f58\" (UID: \"9bd2cdcc-c4b5-446f-8f64-6c123730399d\") " pod="openshift-dns/dns-default-55f58" Apr 17 17:27:26.506222 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:26.505727 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/802564e4-cdb1-4a5c-80f9-814bd584caa0-cert\") pod \"ingress-canary-9lmqr\" (UID: \"802564e4-cdb1-4a5c-80f9-814bd584caa0\") " pod="openshift-ingress-canary/ingress-canary-9lmqr" Apr 17 17:27:26.506222 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:27:26.505787 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:27:26.506222 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:27:26.505851 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:27:26.506222 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:27:26.505856 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9bd2cdcc-c4b5-446f-8f64-6c123730399d-metrics-tls podName:9bd2cdcc-c4b5-446f-8f64-6c123730399d nodeName:}" failed. No retries permitted until 2026-04-17 17:27:58.505840743 +0000 UTC m=+96.227783733 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9bd2cdcc-c4b5-446f-8f64-6c123730399d-metrics-tls") pod "dns-default-55f58" (UID: "9bd2cdcc-c4b5-446f-8f64-6c123730399d") : secret "dns-default-metrics-tls" not found Apr 17 17:27:26.506222 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:27:26.505927 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/802564e4-cdb1-4a5c-80f9-814bd584caa0-cert podName:802564e4-cdb1-4a5c-80f9-814bd584caa0 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:58.505911686 +0000 UTC m=+96.227854661 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/802564e4-cdb1-4a5c-80f9-814bd584caa0-cert") pod "ingress-canary-9lmqr" (UID: "802564e4-cdb1-4a5c-80f9-814bd584caa0") : secret "canary-serving-cert" not found Apr 17 17:27:27.614386 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:27.614342 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs\") pod \"network-metrics-daemon-p9f9z\" (UID: \"bcb4d874-10b6-4167-b452-800ed19b3f79\") " pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:27:27.614927 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:27:27.614531 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 17:27:27.614927 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:27:27.614634 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs podName:bcb4d874-10b6-4167-b452-800ed19b3f79 nodeName:}" failed. No retries permitted until 2026-04-17 17:28:31.6146112 +0000 UTC m=+129.336554179 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs") pod "network-metrics-daemon-p9f9z" (UID: "bcb4d874-10b6-4167-b452-800ed19b3f79") : secret "metrics-daemon-secret" not found Apr 17 17:27:32.177380 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:32.177347 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-d4gnk" Apr 17 17:27:58.532312 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:58.532268 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/802564e4-cdb1-4a5c-80f9-814bd584caa0-cert\") pod \"ingress-canary-9lmqr\" (UID: \"802564e4-cdb1-4a5c-80f9-814bd584caa0\") " pod="openshift-ingress-canary/ingress-canary-9lmqr" Apr 17 17:27:58.532756 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:27:58.532336 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9bd2cdcc-c4b5-446f-8f64-6c123730399d-metrics-tls\") pod \"dns-default-55f58\" (UID: \"9bd2cdcc-c4b5-446f-8f64-6c123730399d\") " pod="openshift-dns/dns-default-55f58" Apr 17 17:27:58.532756 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:27:58.532417 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:27:58.532756 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:27:58.532504 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/802564e4-cdb1-4a5c-80f9-814bd584caa0-cert podName:802564e4-cdb1-4a5c-80f9-814bd584caa0 nodeName:}" failed. No retries permitted until 2026-04-17 17:29:02.532464916 +0000 UTC m=+160.254407892 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/802564e4-cdb1-4a5c-80f9-814bd584caa0-cert") pod "ingress-canary-9lmqr" (UID: "802564e4-cdb1-4a5c-80f9-814bd584caa0") : secret "canary-serving-cert" not found Apr 17 17:27:58.532756 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:27:58.532423 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:27:58.532756 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:27:58.532561 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9bd2cdcc-c4b5-446f-8f64-6c123730399d-metrics-tls podName:9bd2cdcc-c4b5-446f-8f64-6c123730399d nodeName:}" failed. No retries permitted until 2026-04-17 17:29:02.532548312 +0000 UTC m=+160.254491287 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9bd2cdcc-c4b5-446f-8f64-6c123730399d-metrics-tls") pod "dns-default-55f58" (UID: "9bd2cdcc-c4b5-446f-8f64-6c123730399d") : secret "dns-default-metrics-tls" not found Apr 17 17:28:27.821268 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:27.821236 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7dfb49b74-jlm7c"] Apr 17 17:28:27.824091 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:27.824071 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-kx9nb"] Apr 17 17:28:27.824244 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:27.824223 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:27.826568 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:27.826545 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 17 17:28:27.826568 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:27.826559 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-vrfvt\"" Apr 17 17:28:27.826764 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:27.826602 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 17 17:28:27.826764 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:27.826621 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 17:28:27.826764 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:27.826548 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 17:28:27.826764 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:27.826552 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 17 17:28:27.826970 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:27.826953 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 17 17:28:27.827099 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:27.827082 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kx9nb" Apr 17 17:28:27.829022 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:27.829004 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 17 17:28:27.829757 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:27.829620 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 17:28:27.829757 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:27.829643 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 17:28:27.831028 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:27.831009 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-9gggc\"" Apr 17 17:28:27.831597 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:27.831540 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 17 17:28:27.836932 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:27.836903 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-kx9nb"] Apr 17 17:28:27.837939 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:27.837915 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7dfb49b74-jlm7c"] Apr 17 17:28:27.940417 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:27.940375 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ec34672-cf7b-48a2-a580-01d0d51e08b1-metrics-certs\") pod \"router-default-7dfb49b74-jlm7c\" (UID: \"7ec34672-cf7b-48a2-a580-01d0d51e08b1\") " pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:27.940417 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:27.940425 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-kx9nb\" (UID: \"c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kx9nb" Apr 17 17:28:27.940664 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:27.940446 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kx9nb\" (UID: \"c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kx9nb" Apr 17 17:28:27.940664 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:27.940494 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7ec34672-cf7b-48a2-a580-01d0d51e08b1-stats-auth\") pod \"router-default-7dfb49b74-jlm7c\" (UID: \"7ec34672-cf7b-48a2-a580-01d0d51e08b1\") " pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:27.940664 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:27.940515 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf22j\" (UniqueName: \"kubernetes.io/projected/c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf-kube-api-access-jf22j\") pod \"cluster-monitoring-operator-75587bd455-kx9nb\" (UID: \"c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kx9nb" Apr 17 17:28:27.940664 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:27.940559 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d9mt\" (UniqueName: \"kubernetes.io/projected/7ec34672-cf7b-48a2-a580-01d0d51e08b1-kube-api-access-5d9mt\") pod \"router-default-7dfb49b74-jlm7c\" (UID: \"7ec34672-cf7b-48a2-a580-01d0d51e08b1\") " pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:27.940664 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:27.940601 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ec34672-cf7b-48a2-a580-01d0d51e08b1-service-ca-bundle\") pod \"router-default-7dfb49b74-jlm7c\" (UID: \"7ec34672-cf7b-48a2-a580-01d0d51e08b1\") " pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:27.940854 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:27.940679 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7ec34672-cf7b-48a2-a580-01d0d51e08b1-default-certificate\") pod \"router-default-7dfb49b74-jlm7c\" (UID: \"7ec34672-cf7b-48a2-a580-01d0d51e08b1\") " pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:28.041583 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:28.041544 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7ec34672-cf7b-48a2-a580-01d0d51e08b1-default-certificate\") pod \"router-default-7dfb49b74-jlm7c\" (UID: \"7ec34672-cf7b-48a2-a580-01d0d51e08b1\") " pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:28.041583 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:28.041586 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ec34672-cf7b-48a2-a580-01d0d51e08b1-metrics-certs\") pod \"router-default-7dfb49b74-jlm7c\" (UID: \"7ec34672-cf7b-48a2-a580-01d0d51e08b1\") " pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:28.041870 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:28.041616 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-kx9nb\" (UID: \"c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kx9nb" Apr 17 17:28:28.041870 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:28.041644 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kx9nb\" (UID: \"c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kx9nb" Apr 17 17:28:28.041870 ip-10-0-139-96 kubenswrapper[2577]: E0417 
17:28:28.041709 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 17:28:28.041870 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:28.041727 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7ec34672-cf7b-48a2-a580-01d0d51e08b1-stats-auth\") pod \"router-default-7dfb49b74-jlm7c\" (UID: \"7ec34672-cf7b-48a2-a580-01d0d51e08b1\") " pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:28.041870 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:28.041763 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jf22j\" (UniqueName: \"kubernetes.io/projected/c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf-kube-api-access-jf22j\") pod \"cluster-monitoring-operator-75587bd455-kx9nb\" (UID: \"c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kx9nb" Apr 17 17:28:28.041870 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:28.041781 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf-cluster-monitoring-operator-tls podName:c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf nodeName:}" failed. No retries permitted until 2026-04-17 17:28:28.541759513 +0000 UTC m=+126.263702511 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kx9nb" (UID: "c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf") : secret "cluster-monitoring-operator-tls" not found Apr 17 17:28:28.041870 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:28.041708 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 17:28:28.041870 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:28.041869 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ec34672-cf7b-48a2-a580-01d0d51e08b1-metrics-certs podName:7ec34672-cf7b-48a2-a580-01d0d51e08b1 nodeName:}" failed. No retries permitted until 2026-04-17 17:28:28.541848403 +0000 UTC m=+126.263791388 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ec34672-cf7b-48a2-a580-01d0d51e08b1-metrics-certs") pod "router-default-7dfb49b74-jlm7c" (UID: "7ec34672-cf7b-48a2-a580-01d0d51e08b1") : secret "router-metrics-certs-default" not found Apr 17 17:28:28.042278 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:28.041918 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5d9mt\" (UniqueName: \"kubernetes.io/projected/7ec34672-cf7b-48a2-a580-01d0d51e08b1-kube-api-access-5d9mt\") pod \"router-default-7dfb49b74-jlm7c\" (UID: \"7ec34672-cf7b-48a2-a580-01d0d51e08b1\") " pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:28.042278 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:28.041956 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ec34672-cf7b-48a2-a580-01d0d51e08b1-service-ca-bundle\") pod \"router-default-7dfb49b74-jlm7c\" (UID: \"7ec34672-cf7b-48a2-a580-01d0d51e08b1\") " pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:28.042278 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:28.042082 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7ec34672-cf7b-48a2-a580-01d0d51e08b1-service-ca-bundle podName:7ec34672-cf7b-48a2-a580-01d0d51e08b1 nodeName:}" failed. No retries permitted until 2026-04-17 17:28:28.542070147 +0000 UTC m=+126.264013137 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/7ec34672-cf7b-48a2-a580-01d0d51e08b1-service-ca-bundle") pod "router-default-7dfb49b74-jlm7c" (UID: "7ec34672-cf7b-48a2-a580-01d0d51e08b1") : configmap references non-existent config key: service-ca.crt Apr 17 17:28:28.042425 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:28.042402 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-kx9nb\" (UID: \"c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kx9nb" Apr 17 17:28:28.044071 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:28.044051 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7ec34672-cf7b-48a2-a580-01d0d51e08b1-default-certificate\") pod \"router-default-7dfb49b74-jlm7c\" (UID: \"7ec34672-cf7b-48a2-a580-01d0d51e08b1\") " pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:28.044144 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:28.044108 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7ec34672-cf7b-48a2-a580-01d0d51e08b1-stats-auth\") pod \"router-default-7dfb49b74-jlm7c\" (UID: \"7ec34672-cf7b-48a2-a580-01d0d51e08b1\") " pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:28.052223 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:28.052193 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d9mt\" (UniqueName: \"kubernetes.io/projected/7ec34672-cf7b-48a2-a580-01d0d51e08b1-kube-api-access-5d9mt\") pod \"router-default-7dfb49b74-jlm7c\" (UID: \"7ec34672-cf7b-48a2-a580-01d0d51e08b1\") " pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 
17:28:28.052341 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:28.052195 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf22j\" (UniqueName: \"kubernetes.io/projected/c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf-kube-api-access-jf22j\") pod \"cluster-monitoring-operator-75587bd455-kx9nb\" (UID: \"c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kx9nb" Apr 17 17:28:28.544418 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:28.544359 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ec34672-cf7b-48a2-a580-01d0d51e08b1-metrics-certs\") pod \"router-default-7dfb49b74-jlm7c\" (UID: \"7ec34672-cf7b-48a2-a580-01d0d51e08b1\") " pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:28.544418 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:28.544416 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kx9nb\" (UID: \"c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kx9nb" Apr 17 17:28:28.544639 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:28.544539 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ec34672-cf7b-48a2-a580-01d0d51e08b1-service-ca-bundle\") pod \"router-default-7dfb49b74-jlm7c\" (UID: \"7ec34672-cf7b-48a2-a580-01d0d51e08b1\") " pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:28.544639 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:28.544539 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 17:28:28.544639 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:28.544618 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ec34672-cf7b-48a2-a580-01d0d51e08b1-metrics-certs podName:7ec34672-cf7b-48a2-a580-01d0d51e08b1 nodeName:}" failed. No retries permitted until 2026-04-17 17:28:29.544601622 +0000 UTC m=+127.266544612 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ec34672-cf7b-48a2-a580-01d0d51e08b1-metrics-certs") pod "router-default-7dfb49b74-jlm7c" (UID: "7ec34672-cf7b-48a2-a580-01d0d51e08b1") : secret "router-metrics-certs-default" not found Apr 17 17:28:28.544639 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:28.544635 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7ec34672-cf7b-48a2-a580-01d0d51e08b1-service-ca-bundle podName:7ec34672-cf7b-48a2-a580-01d0d51e08b1 nodeName:}" failed. No retries permitted until 2026-04-17 17:28:29.544624447 +0000 UTC m=+127.266567422 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/7ec34672-cf7b-48a2-a580-01d0d51e08b1-service-ca-bundle") pod "router-default-7dfb49b74-jlm7c" (UID: "7ec34672-cf7b-48a2-a580-01d0d51e08b1") : configmap references non-existent config key: service-ca.crt Apr 17 17:28:28.544770 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:28.544647 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 17:28:28.544770 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:28.544705 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf-cluster-monitoring-operator-tls podName:c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf nodeName:}" failed. No retries permitted until 2026-04-17 17:28:29.54469503 +0000 UTC m=+127.266638006 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kx9nb" (UID: "c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf") : secret "cluster-monitoring-operator-tls" not found Apr 17 17:28:29.551143 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:29.551096 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kx9nb\" (UID: \"c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kx9nb" Apr 17 17:28:29.551143 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:29.551172 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ec34672-cf7b-48a2-a580-01d0d51e08b1-service-ca-bundle\") pod \"router-default-7dfb49b74-jlm7c\" (UID: \"7ec34672-cf7b-48a2-a580-01d0d51e08b1\") " pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:29.551631 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:29.551210 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ec34672-cf7b-48a2-a580-01d0d51e08b1-metrics-certs\") pod \"router-default-7dfb49b74-jlm7c\" (UID: \"7ec34672-cf7b-48a2-a580-01d0d51e08b1\") " pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:29.551631 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:29.551263 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 17:28:29.551631 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:29.551293 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 17:28:29.551631 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:29.551347 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf-cluster-monitoring-operator-tls podName:c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf nodeName:}" failed. No retries permitted until 2026-04-17 17:28:31.551326175 +0000 UTC m=+129.273269152 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kx9nb" (UID: "c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf") : secret "cluster-monitoring-operator-tls" not found Apr 17 17:28:29.551631 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:29.551367 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7ec34672-cf7b-48a2-a580-01d0d51e08b1-service-ca-bundle podName:7ec34672-cf7b-48a2-a580-01d0d51e08b1 nodeName:}" failed. No retries permitted until 2026-04-17 17:28:31.551358596 +0000 UTC m=+129.273301571 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/7ec34672-cf7b-48a2-a580-01d0d51e08b1-service-ca-bundle") pod "router-default-7dfb49b74-jlm7c" (UID: "7ec34672-cf7b-48a2-a580-01d0d51e08b1") : configmap references non-existent config key: service-ca.crt Apr 17 17:28:29.551631 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:29.551385 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ec34672-cf7b-48a2-a580-01d0d51e08b1-metrics-certs podName:7ec34672-cf7b-48a2-a580-01d0d51e08b1 nodeName:}" failed. No retries permitted until 2026-04-17 17:28:31.551375717 +0000 UTC m=+129.273318696 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ec34672-cf7b-48a2-a580-01d0d51e08b1-metrics-certs") pod "router-default-7dfb49b74-jlm7c" (UID: "7ec34672-cf7b-48a2-a580-01d0d51e08b1") : secret "router-metrics-certs-default" not found Apr 17 17:28:31.567079 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:31.567040 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ec34672-cf7b-48a2-a580-01d0d51e08b1-service-ca-bundle\") pod \"router-default-7dfb49b74-jlm7c\" (UID: \"7ec34672-cf7b-48a2-a580-01d0d51e08b1\") " pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:31.567568 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:31.567133 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ec34672-cf7b-48a2-a580-01d0d51e08b1-metrics-certs\") pod \"router-default-7dfb49b74-jlm7c\" (UID: \"7ec34672-cf7b-48a2-a580-01d0d51e08b1\") " pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:31.567568 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:31.567163 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kx9nb\" (UID: \"c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kx9nb" Apr 17 17:28:31.567568 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:31.567228 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7ec34672-cf7b-48a2-a580-01d0d51e08b1-service-ca-bundle podName:7ec34672-cf7b-48a2-a580-01d0d51e08b1 nodeName:}" failed. No retries permitted until 2026-04-17 17:28:35.567208837 +0000 UTC m=+133.289151830 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/7ec34672-cf7b-48a2-a580-01d0d51e08b1-service-ca-bundle") pod "router-default-7dfb49b74-jlm7c" (UID: "7ec34672-cf7b-48a2-a580-01d0d51e08b1") : configmap references non-existent config key: service-ca.crt Apr 17 17:28:31.567568 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:31.567271 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 17:28:31.567568 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:31.567317 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf-cluster-monitoring-operator-tls podName:c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf nodeName:}" failed. No retries permitted until 2026-04-17 17:28:35.567306145 +0000 UTC m=+133.289249122 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kx9nb" (UID: "c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf") : secret "cluster-monitoring-operator-tls" not found Apr 17 17:28:31.567568 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:31.567270 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 17:28:31.567568 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:31.567346 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ec34672-cf7b-48a2-a580-01d0d51e08b1-metrics-certs podName:7ec34672-cf7b-48a2-a580-01d0d51e08b1 nodeName:}" failed. No retries permitted until 2026-04-17 17:28:35.567340224 +0000 UTC m=+133.289283200 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ec34672-cf7b-48a2-a580-01d0d51e08b1-metrics-certs") pod "router-default-7dfb49b74-jlm7c" (UID: "7ec34672-cf7b-48a2-a580-01d0d51e08b1") : secret "router-metrics-certs-default" not found Apr 17 17:28:31.667924 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:31.667884 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs\") pod \"network-metrics-daemon-p9f9z\" (UID: \"bcb4d874-10b6-4167-b452-800ed19b3f79\") " pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:28:31.668090 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:31.668021 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 17:28:31.668090 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:31.668081 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs podName:bcb4d874-10b6-4167-b452-800ed19b3f79 nodeName:}" failed. No retries permitted until 2026-04-17 17:30:33.668064715 +0000 UTC m=+251.390007700 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs") pod "network-metrics-daemon-p9f9z" (UID: "bcb4d874-10b6-4167-b452-800ed19b3f79") : secret "metrics-daemon-secret" not found Apr 17 17:28:35.062497 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:35.062457 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bjxzb_ca904b14-b665-4107-bf21-c1783df952e4/dns-node-resolver/0.log" Apr 17 17:28:35.600932 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:35.600877 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ec34672-cf7b-48a2-a580-01d0d51e08b1-service-ca-bundle\") pod \"router-default-7dfb49b74-jlm7c\" (UID: \"7ec34672-cf7b-48a2-a580-01d0d51e08b1\") " pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:35.601168 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:35.600968 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ec34672-cf7b-48a2-a580-01d0d51e08b1-metrics-certs\") pod \"router-default-7dfb49b74-jlm7c\" (UID: \"7ec34672-cf7b-48a2-a580-01d0d51e08b1\") " pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:35.601168 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:35.601004 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kx9nb\" (UID: \"c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kx9nb" Apr 17 17:28:35.601168 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:35.601064 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7ec34672-cf7b-48a2-a580-01d0d51e08b1-service-ca-bundle podName:7ec34672-cf7b-48a2-a580-01d0d51e08b1 nodeName:}" failed. No retries permitted until 2026-04-17 17:28:43.601042081 +0000 UTC m=+141.322985071 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/7ec34672-cf7b-48a2-a580-01d0d51e08b1-service-ca-bundle") pod "router-default-7dfb49b74-jlm7c" (UID: "7ec34672-cf7b-48a2-a580-01d0d51e08b1") : configmap references non-existent config key: service-ca.crt Apr 17 17:28:35.601168 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:35.601107 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 17:28:35.601168 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:35.601114 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 17:28:35.601168 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:35.601146 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf-cluster-monitoring-operator-tls podName:c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf nodeName:}" failed. No retries permitted until 2026-04-17 17:28:43.601135384 +0000 UTC m=+141.323078366 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kx9nb" (UID: "c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf") : secret "cluster-monitoring-operator-tls" not found Apr 17 17:28:35.601387 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:35.601177 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ec34672-cf7b-48a2-a580-01d0d51e08b1-metrics-certs podName:7ec34672-cf7b-48a2-a580-01d0d51e08b1 nodeName:}" failed. No retries permitted until 2026-04-17 17:28:43.601158891 +0000 UTC m=+141.323101879 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ec34672-cf7b-48a2-a580-01d0d51e08b1-metrics-certs") pod "router-default-7dfb49b74-jlm7c" (UID: "7ec34672-cf7b-48a2-a580-01d0d51e08b1") : secret "router-metrics-certs-default" not found Apr 17 17:28:35.863518 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:35.863417 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-gb2kr_255274fc-6f71-45da-a2f9-c715044eee61/node-ca/0.log" Apr 17 17:28:43.672511 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:43.672438 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ec34672-cf7b-48a2-a580-01d0d51e08b1-service-ca-bundle\") pod \"router-default-7dfb49b74-jlm7c\" (UID: \"7ec34672-cf7b-48a2-a580-01d0d51e08b1\") " pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:43.673122 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:43.672606 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ec34672-cf7b-48a2-a580-01d0d51e08b1-metrics-certs\") pod \"router-default-7dfb49b74-jlm7c\" (UID: \"7ec34672-cf7b-48a2-a580-01d0d51e08b1\") " pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:43.673122 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:43.672637 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kx9nb\" (UID: \"c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kx9nb" Apr 17 17:28:43.673122 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:43.672735 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 17:28:43.673122 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:43.672794 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf-cluster-monitoring-operator-tls podName:c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf nodeName:}" failed. No retries permitted until 2026-04-17 17:28:59.672779231 +0000 UTC m=+157.394722208 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kx9nb" (UID: "c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf") : secret "cluster-monitoring-operator-tls" not found Apr 17 17:28:43.673329 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:43.673240 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ec34672-cf7b-48a2-a580-01d0d51e08b1-service-ca-bundle\") pod \"router-default-7dfb49b74-jlm7c\" (UID: \"7ec34672-cf7b-48a2-a580-01d0d51e08b1\") " pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:43.674984 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:43.674962 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ec34672-cf7b-48a2-a580-01d0d51e08b1-metrics-certs\") pod \"router-default-7dfb49b74-jlm7c\" (UID: \"7ec34672-cf7b-48a2-a580-01d0d51e08b1\") " pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:43.737168 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:43.737121 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:43.862094 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:43.862059 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7dfb49b74-jlm7c"] Apr 17 17:28:43.865010 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:28:43.864976 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ec34672_cf7b_48a2_a580_01d0d51e08b1.slice/crio-223f8fb41f01257e389c724543f56b7f36c36306a670340bd4e5335abbb0cfcc WatchSource:0}: Error finding container 223f8fb41f01257e389c724543f56b7f36c36306a670340bd4e5335abbb0cfcc: Status 404 returned error can't find the container with id 223f8fb41f01257e389c724543f56b7f36c36306a670340bd4e5335abbb0cfcc Apr 17 17:28:44.392002 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:44.391966 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dfb49b74-jlm7c" event={"ID":"7ec34672-cf7b-48a2-a580-01d0d51e08b1","Type":"ContainerStarted","Data":"45eeb7b295462c5d695134452ef7cd5e1852dc7ec4ede2284183f2f9ebba27b2"} Apr 17 17:28:44.392002 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:44.391999 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dfb49b74-jlm7c" event={"ID":"7ec34672-cf7b-48a2-a580-01d0d51e08b1","Type":"ContainerStarted","Data":"223f8fb41f01257e389c724543f56b7f36c36306a670340bd4e5335abbb0cfcc"} Apr 17 17:28:44.413102 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:44.413055 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7dfb49b74-jlm7c" podStartSLOduration=17.413038978 podStartE2EDuration="17.413038978s" podCreationTimestamp="2026-04-17 17:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:28:44.412699747 +0000 UTC m=+142.134642756" watchObservedRunningTime="2026-04-17 17:28:44.413038978 +0000 UTC m=+142.134981977" Apr 17 17:28:44.737962 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:44.737925 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:44.740539 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:44.740515 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:45.394616 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:45.394580 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:45.395815 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:45.395792 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7dfb49b74-jlm7c" Apr 17 17:28:54.074694 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.074649 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-dqmbb"] Apr 17 17:28:54.077097 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.077075 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dqmbb" Apr 17 17:28:54.080168 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.080140 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 17:28:54.080168 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.080157 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 17:28:54.080362 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.080140 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jslpj\"" Apr 17 17:28:54.080529 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.080453 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 17:28:54.080529 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.080523 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 17:28:54.089925 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.089898 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dqmbb"] Apr 17 17:28:54.151587 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.151547 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/021cf46a-9b84-480a-acfc-b41c0da1ca7a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dqmbb\" (UID: \"021cf46a-9b84-480a-acfc-b41c0da1ca7a\") " pod="openshift-insights/insights-runtime-extractor-dqmbb" Apr 17 17:28:54.151752 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.151594 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/021cf46a-9b84-480a-acfc-b41c0da1ca7a-data-volume\") pod \"insights-runtime-extractor-dqmbb\" (UID: \"021cf46a-9b84-480a-acfc-b41c0da1ca7a\") " pod="openshift-insights/insights-runtime-extractor-dqmbb" Apr 17 17:28:54.151752 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.151707 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7r7q\" (UniqueName: 
\"kubernetes.io/projected/021cf46a-9b84-480a-acfc-b41c0da1ca7a-kube-api-access-j7r7q\") pod \"insights-runtime-extractor-dqmbb\" (UID: \"021cf46a-9b84-480a-acfc-b41c0da1ca7a\") " pod="openshift-insights/insights-runtime-extractor-dqmbb" Apr 17 17:28:54.151850 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.151810 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/021cf46a-9b84-480a-acfc-b41c0da1ca7a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dqmbb\" (UID: \"021cf46a-9b84-480a-acfc-b41c0da1ca7a\") " pod="openshift-insights/insights-runtime-extractor-dqmbb" Apr 17 17:28:54.151888 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.151869 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/021cf46a-9b84-480a-acfc-b41c0da1ca7a-crio-socket\") pod \"insights-runtime-extractor-dqmbb\" (UID: \"021cf46a-9b84-480a-acfc-b41c0da1ca7a\") " pod="openshift-insights/insights-runtime-extractor-dqmbb" Apr 17 17:28:54.159848 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.159816 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5b45f68c89-wvxkd"] Apr 17 17:28:54.161789 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.161772 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" Apr 17 17:28:54.164578 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.164556 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 17:28:54.164803 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.164788 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 17:28:54.164889 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.164789 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-pcbgd\"" Apr 17 17:28:54.167411 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.167393 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 17:28:54.172772 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.172753 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 17:28:54.181787 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.181762 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5b45f68c89-wvxkd"] Apr 17 17:28:54.252483 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.252445 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh9kr\" (UniqueName: \"kubernetes.io/projected/d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9-kube-api-access-qh9kr\") pod \"image-registry-5b45f68c89-wvxkd\" (UID: \"d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9\") " pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" Apr 17 17:28:54.252691 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.252514 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/021cf46a-9b84-480a-acfc-b41c0da1ca7a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dqmbb\" (UID: \"021cf46a-9b84-480a-acfc-b41c0da1ca7a\") " pod="openshift-insights/insights-runtime-extractor-dqmbb" Apr 17 17:28:54.252691 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.252568 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9-trusted-ca\") pod \"image-registry-5b45f68c89-wvxkd\" (UID: \"d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9\") " pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" Apr 17 17:28:54.252691 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.252631 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/021cf46a-9b84-480a-acfc-b41c0da1ca7a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dqmbb\" (UID: \"021cf46a-9b84-480a-acfc-b41c0da1ca7a\") " pod="openshift-insights/insights-runtime-extractor-dqmbb" Apr 17 17:28:54.252861 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.252693 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7r7q\" (UniqueName: \"kubernetes.io/projected/021cf46a-9b84-480a-acfc-b41c0da1ca7a-kube-api-access-j7r7q\") pod \"insights-runtime-extractor-dqmbb\" (UID: \"021cf46a-9b84-480a-acfc-b41c0da1ca7a\") " pod="openshift-insights/insights-runtime-extractor-dqmbb" Apr 17 17:28:54.252861 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.252724 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9-registry-certificates\") pod \"image-registry-5b45f68c89-wvxkd\" (UID: \"d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9\") " pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" Apr 17 17:28:54.252861 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.252741 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9-ca-trust-extracted\") pod \"image-registry-5b45f68c89-wvxkd\" (UID: \"d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9\") " pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" Apr 17 17:28:54.252861 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.252760 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9-bound-sa-token\") pod \"image-registry-5b45f68c89-wvxkd\" (UID: \"d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9\") " pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" Apr 17 17:28:54.252861 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.252794 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/021cf46a-9b84-480a-acfc-b41c0da1ca7a-crio-socket\") pod \"insights-runtime-extractor-dqmbb\" (UID: \"021cf46a-9b84-480a-acfc-b41c0da1ca7a\") " pod="openshift-insights/insights-runtime-extractor-dqmbb" Apr 17 17:28:54.252861 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.252827 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9-image-registry-private-configuration\") pod \"image-registry-5b45f68c89-wvxkd\" (UID: \"d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9\") " pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" Apr 17 17:28:54.252861 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.252861 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9-installation-pull-secrets\") pod \"image-registry-5b45f68c89-wvxkd\" (UID: \"d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9\") " pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" Apr 17 17:28:54.253134 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.252889 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/021cf46a-9b84-480a-acfc-b41c0da1ca7a-crio-socket\") pod \"insights-runtime-extractor-dqmbb\" (UID: \"021cf46a-9b84-480a-acfc-b41c0da1ca7a\") " pod="openshift-insights/insights-runtime-extractor-dqmbb" Apr 17 17:28:54.253134 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.252902 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9-registry-tls\") pod \"image-registry-5b45f68c89-wvxkd\" (UID: \"d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9\") " pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" Apr 17 17:28:54.253134 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.252943 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/021cf46a-9b84-480a-acfc-b41c0da1ca7a-data-volume\") pod \"insights-runtime-extractor-dqmbb\" (UID: \"021cf46a-9b84-480a-acfc-b41c0da1ca7a\") " pod="openshift-insights/insights-runtime-extractor-dqmbb" Apr 17 17:28:54.253134 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.253084 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/021cf46a-9b84-480a-acfc-b41c0da1ca7a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dqmbb\" (UID: \"021cf46a-9b84-480a-acfc-b41c0da1ca7a\") " pod="openshift-insights/insights-runtime-extractor-dqmbb" Apr 17 17:28:54.253275 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.253203 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/021cf46a-9b84-480a-acfc-b41c0da1ca7a-data-volume\") pod \"insights-runtime-extractor-dqmbb\" (UID: \"021cf46a-9b84-480a-acfc-b41c0da1ca7a\") " pod="openshift-insights/insights-runtime-extractor-dqmbb" Apr 17 17:28:54.254910 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.254888 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/021cf46a-9b84-480a-acfc-b41c0da1ca7a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dqmbb\" (UID: \"021cf46a-9b84-480a-acfc-b41c0da1ca7a\") " pod="openshift-insights/insights-runtime-extractor-dqmbb" Apr 17 17:28:54.268310 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.268280 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7r7q\" (UniqueName: 
\"kubernetes.io/projected/021cf46a-9b84-480a-acfc-b41c0da1ca7a-kube-api-access-j7r7q\") pod \"insights-runtime-extractor-dqmbb\" (UID: \"021cf46a-9b84-480a-acfc-b41c0da1ca7a\") " pod="openshift-insights/insights-runtime-extractor-dqmbb" Apr 17 17:28:54.353616 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.353519 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qh9kr\" (UniqueName: \"kubernetes.io/projected/d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9-kube-api-access-qh9kr\") pod \"image-registry-5b45f68c89-wvxkd\" (UID: \"d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9\") " pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" Apr 17 17:28:54.353616 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.353579 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9-trusted-ca\") pod \"image-registry-5b45f68c89-wvxkd\" (UID: \"d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9\") " pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" Apr 17 17:28:54.353842 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.353630 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9-registry-certificates\") pod \"image-registry-5b45f68c89-wvxkd\" (UID: \"d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9\") " pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" Apr 17 17:28:54.353842 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.353647 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9-ca-trust-extracted\") pod \"image-registry-5b45f68c89-wvxkd\" (UID: \"d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9\") " pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" Apr 17 17:28:54.353842 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.353666 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9-bound-sa-token\") pod \"image-registry-5b45f68c89-wvxkd\" (UID: \"d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9\") " pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" Apr 17 17:28:54.353842 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.353687 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9-image-registry-private-configuration\") pod \"image-registry-5b45f68c89-wvxkd\" (UID: \"d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9\") " pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" Apr 17 17:28:54.353842 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.353703 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9-installation-pull-secrets\") pod \"image-registry-5b45f68c89-wvxkd\" (UID: \"d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9\") " pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" Apr 17 17:28:54.353842 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.353728 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9-registry-tls\") pod \"image-registry-5b45f68c89-wvxkd\" (UID: \"d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9\") " pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" Apr 17 17:28:54.354113 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.354080 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9-ca-trust-extracted\") pod \"image-registry-5b45f68c89-wvxkd\" (UID: \"d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9\") " pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" Apr 17 17:28:54.354560 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.354538 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9-registry-certificates\") pod \"image-registry-5b45f68c89-wvxkd\" (UID: \"d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9\") " pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" Apr 17 17:28:54.354800 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.354778 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9-trusted-ca\") pod \"image-registry-5b45f68c89-wvxkd\" (UID: \"d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9\") " pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" Apr 17 17:28:54.356283 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.356253 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9-installation-pull-secrets\") pod \"image-registry-5b45f68c89-wvxkd\" (UID: \"d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9\") " pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" Apr 17 17:28:54.356283 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.356275 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9-registry-tls\") pod \"image-registry-5b45f68c89-wvxkd\" (UID: \"d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9\") " pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" Apr 17 17:28:54.356436 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.356363 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9-image-registry-private-configuration\") pod \"image-registry-5b45f68c89-wvxkd\" (UID: \"d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9\") " pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" Apr 17 17:28:54.362157 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.362132 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9-bound-sa-token\") pod \"image-registry-5b45f68c89-wvxkd\" (UID: \"d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9\") " pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" Apr 17 17:28:54.362248 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.362232 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh9kr\" (UniqueName: \"kubernetes.io/projected/d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9-kube-api-access-qh9kr\") pod 
\"image-registry-5b45f68c89-wvxkd\" (UID: \"d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9\") " pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" Apr 17 17:28:54.386425 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.386400 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dqmbb" Apr 17 17:28:54.471047 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.471003 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" Apr 17 17:28:54.507346 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.507315 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dqmbb"] Apr 17 17:28:54.510980 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:28:54.510922 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod021cf46a_9b84_480a_acfc_b41c0da1ca7a.slice/crio-0b9fb487343a1cb3793dffde3edca29d132891957ecb08f08c2a4724e0ce6f78 WatchSource:0}: Error finding container 0b9fb487343a1cb3793dffde3edca29d132891957ecb08f08c2a4724e0ce6f78: Status 404 returned error can't find the container with id 0b9fb487343a1cb3793dffde3edca29d132891957ecb08f08c2a4724e0ce6f78 Apr 17 17:28:54.606707 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:54.606619 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5b45f68c89-wvxkd"] Apr 17 17:28:54.609499 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:28:54.609456 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1f1ce73_bdd1_4cc9_8de9_b45fbaf27ce9.slice/crio-2eb796ae690edf3337f411522e8899f95ab3963750eb9c92e230f218f498890a WatchSource:0}: Error finding container 2eb796ae690edf3337f411522e8899f95ab3963750eb9c92e230f218f498890a: Status 404 returned error can't find the container with id 2eb796ae690edf3337f411522e8899f95ab3963750eb9c92e230f218f498890a Apr 17 17:28:55.420732 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:55.420694 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" event={"ID":"d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9","Type":"ContainerStarted","Data":"02483a5d8dc6db56703735cb83ebdb70b77673a1ddaffeb1f3a6394f9986dba3"} Apr 17 17:28:55.421182 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:55.420737 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" event={"ID":"d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9","Type":"ContainerStarted","Data":"2eb796ae690edf3337f411522e8899f95ab3963750eb9c92e230f218f498890a"} Apr 17 17:28:55.421182 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:55.420802 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" Apr 17 17:28:55.422153 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:55.422130 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dqmbb" event={"ID":"021cf46a-9b84-480a-acfc-b41c0da1ca7a","Type":"ContainerStarted","Data":"5d1f429bbb21624d83622184e1d2d1d09e5b6945d879963dc3be7a8c7ce95595"} Apr 17 17:28:55.422246 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:55.422157 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-dqmbb" event={"ID":"021cf46a-9b84-480a-acfc-b41c0da1ca7a","Type":"ContainerStarted","Data":"92d9994dd0c7b47de5aae17c420a0497f3aaadace81837dcf27be49fbfb038e2"} Apr 17 17:28:55.422246 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:55.422170 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dqmbb" event={"ID":"021cf46a-9b84-480a-acfc-b41c0da1ca7a","Type":"ContainerStarted","Data":"0b9fb487343a1cb3793dffde3edca29d132891957ecb08f08c2a4724e0ce6f78"} Apr 17 17:28:57.429228 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:57.429191 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dqmbb" event={"ID":"021cf46a-9b84-480a-acfc-b41c0da1ca7a","Type":"ContainerStarted","Data":"e599a7e7c45a1e4cc515abd282c41a0c455b8d0913ca9dceb009fcb37a52fa03"} Apr 17 17:28:57.448341 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:57.448295 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-dqmbb" podStartSLOduration=1.280927229 podStartE2EDuration="3.448280556s" podCreationTimestamp="2026-04-17 17:28:54 +0000 UTC" firstStartedPulling="2026-04-17 17:28:54.570183653 +0000 UTC m=+152.292126642" lastFinishedPulling="2026-04-17 17:28:56.737536988 +0000 UTC m=+154.459479969" observedRunningTime="2026-04-17 17:28:57.447447123 +0000 UTC m=+155.169390133" watchObservedRunningTime="2026-04-17 17:28:57.448280556 +0000 UTC m=+155.170223554" Apr 17 17:28:57.448499 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:57.448393 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" podStartSLOduration=3.448387318 podStartE2EDuration="3.448387318s" podCreationTimestamp="2026-04-17 17:28:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:28:55.43966703 +0000 UTC m=+153.161610065" watchObservedRunningTime="2026-04-17 17:28:57.448387318 +0000 UTC m=+155.170330316" Apr 17 17:28:57.687912 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:57.687868 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-55f58" podUID="9bd2cdcc-c4b5-446f-8f64-6c123730399d" Apr 17 17:28:57.702160 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:57.702117 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-9lmqr" podUID="802564e4-cdb1-4a5c-80f9-814bd584caa0" Apr 17 17:28:57.876912 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:28:57.873776 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-p9f9z" podUID="bcb4d874-10b6-4167-b452-800ed19b3f79" Apr 17 17:28:58.434387 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:58.434300 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9lmqr" Apr 17 17:28:58.434387 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:58.434305 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-55f58" Apr 17 17:28:59.694284 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:59.694234 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kx9nb\" (UID: \"c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kx9nb" Apr 17 17:28:59.696584 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:59.696552 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kx9nb\" (UID: \"c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kx9nb" Apr 17 17:28:59.943079 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:28:59.943035 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kx9nb" Apr 17 17:29:00.076857 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:00.076821 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-kx9nb"] Apr 17 17:29:00.080246 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:29:00.080214 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc814f0cc_083d_42f2_87fb_6ac3ce3ab5bf.slice/crio-7a619776cb427d11dadfd0465c95f2a42ac2804def7a7a0d3a28ebe39d2c7f44 WatchSource:0}: Error finding container 7a619776cb427d11dadfd0465c95f2a42ac2804def7a7a0d3a28ebe39d2c7f44: Status 404 returned error can't find the container with id 7a619776cb427d11dadfd0465c95f2a42ac2804def7a7a0d3a28ebe39d2c7f44 Apr 17 17:29:00.439773 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:00.439730 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kx9nb" event={"ID":"c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf","Type":"ContainerStarted","Data":"7a619776cb427d11dadfd0465c95f2a42ac2804def7a7a0d3a28ebe39d2c7f44"} Apr 17 17:29:02.445734 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:02.445696 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kx9nb" event={"ID":"c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf","Type":"ContainerStarted","Data":"15986d16da8a2e63dc1cd8f079828180a9c6695023365fd29272699a642aa373"} Apr 17 17:29:02.463896 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:02.463841 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kx9nb" podStartSLOduration=33.861315034 podStartE2EDuration="35.463823996s" podCreationTimestamp="2026-04-17 17:28:27 +0000 UTC" firstStartedPulling="2026-04-17 17:29:00.082540332 +0000 UTC m=+157.804483308" lastFinishedPulling="2026-04-17 17:29:01.685049278 +0000 UTC m=+159.406992270" observedRunningTime="2026-04-17 17:29:02.462711089 +0000 UTC m=+160.184654087" watchObservedRunningTime="2026-04-17 17:29:02.463823996 +0000 UTC m=+160.185766994" Apr 17 17:29:02.619483 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:02.619373 2577 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9bd2cdcc-c4b5-446f-8f64-6c123730399d-metrics-tls\") pod \"dns-default-55f58\" (UID: \"9bd2cdcc-c4b5-446f-8f64-6c123730399d\") " pod="openshift-dns/dns-default-55f58" Apr 17 17:29:02.619676 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:02.619548 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/802564e4-cdb1-4a5c-80f9-814bd584caa0-cert\") pod \"ingress-canary-9lmqr\" (UID: \"802564e4-cdb1-4a5c-80f9-814bd584caa0\") " pod="openshift-ingress-canary/ingress-canary-9lmqr" Apr 17 17:29:02.621900 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:02.621868 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9bd2cdcc-c4b5-446f-8f64-6c123730399d-metrics-tls\") pod \"dns-default-55f58\" (UID: \"9bd2cdcc-c4b5-446f-8f64-6c123730399d\") " pod="openshift-dns/dns-default-55f58" Apr 17 17:29:02.622039 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:02.621926 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/802564e4-cdb1-4a5c-80f9-814bd584caa0-cert\") pod \"ingress-canary-9lmqr\" (UID: \"802564e4-cdb1-4a5c-80f9-814bd584caa0\") " pod="openshift-ingress-canary/ingress-canary-9lmqr" Apr 17 17:29:02.638788 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:02.638763 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dxpw7\"" Apr 17 17:29:02.638788 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:02.638761 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-29fk4\"" Apr 17 17:29:02.646103 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:02.646080 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-55f58" Apr 17 17:29:02.646160 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:02.646101 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9lmqr" Apr 17 17:29:02.776051 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:02.776018 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9lmqr"] Apr 17 17:29:02.779065 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:29:02.779024 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod802564e4_cdb1_4a5c_80f9_814bd584caa0.slice/crio-88438ae1d7e947f2ce91e5040aec11613a930e6fceb1579b35ae12ea8cc2c1c2 WatchSource:0}: Error finding container 88438ae1d7e947f2ce91e5040aec11613a930e6fceb1579b35ae12ea8cc2c1c2: Status 404 returned error can't find the container with id 88438ae1d7e947f2ce91e5040aec11613a930e6fceb1579b35ae12ea8cc2c1c2 Apr 17 17:29:02.799760 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:02.799724 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-55f58"] Apr 17 17:29:02.804085 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:29:02.804057 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bd2cdcc_c4b5_446f_8f64_6c123730399d.slice/crio-540670a1377dda851a8a81bfb30d84c3a1ff76fb7e3f7a86d8b2c76dbe2c782b WatchSource:0}: Error finding container 540670a1377dda851a8a81bfb30d84c3a1ff76fb7e3f7a86d8b2c76dbe2c782b: Status 404 returned error can't find the container with id 540670a1377dda851a8a81bfb30d84c3a1ff76fb7e3f7a86d8b2c76dbe2c782b Apr 17 17:29:03.450077 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:03.450031 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9lmqr" event={"ID":"802564e4-cdb1-4a5c-80f9-814bd584caa0","Type":"ContainerStarted","Data":"88438ae1d7e947f2ce91e5040aec11613a930e6fceb1579b35ae12ea8cc2c1c2"} Apr 17 17:29:03.451227 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:03.451192 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-55f58" event={"ID":"9bd2cdcc-c4b5-446f-8f64-6c123730399d","Type":"ContainerStarted","Data":"540670a1377dda851a8a81bfb30d84c3a1ff76fb7e3f7a86d8b2c76dbe2c782b"} Apr 17 17:29:04.455316 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:04.455278 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-55f58" event={"ID":"9bd2cdcc-c4b5-446f-8f64-6c123730399d","Type":"ContainerStarted","Data":"98ce963c8c64d0db20e33a15a65b8360bacad8aef16aa5cdb54f4bb068e4d576"} Apr 17 17:29:04.455787 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:04.455342 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-55f58" event={"ID":"9bd2cdcc-c4b5-446f-8f64-6c123730399d","Type":"ContainerStarted","Data":"97977d8d5f68780d3aff8bab2a914f6bf58102405722296d4088fe359bf638d0"} Apr 17 17:29:04.455787 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:04.455503 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-55f58" Apr 17 17:29:04.477437 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:04.477391 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-55f58" podStartSLOduration=129.308439563 podStartE2EDuration="2m10.477376246s" podCreationTimestamp="2026-04-17 17:26:54 +0000 UTC" firstStartedPulling="2026-04-17 17:29:02.80650976 +0000 UTC m=+160.528452737" lastFinishedPulling="2026-04-17 17:29:03.975446431 +0000 UTC m=+161.697389420" 
observedRunningTime="2026-04-17 17:29:04.47680675 +0000 UTC m=+162.198749760" watchObservedRunningTime="2026-04-17 17:29:04.477376246 +0000 UTC m=+162.199319243" Apr 17 17:29:05.459098 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:05.459011 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9lmqr" event={"ID":"802564e4-cdb1-4a5c-80f9-814bd584caa0","Type":"ContainerStarted","Data":"654a88fc8edf1c5af1046dad0045d2b442561dbfbe1b0cb2c969283f185ee69e"} Apr 17 17:29:05.475926 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:05.475869 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9lmqr" podStartSLOduration=129.489578178 podStartE2EDuration="2m11.475851155s" podCreationTimestamp="2026-04-17 17:26:54 +0000 UTC" firstStartedPulling="2026-04-17 17:29:02.781123946 +0000 UTC m=+160.503066923" lastFinishedPulling="2026-04-17 17:29:04.767396922 +0000 UTC m=+162.489339900" observedRunningTime="2026-04-17 17:29:05.475063972 +0000 UTC m=+163.197006973" watchObservedRunningTime="2026-04-17 17:29:05.475851155 +0000 UTC m=+163.197794152" Apr 17 17:29:09.470138 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:09.470103 2577 generic.go:358] "Generic (PLEG): container finished" podID="ecce913b-c258-42db-8465-b935c4a7b028" containerID="91f13ce438ce59ab8733a3bd137c2a85c56eb9b610b8633dc7a0830a58df4c64" exitCode=255 Apr 17 17:29:09.470508 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:09.470178 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7754c6b749-v428h" event={"ID":"ecce913b-c258-42db-8465-b935c4a7b028","Type":"ContainerDied","Data":"91f13ce438ce59ab8733a3bd137c2a85c56eb9b610b8633dc7a0830a58df4c64"} Apr 17 17:29:09.475820 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:09.475801 2577 scope.go:117] "RemoveContainer" containerID="91f13ce438ce59ab8733a3bd137c2a85c56eb9b610b8633dc7a0830a58df4c64" Apr 17 17:29:10.475091 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:10.475057 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7754c6b749-v428h" event={"ID":"ecce913b-c258-42db-8465-b935c4a7b028","Type":"ContainerStarted","Data":"5779e53f647330e5012e20815669ee2097da0de517cf61a981faf8134d121081"} Apr 17 17:29:11.586573 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.586537 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-5m7hn"] Apr 17 17:29:11.589933 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.589909 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5m7hn" Apr 17 17:29:11.592459 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.592417 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 17 17:29:11.592459 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.592454 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-qz72z\"" Apr 17 17:29:11.592658 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.592455 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 17 17:29:11.594518 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.594500 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 17:29:11.606231 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.606207 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-5m7hn"] Apr 17 17:29:11.671198 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.671160 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-k4jf6"] Apr 17 17:29:11.674391 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.674368 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:11.676604 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.676579 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 17:29:11.676815 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.676797 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2tjqb\"" Apr 17 17:29:11.677064 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.677048 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 17:29:11.677372 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.677356 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 17:29:11.693254 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.693220 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7da85e0c-7193-4788-a9ae-9bc72db222ca-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-5m7hn\" (UID: \"7da85e0c-7193-4788-a9ae-9bc72db222ca\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5m7hn" Apr 17 17:29:11.693413 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.693269 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9l8z\" (UniqueName: \"kubernetes.io/projected/7da85e0c-7193-4788-a9ae-9bc72db222ca-kube-api-access-j9l8z\") pod \"openshift-state-metrics-9d44df66c-5m7hn\" (UID: \"7da85e0c-7193-4788-a9ae-9bc72db222ca\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5m7hn" Apr 17 17:29:11.693413 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.693291 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7da85e0c-7193-4788-a9ae-9bc72db222ca-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-5m7hn\" (UID: \"7da85e0c-7193-4788-a9ae-9bc72db222ca\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5m7hn" Apr 17 17:29:11.693413 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.693345 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7da85e0c-7193-4788-a9ae-9bc72db222ca-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-5m7hn\" (UID: \"7da85e0c-7193-4788-a9ae-9bc72db222ca\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5m7hn" Apr 17 17:29:11.794274 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.794226 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9l8z\" (UniqueName: \"kubernetes.io/projected/7da85e0c-7193-4788-a9ae-9bc72db222ca-kube-api-access-j9l8z\") pod \"openshift-state-metrics-9d44df66c-5m7hn\" (UID: \"7da85e0c-7193-4788-a9ae-9bc72db222ca\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5m7hn" Apr 17 17:29:11.794274 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.794273 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/44ce145d-5623-4047-927e-65d3af3448da-node-exporter-textfile\") pod \"node-exporter-k4jf6\" (UID: \"44ce145d-5623-4047-927e-65d3af3448da\") " pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:11.794534 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.794295 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7da85e0c-7193-4788-a9ae-9bc72db222ca-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-5m7hn\" (UID: \"7da85e0c-7193-4788-a9ae-9bc72db222ca\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5m7hn" Apr 17 17:29:11.794534 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.794314 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7da85e0c-7193-4788-a9ae-9bc72db222ca-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-5m7hn\" (UID: \"7da85e0c-7193-4788-a9ae-9bc72db222ca\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5m7hn" Apr 17 17:29:11.794534 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.794334 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/44ce145d-5623-4047-927e-65d3af3448da-node-exporter-wtmp\") pod \"node-exporter-k4jf6\" (UID: \"44ce145d-5623-4047-927e-65d3af3448da\") " pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:11.794534 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.794351 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/44ce145d-5623-4047-927e-65d3af3448da-metrics-client-ca\") pod \"node-exporter-k4jf6\" (UID: \"44ce145d-5623-4047-927e-65d3af3448da\") " 
pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:11.794534 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.794378 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/44ce145d-5623-4047-927e-65d3af3448da-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-k4jf6\" (UID: \"44ce145d-5623-4047-927e-65d3af3448da\") " pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:11.794534 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.794454 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/44ce145d-5623-4047-927e-65d3af3448da-sys\") pod \"node-exporter-k4jf6\" (UID: \"44ce145d-5623-4047-927e-65d3af3448da\") " pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:11.794534 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.794514 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlk8t\" (UniqueName: \"kubernetes.io/projected/44ce145d-5623-4047-927e-65d3af3448da-kube-api-access-mlk8t\") pod \"node-exporter-k4jf6\" (UID: \"44ce145d-5623-4047-927e-65d3af3448da\") " pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:11.794782 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.794544 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/44ce145d-5623-4047-927e-65d3af3448da-root\") pod \"node-exporter-k4jf6\" (UID: \"44ce145d-5623-4047-927e-65d3af3448da\") " pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:11.794782 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.794585 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/44ce145d-5623-4047-927e-65d3af3448da-node-exporter-tls\") pod \"node-exporter-k4jf6\" (UID: \"44ce145d-5623-4047-927e-65d3af3448da\") " pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:11.794782 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.794628 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/44ce145d-5623-4047-927e-65d3af3448da-node-exporter-accelerators-collector-config\") pod \"node-exporter-k4jf6\" (UID: \"44ce145d-5623-4047-927e-65d3af3448da\") " pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:11.794782 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.794704 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7da85e0c-7193-4788-a9ae-9bc72db222ca-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-5m7hn\" (UID: \"7da85e0c-7193-4788-a9ae-9bc72db222ca\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5m7hn" Apr 17 17:29:11.794898 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:29:11.794797 2577 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 17 17:29:11.794898 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:29:11.794851 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7da85e0c-7193-4788-a9ae-9bc72db222ca-openshift-state-metrics-tls 
podName:7da85e0c-7193-4788-a9ae-9bc72db222ca nodeName:}" failed. No retries permitted until 2026-04-17 17:29:12.29483559 +0000 UTC m=+170.016778571 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/7da85e0c-7193-4788-a9ae-9bc72db222ca-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-5m7hn" (UID: "7da85e0c-7193-4788-a9ae-9bc72db222ca") : secret "openshift-state-metrics-tls" not found Apr 17 17:29:11.795034 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.795016 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7da85e0c-7193-4788-a9ae-9bc72db222ca-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-5m7hn\" (UID: \"7da85e0c-7193-4788-a9ae-9bc72db222ca\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5m7hn" Apr 17 17:29:11.796705 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.796686 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7da85e0c-7193-4788-a9ae-9bc72db222ca-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-5m7hn\" (UID: \"7da85e0c-7193-4788-a9ae-9bc72db222ca\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5m7hn" Apr 17 17:29:11.803328 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.803306 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9l8z\" (UniqueName: \"kubernetes.io/projected/7da85e0c-7193-4788-a9ae-9bc72db222ca-kube-api-access-j9l8z\") pod \"openshift-state-metrics-9d44df66c-5m7hn\" (UID: \"7da85e0c-7193-4788-a9ae-9bc72db222ca\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5m7hn" Apr 17 17:29:11.850447 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.850360 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:29:11.895278 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.895239 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/44ce145d-5623-4047-927e-65d3af3448da-sys\") pod \"node-exporter-k4jf6\" (UID: \"44ce145d-5623-4047-927e-65d3af3448da\") " pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:11.895278 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.895283 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mlk8t\" (UniqueName: \"kubernetes.io/projected/44ce145d-5623-4047-927e-65d3af3448da-kube-api-access-mlk8t\") pod \"node-exporter-k4jf6\" (UID: \"44ce145d-5623-4047-927e-65d3af3448da\") " pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:11.895549 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.895307 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/44ce145d-5623-4047-927e-65d3af3448da-root\") pod \"node-exporter-k4jf6\" (UID: \"44ce145d-5623-4047-927e-65d3af3448da\") " pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:11.895549 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.895354 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/44ce145d-5623-4047-927e-65d3af3448da-root\") pod \"node-exporter-k4jf6\" (UID: \"44ce145d-5623-4047-927e-65d3af3448da\") " pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:11.895549 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.895355 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/44ce145d-5623-4047-927e-65d3af3448da-node-exporter-tls\") pod \"node-exporter-k4jf6\" (UID: \"44ce145d-5623-4047-927e-65d3af3448da\") " pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:11.895549 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.895386 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/44ce145d-5623-4047-927e-65d3af3448da-sys\") pod \"node-exporter-k4jf6\" (UID: \"44ce145d-5623-4047-927e-65d3af3448da\") " pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:11.895549 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.895407 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/44ce145d-5623-4047-927e-65d3af3448da-node-exporter-accelerators-collector-config\") pod \"node-exporter-k4jf6\" (UID: \"44ce145d-5623-4047-927e-65d3af3448da\") " pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:11.895549 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:29:11.895503 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 17:29:11.895549 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.895534 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/44ce145d-5623-4047-927e-65d3af3448da-node-exporter-textfile\") pod \"node-exporter-k4jf6\" (UID: \"44ce145d-5623-4047-927e-65d3af3448da\") " pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:11.895549 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:29:11.895554 2577 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44ce145d-5623-4047-927e-65d3af3448da-node-exporter-tls podName:44ce145d-5623-4047-927e-65d3af3448da nodeName:}" failed. No retries permitted until 2026-04-17 17:29:12.395539413 +0000 UTC m=+170.117482395 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/44ce145d-5623-4047-927e-65d3af3448da-node-exporter-tls") pod "node-exporter-k4jf6" (UID: "44ce145d-5623-4047-927e-65d3af3448da") : secret "node-exporter-tls" not found Apr 17 17:29:11.895904 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.895591 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/44ce145d-5623-4047-927e-65d3af3448da-node-exporter-wtmp\") pod \"node-exporter-k4jf6\" (UID: \"44ce145d-5623-4047-927e-65d3af3448da\") " pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:11.895904 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.895615 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/44ce145d-5623-4047-927e-65d3af3448da-metrics-client-ca\") pod \"node-exporter-k4jf6\" (UID: \"44ce145d-5623-4047-927e-65d3af3448da\") " pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:11.895904 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.895636 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/44ce145d-5623-4047-927e-65d3af3448da-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-k4jf6\" (UID: \"44ce145d-5623-4047-927e-65d3af3448da\") " pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:11.895904 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.895765 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/44ce145d-5623-4047-927e-65d3af3448da-node-exporter-wtmp\") pod \"node-exporter-k4jf6\" (UID: \"44ce145d-5623-4047-927e-65d3af3448da\") " pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:11.896066 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.895973 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/44ce145d-5623-4047-927e-65d3af3448da-node-exporter-textfile\") pod \"node-exporter-k4jf6\" (UID: \"44ce145d-5623-4047-927e-65d3af3448da\") " pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:11.896197 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.896175 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/44ce145d-5623-4047-927e-65d3af3448da-node-exporter-accelerators-collector-config\") pod \"node-exporter-k4jf6\" (UID: \"44ce145d-5623-4047-927e-65d3af3448da\") " pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:11.896238 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.896203 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/44ce145d-5623-4047-927e-65d3af3448da-metrics-client-ca\") pod \"node-exporter-k4jf6\" (UID: \"44ce145d-5623-4047-927e-65d3af3448da\") " pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:11.897930 ip-10-0-139-96 kubenswrapper[2577]: I0417 
17:29:11.897914 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/44ce145d-5623-4047-927e-65d3af3448da-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-k4jf6\" (UID: \"44ce145d-5623-4047-927e-65d3af3448da\") " pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:11.903842 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:11.903818 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlk8t\" (UniqueName: \"kubernetes.io/projected/44ce145d-5623-4047-927e-65d3af3448da-kube-api-access-mlk8t\") pod \"node-exporter-k4jf6\" (UID: \"44ce145d-5623-4047-927e-65d3af3448da\") " pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:12.199832 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:12.199786 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f45b57878-7zrzx" podUID="6e9408b6-5cf2-4fc7-bb52-f5f36b2c35de" containerName="acm-agent" probeResult="failure" output="Get \"http://10.133.0.9:8000/readyz\": dial tcp 10.133.0.9:8000: connect: connection refused" Apr 17 17:29:12.298972 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:12.298925 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7da85e0c-7193-4788-a9ae-9bc72db222ca-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-5m7hn\" (UID: \"7da85e0c-7193-4788-a9ae-9bc72db222ca\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5m7hn" Apr 17 17:29:12.301359 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:12.301336 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7da85e0c-7193-4788-a9ae-9bc72db222ca-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-5m7hn\" (UID: \"7da85e0c-7193-4788-a9ae-9bc72db222ca\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5m7hn" Apr 17 17:29:12.400210 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:12.400162 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/44ce145d-5623-4047-927e-65d3af3448da-node-exporter-tls\") pod \"node-exporter-k4jf6\" (UID: \"44ce145d-5623-4047-927e-65d3af3448da\") " pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:12.402431 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:12.402411 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/44ce145d-5623-4047-927e-65d3af3448da-node-exporter-tls\") pod \"node-exporter-k4jf6\" (UID: \"44ce145d-5623-4047-927e-65d3af3448da\") " pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:12.481615 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:12.481529 2577 generic.go:358] "Generic (PLEG): container finished" podID="6e9408b6-5cf2-4fc7-bb52-f5f36b2c35de" containerID="c67782ef64fb8c9755b995511389496873d7c143753d9a671d26bdf9f011013c" exitCode=1 Apr 17 17:29:12.481615 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:12.481571 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f45b57878-7zrzx" event={"ID":"6e9408b6-5cf2-4fc7-bb52-f5f36b2c35de","Type":"ContainerDied","Data":"c67782ef64fb8c9755b995511389496873d7c143753d9a671d26bdf9f011013c"} Apr 
17 17:29:12.481882 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:12.481870 2577 scope.go:117] "RemoveContainer" containerID="c67782ef64fb8c9755b995511389496873d7c143753d9a671d26bdf9f011013c" Apr 17 17:29:12.501133 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:12.501102 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5m7hn" Apr 17 17:29:12.585044 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:12.584935 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-k4jf6" Apr 17 17:29:12.595962 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:29:12.595926 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44ce145d_5623_4047_927e_65d3af3448da.slice/crio-9a4ab39bb5c52ab3ec71226dcc74080a42b2c30b9aeca61d8f9e00cf66dc6972 WatchSource:0}: Error finding container 9a4ab39bb5c52ab3ec71226dcc74080a42b2c30b9aeca61d8f9e00cf66dc6972: Status 404 returned error can't find the container with id 9a4ab39bb5c52ab3ec71226dcc74080a42b2c30b9aeca61d8f9e00cf66dc6972 Apr 17 17:29:12.643646 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:12.643605 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-5m7hn"] Apr 17 17:29:12.647122 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:29:12.647087 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7da85e0c_7193_4788_a9ae_9bc72db222ca.slice/crio-5dcfbb7c7a032d59b156f108128e3b98ab1c50006404d1d4d64d34008ddd88b8 WatchSource:0}: Error finding container 5dcfbb7c7a032d59b156f108128e3b98ab1c50006404d1d4d64d34008ddd88b8: Status 404 returned error can't find the container with id 5dcfbb7c7a032d59b156f108128e3b98ab1c50006404d1d4d64d34008ddd88b8 Apr 17 17:29:13.487201 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:13.487161 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5m7hn" event={"ID":"7da85e0c-7193-4788-a9ae-9bc72db222ca","Type":"ContainerStarted","Data":"4a27d445ec8b886f89d3603a9983bafa0c517ce136893dcb6defadf85045385d"} Apr 17 17:29:13.487201 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:13.487209 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5m7hn" event={"ID":"7da85e0c-7193-4788-a9ae-9bc72db222ca","Type":"ContainerStarted","Data":"96f8e4f8fcea2bbd0019d93ec5537e67676375758f6c340d5a3322ca0e1df1d9"} Apr 17 17:29:13.487486 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:13.487223 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5m7hn" event={"ID":"7da85e0c-7193-4788-a9ae-9bc72db222ca","Type":"ContainerStarted","Data":"5dcfbb7c7a032d59b156f108128e3b98ab1c50006404d1d4d64d34008ddd88b8"} Apr 17 17:29:13.488741 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:13.488708 2577 generic.go:358] "Generic (PLEG): container finished" podID="44ce145d-5623-4047-927e-65d3af3448da" containerID="e8b089bc583ff9995a47fdec02295e3c4ac2850206db015316a0b2d371de2bd4" exitCode=0 Apr 17 17:29:13.488879 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:13.488796 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k4jf6" 
event={"ID":"44ce145d-5623-4047-927e-65d3af3448da","Type":"ContainerDied","Data":"e8b089bc583ff9995a47fdec02295e3c4ac2850206db015316a0b2d371de2bd4"} Apr 17 17:29:13.488879 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:13.488844 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k4jf6" event={"ID":"44ce145d-5623-4047-927e-65d3af3448da","Type":"ContainerStarted","Data":"9a4ab39bb5c52ab3ec71226dcc74080a42b2c30b9aeca61d8f9e00cf66dc6972"} Apr 17 17:29:13.490487 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:13.490449 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f45b57878-7zrzx" event={"ID":"6e9408b6-5cf2-4fc7-bb52-f5f36b2c35de","Type":"ContainerStarted","Data":"8a077a432050bec4f4b1717d76a274f4f52b15a5c22c9bf56322a5fe9e732cd7"} Apr 17 17:29:13.491073 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:13.491050 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f45b57878-7zrzx" Apr 17 17:29:13.491645 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:13.491616 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f45b57878-7zrzx" Apr 17 17:29:14.461811 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:14.461777 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-55f58" Apr 17 17:29:14.474877 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:14.474841 2577 patch_prober.go:28] interesting pod/image-registry-5b45f68c89-wvxkd container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 17:29:14.474969 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:14.474903 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" podUID="d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:29:14.496527 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:14.496491 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5m7hn" event={"ID":"7da85e0c-7193-4788-a9ae-9bc72db222ca","Type":"ContainerStarted","Data":"03db95c2d4665ddce8c920dd52ef4742dcc0b465d9a20081223a082f6b1ee3ee"} Apr 17 17:29:14.498491 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:14.498448 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k4jf6" event={"ID":"44ce145d-5623-4047-927e-65d3af3448da","Type":"ContainerStarted","Data":"b0d295d36a1181accf4af61763744f6faadcba2ec7ea7805301ede7b2fc5145a"} Apr 17 17:29:14.498620 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:14.498496 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k4jf6" event={"ID":"44ce145d-5623-4047-927e-65d3af3448da","Type":"ContainerStarted","Data":"35cf93cffe351ab67762aa4f6f25f991bdfbd450ca78e4430f34ef0002c1735d"} Apr 17 17:29:14.524131 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:14.524046 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5m7hn" 
podStartSLOduration=2.402563895 podStartE2EDuration="3.524028493s" podCreationTimestamp="2026-04-17 17:29:11 +0000 UTC" firstStartedPulling="2026-04-17 17:29:12.780253771 +0000 UTC m=+170.502196750" lastFinishedPulling="2026-04-17 17:29:13.901718368 +0000 UTC m=+171.623661348" observedRunningTime="2026-04-17 17:29:14.523583574 +0000 UTC m=+172.245526572" watchObservedRunningTime="2026-04-17 17:29:14.524028493 +0000 UTC m=+172.245971492" Apr 17 17:29:14.540537 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:14.540463 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-k4jf6" podStartSLOduration=2.890188242 podStartE2EDuration="3.54044521s" podCreationTimestamp="2026-04-17 17:29:11 +0000 UTC" firstStartedPulling="2026-04-17 17:29:12.599036332 +0000 UTC m=+170.320979323" lastFinishedPulling="2026-04-17 17:29:13.249293311 +0000 UTC m=+170.971236291" observedRunningTime="2026-04-17 17:29:14.539798665 +0000 UTC m=+172.261741675" watchObservedRunningTime="2026-04-17 17:29:14.54044521 +0000 UTC m=+172.262388213" Apr 17 17:29:16.392913 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:16.392871 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-pdbfm"] Apr 17 17:29:16.396363 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:16.396342 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-pdbfm" Apr 17 17:29:16.398399 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:16.398376 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 17 17:29:16.398545 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:16.398438 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-mt8cv\"" Apr 17 17:29:16.404265 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:16.404237 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-pdbfm"] Apr 17 17:29:16.429942 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:16.429915 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5b45f68c89-wvxkd" Apr 17 17:29:16.435788 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:16.435762 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dc8d4458-96ea-4eb9-9628-355967102e97-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-pdbfm\" (UID: \"dc8d4458-96ea-4eb9-9628-355967102e97\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-pdbfm" Apr 17 17:29:16.536672 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:16.536616 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dc8d4458-96ea-4eb9-9628-355967102e97-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-pdbfm\" (UID: \"dc8d4458-96ea-4eb9-9628-355967102e97\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-pdbfm" Apr 17 17:29:16.536955 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:29:16.536931 2577 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 17 17:29:16.537039 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:29:16.537005 2577 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc8d4458-96ea-4eb9-9628-355967102e97-monitoring-plugin-cert podName:dc8d4458-96ea-4eb9-9628-355967102e97 nodeName:}" failed. No retries permitted until 2026-04-17 17:29:17.036986712 +0000 UTC m=+174.758929688 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/dc8d4458-96ea-4eb9-9628-355967102e97-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-pdbfm" (UID: "dc8d4458-96ea-4eb9-9628-355967102e97") : secret "monitoring-plugin-cert" not found Apr 17 17:29:17.041046 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.041002 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dc8d4458-96ea-4eb9-9628-355967102e97-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-pdbfm\" (UID: \"dc8d4458-96ea-4eb9-9628-355967102e97\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-pdbfm" Apr 17 17:29:17.043449 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.043407 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dc8d4458-96ea-4eb9-9628-355967102e97-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-pdbfm\" (UID: \"dc8d4458-96ea-4eb9-9628-355967102e97\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-pdbfm" Apr 17 17:29:17.306668 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.306559 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-pdbfm" Apr 17 17:29:17.445985 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.445948 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-pdbfm"] Apr 17 17:29:17.450967 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:29:17.450932 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc8d4458_96ea_4eb9_9628_355967102e97.slice/crio-66815b637ad75510da8405e84e0f82685f30517ed0e6a49cd38ce4746f137201 WatchSource:0}: Error finding container 66815b637ad75510da8405e84e0f82685f30517ed0e6a49cd38ce4746f137201: Status 404 returned error can't find the container with id 66815b637ad75510da8405e84e0f82685f30517ed0e6a49cd38ce4746f137201 Apr 17 17:29:17.507665 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.507628 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-pdbfm" event={"ID":"dc8d4458-96ea-4eb9-9628-355967102e97","Type":"ContainerStarted","Data":"66815b637ad75510da8405e84e0f82685f30517ed0e6a49cd38ce4746f137201"} Apr 17 17:29:17.846900 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.846868 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:29:17.851982 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.851958 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:17.854822 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.854791 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 17:29:17.854958 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.854879 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 17:29:17.854958 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.854889 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-lv48b\"" Apr 17 17:29:17.854958 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.854878 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 17:29:17.855111 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.854929 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 17:29:17.855302 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.855270 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 17:29:17.855423 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.855273 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 17:29:17.855423 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.855408 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 17:29:17.855583 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.855423 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 17:29:17.855583 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.855507 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 17:29:17.855726 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.855706 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 17:29:17.855783 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.855736 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 17:29:17.855855 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.855837 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-deb0qjhkiqmj1\"" Apr 17 17:29:17.856041 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.856025 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 17:29:17.857222 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.857204 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 17:29:17.863781 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.863757 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:29:17.950013 ip-10-0-139-96 
kubenswrapper[2577]: I0417 17:29:17.949972 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:17.950013 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.950013 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:17.950235 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.950051 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-web-config\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:17.950235 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.950102 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:17.950235 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.950137 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-config\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:17.950235 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.950160 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:17.950235 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.950176 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn7qs\" (UniqueName: \"kubernetes.io/projected/e8749fd2-aa4c-421a-a967-fbec8fced636-kube-api-access-xn7qs\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:17.950235 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.950222 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:17.950432 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.950291 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e8749fd2-aa4c-421a-a967-fbec8fced636-config-out\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:17.950432 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.950318 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e8749fd2-aa4c-421a-a967-fbec8fced636-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:17.950432 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.950333 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:17.950432 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.950349 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:17.950432 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.950368 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:17.950432 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.950399 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:17.950432 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.950426 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:17.950757 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.950507 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:17.950757 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.950545 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" 
(UniqueName: \"kubernetes.io/empty-dir/e8749fd2-aa4c-421a-a967-fbec8fced636-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:17.950757 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:17.950585 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.051492 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.051430 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.051492 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.051493 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-config\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.051741 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.051516 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.051741 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.051537 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xn7qs\" (UniqueName: \"kubernetes.io/projected/e8749fd2-aa4c-421a-a967-fbec8fced636-kube-api-access-xn7qs\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.051741 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.051555 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.051741 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.051580 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e8749fd2-aa4c-421a-a967-fbec8fced636-config-out\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.051741 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.051650 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e8749fd2-aa4c-421a-a967-fbec8fced636-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.051741 ip-10-0-139-96 
kubenswrapper[2577]: I0417 17:29:18.051676 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.051741 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.051700 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.051741 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.051723 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.052172 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.051760 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.052172 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.051795 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.052172 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.051828 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.052172 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.051853 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e8749fd2-aa4c-421a-a967-fbec8fced636-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.052172 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.051887 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.052172 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.051915 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.052172 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.051946 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.052172 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.051989 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-web-config\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.052584 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.052487 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.055731 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.055435 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-config\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.056261 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.055934 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.056261 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.056054 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.056410 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.056375 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e8749fd2-aa4c-421a-a967-fbec8fced636-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.057312 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.057285 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.057878 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.057856 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.058575 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.058340 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.058670 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.058591 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e8749fd2-aa4c-421a-a967-fbec8fced636-config-out\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.058670 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.058644 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.059334 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.059312 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.061007 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.060112 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.061250 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.061228 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-web-config\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.061385 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.061297 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e8749fd2-aa4c-421a-a967-fbec8fced636-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.061520 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.061305 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn7qs\" (UniqueName: \"kubernetes.io/projected/e8749fd2-aa4c-421a-a967-fbec8fced636-kube-api-access-xn7qs\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.063044 ip-10-0-139-96 
kubenswrapper[2577]: I0417 17:29:18.063022 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.063359 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.063335 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.064127 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.064110 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.162153 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.162057 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:29:18.308282 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.308239 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:29:18.312481 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:29:18.312424 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8749fd2_aa4c_421a_a967_fbec8fced636.slice/crio-25a28495b791413968a954de35c72d8c022c85904bb37dfd2334a7cc639b608b WatchSource:0}: Error finding container 25a28495b791413968a954de35c72d8c022c85904bb37dfd2334a7cc639b608b: Status 404 returned error can't find the container with id 25a28495b791413968a954de35c72d8c022c85904bb37dfd2334a7cc639b608b Apr 17 17:29:18.512394 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:18.512353 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8749fd2-aa4c-421a-a967-fbec8fced636","Type":"ContainerStarted","Data":"25a28495b791413968a954de35c72d8c022c85904bb37dfd2334a7cc639b608b"} Apr 17 17:29:19.516079 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:19.516036 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-pdbfm" event={"ID":"dc8d4458-96ea-4eb9-9628-355967102e97","Type":"ContainerStarted","Data":"c21ccb874da3111b147891a6f338cbd818ef9acb779ea5af3a72b4f2e7b568bb"} Apr 17 17:29:19.516544 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:19.516267 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-pdbfm" Apr 17 17:29:19.517609 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:19.517581 2577 generic.go:358] "Generic (PLEG): container finished" podID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerID="9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187" exitCode=0 Apr 17 17:29:19.517707 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:19.517660 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"e8749fd2-aa4c-421a-a967-fbec8fced636","Type":"ContainerDied","Data":"9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187"} Apr 17 17:29:19.521647 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:19.521623 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-pdbfm" Apr 17 17:29:19.531186 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:19.531137 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-pdbfm" podStartSLOduration=2.173505302 podStartE2EDuration="3.53112352s" podCreationTimestamp="2026-04-17 17:29:16 +0000 UTC" firstStartedPulling="2026-04-17 17:29:17.452948204 +0000 UTC m=+175.174891181" lastFinishedPulling="2026-04-17 17:29:18.810566406 +0000 UTC m=+176.532509399" observedRunningTime="2026-04-17 17:29:19.530651456 +0000 UTC m=+177.252594453" watchObservedRunningTime="2026-04-17 17:29:19.53112352 +0000 UTC m=+177.253066518" Apr 17 17:29:22.533244 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:22.533208 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8749fd2-aa4c-421a-a967-fbec8fced636","Type":"ContainerStarted","Data":"26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8"} Apr 17 17:29:22.533244 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:22.533246 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8749fd2-aa4c-421a-a967-fbec8fced636","Type":"ContainerStarted","Data":"9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de"} Apr 17 17:29:24.542431 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:24.542342 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8749fd2-aa4c-421a-a967-fbec8fced636","Type":"ContainerStarted","Data":"9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9"} Apr 17 17:29:24.542431 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:24.542378 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8749fd2-aa4c-421a-a967-fbec8fced636","Type":"ContainerStarted","Data":"9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8"} Apr 17 17:29:24.542431 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:24.542388 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8749fd2-aa4c-421a-a967-fbec8fced636","Type":"ContainerStarted","Data":"fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39"} Apr 17 17:29:24.542431 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:24.542397 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8749fd2-aa4c-421a-a967-fbec8fced636","Type":"ContainerStarted","Data":"efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c"} Apr 17 17:29:24.570704 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:24.570640 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.864465274 podStartE2EDuration="7.570626651s" podCreationTimestamp="2026-04-17 17:29:17 +0000 UTC" firstStartedPulling="2026-04-17 17:29:18.314814149 +0000 UTC m=+176.036757129" lastFinishedPulling="2026-04-17 17:29:24.020975515 +0000 UTC m=+181.742918506" observedRunningTime="2026-04-17 17:29:24.569824615 +0000 UTC 
m=+182.291767637" watchObservedRunningTime="2026-04-17 17:29:24.570626651 +0000 UTC m=+182.292569648" Apr 17 17:29:28.162953 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:29:28.162910 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:18.162827 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:18.162779 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:18.178814 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:18.178784 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:18.713494 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:18.713440 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:33.703613 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:33.703573 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs\") pod \"network-metrics-daemon-p9f9z\" (UID: \"bcb4d874-10b6-4167-b452-800ed19b3f79\") " pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:30:33.706304 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:33.706265 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bcb4d874-10b6-4167-b452-800ed19b3f79-metrics-certs\") pod \"network-metrics-daemon-p9f9z\" (UID: \"bcb4d874-10b6-4167-b452-800ed19b3f79\") " pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:30:33.753898 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:33.753869 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-q7kb6\"" Apr 17 17:30:33.761787 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:33.761756 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9f9z" Apr 17 17:30:33.882444 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:33.882411 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-p9f9z"] Apr 17 17:30:33.886165 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:30:33.886135 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcb4d874_10b6_4167_b452_800ed19b3f79.slice/crio-b712edebf27b10052b8cf9ad1c0ce6182463106d185be976a4a9ead43239fbd0 WatchSource:0}: Error finding container b712edebf27b10052b8cf9ad1c0ce6182463106d185be976a4a9ead43239fbd0: Status 404 returned error can't find the container with id b712edebf27b10052b8cf9ad1c0ce6182463106d185be976a4a9ead43239fbd0 Apr 17 17:30:34.745675 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:34.745626 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-p9f9z" event={"ID":"bcb4d874-10b6-4167-b452-800ed19b3f79","Type":"ContainerStarted","Data":"b712edebf27b10052b8cf9ad1c0ce6182463106d185be976a4a9ead43239fbd0"} Apr 17 17:30:35.750316 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:35.750281 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-p9f9z" event={"ID":"bcb4d874-10b6-4167-b452-800ed19b3f79","Type":"ContainerStarted","Data":"8d78eaf2a9709b21df8315887a67c82eb00384d952226e90b3e944adc170020f"} Apr 17 17:30:35.750316 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:35.750319 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-p9f9z" event={"ID":"bcb4d874-10b6-4167-b452-800ed19b3f79","Type":"ContainerStarted","Data":"56946f405cc10788e289246fa262b6fa872d70606131e7a7f2d8e35d4b058472"} Apr 17 17:30:35.766682 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:35.766629 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-p9f9z" podStartSLOduration=252.689338388 podStartE2EDuration="4m13.766612575s" podCreationTimestamp="2026-04-17 17:26:22 +0000 UTC" firstStartedPulling="2026-04-17 17:30:33.888146641 +0000 UTC m=+251.610089617" lastFinishedPulling="2026-04-17 17:30:34.965420824 +0000 UTC m=+252.687363804" observedRunningTime="2026-04-17 17:30:35.765714311 +0000 UTC m=+253.487657310" watchObservedRunningTime="2026-04-17 17:30:35.766612575 +0000 UTC m=+253.488555572" Apr 17 17:30:36.293445 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.293400 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:30:36.294191 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.294031 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerName="prometheus" containerID="cri-o://9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de" gracePeriod=600 Apr 17 17:30:36.294191 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.294111 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerName="config-reloader" containerID="cri-o://26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8" gracePeriod=600 Apr 17 17:30:36.294191 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.294048 2577 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerName="kube-rbac-proxy" containerID="cri-o://9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8" gracePeriod=600 Apr 17 17:30:36.294191 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.294128 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerName="kube-rbac-proxy-thanos" containerID="cri-o://9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9" gracePeriod=600 Apr 17 17:30:36.294535 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.294056 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerName="thanos-sidecar" containerID="cri-o://efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c" gracePeriod=600 Apr 17 17:30:36.294535 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.294117 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerName="kube-rbac-proxy-web" containerID="cri-o://fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39" gracePeriod=600 Apr 17 17:30:36.529731 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.529704 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.626991 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.626898 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-config\") pod \"e8749fd2-aa4c-421a-a967-fbec8fced636\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " Apr 17 17:30:36.626991 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.626940 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e8749fd2-aa4c-421a-a967-fbec8fced636-prometheus-k8s-db\") pod \"e8749fd2-aa4c-421a-a967-fbec8fced636\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " Apr 17 17:30:36.626991 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.626968 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e8749fd2-aa4c-421a-a967-fbec8fced636-config-out\") pod \"e8749fd2-aa4c-421a-a967-fbec8fced636\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " Apr 17 17:30:36.626991 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.626992 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-configmap-metrics-client-ca\") pod \"e8749fd2-aa4c-421a-a967-fbec8fced636\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " Apr 17 17:30:36.627315 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.627022 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"e8749fd2-aa4c-421a-a967-fbec8fced636\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " Apr 17 
17:30:36.627315 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.627051 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"e8749fd2-aa4c-421a-a967-fbec8fced636\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " Apr 17 17:30:36.627315 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.627078 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-metrics-client-certs\") pod \"e8749fd2-aa4c-421a-a967-fbec8fced636\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " Apr 17 17:30:36.627509 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.627465 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e8749fd2-aa4c-421a-a967-fbec8fced636-tls-assets\") pod \"e8749fd2-aa4c-421a-a967-fbec8fced636\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " Apr 17 17:30:36.627596 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.627488 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "e8749fd2-aa4c-421a-a967-fbec8fced636" (UID: "e8749fd2-aa4c-421a-a967-fbec8fced636"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:30:36.627596 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.627544 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-thanos-prometheus-http-client-file\") pod \"e8749fd2-aa4c-421a-a967-fbec8fced636\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " Apr 17 17:30:36.627596 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.627576 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-configmap-kubelet-serving-ca-bundle\") pod \"e8749fd2-aa4c-421a-a967-fbec8fced636\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " Apr 17 17:30:36.627741 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.627608 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-prometheus-trusted-ca-bundle\") pod \"e8749fd2-aa4c-421a-a967-fbec8fced636\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " Apr 17 17:30:36.627741 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.627663 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-prometheus-k8s-tls\") pod \"e8749fd2-aa4c-421a-a967-fbec8fced636\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " Apr 17 17:30:36.627741 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.627693 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-kube-rbac-proxy\") pod \"e8749fd2-aa4c-421a-a967-fbec8fced636\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " Apr 17 17:30:36.627893 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.627747 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-grpc-tls\") pod \"e8749fd2-aa4c-421a-a967-fbec8fced636\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " Apr 17 17:30:36.627893 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.627796 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-configmap-serving-certs-ca-bundle\") pod \"e8749fd2-aa4c-421a-a967-fbec8fced636\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " Apr 17 17:30:36.627893 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.627820 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-prometheus-k8s-rulefiles-0\") pod \"e8749fd2-aa4c-421a-a967-fbec8fced636\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " Apr 17 17:30:36.627893 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.627855 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-web-config\") pod \"e8749fd2-aa4c-421a-a967-fbec8fced636\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " Apr 17 17:30:36.627893 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.627879 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn7qs\" (UniqueName: \"kubernetes.io/projected/e8749fd2-aa4c-421a-a967-fbec8fced636-kube-api-access-xn7qs\") pod \"e8749fd2-aa4c-421a-a967-fbec8fced636\" (UID: \"e8749fd2-aa4c-421a-a967-fbec8fced636\") " Apr 17 17:30:36.628135 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.628085 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8749fd2-aa4c-421a-a967-fbec8fced636-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "e8749fd2-aa4c-421a-a967-fbec8fced636" (UID: "e8749fd2-aa4c-421a-a967-fbec8fced636"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:30:36.628135 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.628115 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-configmap-metrics-client-ca\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 17 17:30:36.628932 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.628600 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "e8749fd2-aa4c-421a-a967-fbec8fced636" (UID: "e8749fd2-aa4c-421a-a967-fbec8fced636"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:30:36.628932 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.628815 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "e8749fd2-aa4c-421a-a967-fbec8fced636" (UID: "e8749fd2-aa4c-421a-a967-fbec8fced636"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:30:36.629288 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.629244 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "e8749fd2-aa4c-421a-a967-fbec8fced636" (UID: "e8749fd2-aa4c-421a-a967-fbec8fced636"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:30:36.630147 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.630119 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "e8749fd2-aa4c-421a-a967-fbec8fced636" (UID: "e8749fd2-aa4c-421a-a967-fbec8fced636"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:30:36.630397 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.630371 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8749fd2-aa4c-421a-a967-fbec8fced636-config-out" (OuterVolumeSpecName: "config-out") pod "e8749fd2-aa4c-421a-a967-fbec8fced636" (UID: "e8749fd2-aa4c-421a-a967-fbec8fced636"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:30:36.630519 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.630368 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "e8749fd2-aa4c-421a-a967-fbec8fced636" (UID: "e8749fd2-aa4c-421a-a967-fbec8fced636"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:30:36.630519 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.630371 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8749fd2-aa4c-421a-a967-fbec8fced636-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "e8749fd2-aa4c-421a-a967-fbec8fced636" (UID: "e8749fd2-aa4c-421a-a967-fbec8fced636"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:30:36.630909 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.630884 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "e8749fd2-aa4c-421a-a967-fbec8fced636" (UID: "e8749fd2-aa4c-421a-a967-fbec8fced636"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:30:36.631242 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.631186 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "e8749fd2-aa4c-421a-a967-fbec8fced636" (UID: "e8749fd2-aa4c-421a-a967-fbec8fced636"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:30:36.631242 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.631209 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-config" (OuterVolumeSpecName: "config") pod "e8749fd2-aa4c-421a-a967-fbec8fced636" (UID: "e8749fd2-aa4c-421a-a967-fbec8fced636"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:30:36.631242 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.631226 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "e8749fd2-aa4c-421a-a967-fbec8fced636" (UID: "e8749fd2-aa4c-421a-a967-fbec8fced636"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:30:36.631889 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.631866 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "e8749fd2-aa4c-421a-a967-fbec8fced636" (UID: "e8749fd2-aa4c-421a-a967-fbec8fced636"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:30:36.631971 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.631958 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "e8749fd2-aa4c-421a-a967-fbec8fced636" (UID: "e8749fd2-aa4c-421a-a967-fbec8fced636"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:30:36.632124 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.632101 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8749fd2-aa4c-421a-a967-fbec8fced636-kube-api-access-xn7qs" (OuterVolumeSpecName: "kube-api-access-xn7qs") pod "e8749fd2-aa4c-421a-a967-fbec8fced636" (UID: "e8749fd2-aa4c-421a-a967-fbec8fced636"). InnerVolumeSpecName "kube-api-access-xn7qs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:30:36.632209 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.632175 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "e8749fd2-aa4c-421a-a967-fbec8fced636" (UID: "e8749fd2-aa4c-421a-a967-fbec8fced636"). InnerVolumeSpecName "secret-prometheus-k8s-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:30:36.642287 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.642257 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-web-config" (OuterVolumeSpecName: "web-config") pod "e8749fd2-aa4c-421a-a967-fbec8fced636" (UID: "e8749fd2-aa4c-421a-a967-fbec8fced636"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:30:36.729164 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.729135 2577 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-web-config\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 17 17:30:36.729164 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.729165 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xn7qs\" (UniqueName: \"kubernetes.io/projected/e8749fd2-aa4c-421a-a967-fbec8fced636-kube-api-access-xn7qs\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 17 17:30:36.729316 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.729177 2577 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-config\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 17 17:30:36.729316 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.729189 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e8749fd2-aa4c-421a-a967-fbec8fced636-prometheus-k8s-db\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 17 17:30:36.729316 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.729199 2577 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e8749fd2-aa4c-421a-a967-fbec8fced636-config-out\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 17 17:30:36.729316 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.729209 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 17 17:30:36.729316 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.729218 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 17 17:30:36.729316 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.729228 2577 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-metrics-client-certs\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 17 17:30:36.729316 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.729236 2577 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e8749fd2-aa4c-421a-a967-fbec8fced636-tls-assets\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 17 17:30:36.729316 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.729245 2577 reconciler_common.go:299] "Volume detached for volume 
\"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-thanos-prometheus-http-client-file\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 17 17:30:36.729316 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.729255 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 17 17:30:36.729316 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.729263 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-prometheus-trusted-ca-bundle\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 17 17:30:36.729316 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.729272 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-prometheus-k8s-tls\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 17 17:30:36.729316 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.729282 2577 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-kube-rbac-proxy\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 17 17:30:36.729316 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.729290 2577 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e8749fd2-aa4c-421a-a967-fbec8fced636-secret-grpc-tls\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 17 17:30:36.729316 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.729298 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 17 17:30:36.729316 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.729307 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e8749fd2-aa4c-421a-a967-fbec8fced636-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 17 17:30:36.756527 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.756494 2577 generic.go:358] "Generic (PLEG): container finished" podID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerID="9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9" exitCode=0 Apr 17 17:30:36.756527 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.756518 2577 generic.go:358] "Generic (PLEG): container finished" podID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerID="9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8" exitCode=0 Apr 17 17:30:36.756527 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.756525 2577 generic.go:358] "Generic (PLEG): container finished" podID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerID="fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39" exitCode=0 Apr 17 17:30:36.756527 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.756533 2577 generic.go:358] "Generic (PLEG): container finished" podID="e8749fd2-aa4c-421a-a967-fbec8fced636" 
containerID="efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c" exitCode=0 Apr 17 17:30:36.757052 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.756539 2577 generic.go:358] "Generic (PLEG): container finished" podID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerID="26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8" exitCode=0 Apr 17 17:30:36.757052 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.756546 2577 generic.go:358] "Generic (PLEG): container finished" podID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerID="9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de" exitCode=0 Apr 17 17:30:36.757052 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.756635 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8749fd2-aa4c-421a-a967-fbec8fced636","Type":"ContainerDied","Data":"9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9"} Apr 17 17:30:36.757052 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.756657 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.757052 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.756675 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8749fd2-aa4c-421a-a967-fbec8fced636","Type":"ContainerDied","Data":"9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8"} Apr 17 17:30:36.757052 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.756689 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8749fd2-aa4c-421a-a967-fbec8fced636","Type":"ContainerDied","Data":"fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39"} Apr 17 17:30:36.757052 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.756702 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8749fd2-aa4c-421a-a967-fbec8fced636","Type":"ContainerDied","Data":"efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c"} Apr 17 17:30:36.757052 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.756716 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8749fd2-aa4c-421a-a967-fbec8fced636","Type":"ContainerDied","Data":"26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8"} Apr 17 17:30:36.757052 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.756730 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8749fd2-aa4c-421a-a967-fbec8fced636","Type":"ContainerDied","Data":"9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de"} Apr 17 17:30:36.757052 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.756739 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e8749fd2-aa4c-421a-a967-fbec8fced636","Type":"ContainerDied","Data":"25a28495b791413968a954de35c72d8c022c85904bb37dfd2334a7cc639b608b"} Apr 17 17:30:36.757052 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.756753 2577 scope.go:117] "RemoveContainer" containerID="9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9" Apr 17 17:30:36.764935 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.764910 2577 scope.go:117] "RemoveContainer" containerID="9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8" Apr 17 
17:30:36.772257 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.772238 2577 scope.go:117] "RemoveContainer" containerID="fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39" Apr 17 17:30:36.779087 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.779064 2577 scope.go:117] "RemoveContainer" containerID="efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c" Apr 17 17:30:36.781888 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.781863 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:30:36.785545 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.785517 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:30:36.787163 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.787146 2577 scope.go:117] "RemoveContainer" containerID="26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8" Apr 17 17:30:36.794374 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.794352 2577 scope.go:117] "RemoveContainer" containerID="9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de" Apr 17 17:30:36.801542 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.801520 2577 scope.go:117] "RemoveContainer" containerID="9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187" Apr 17 17:30:36.808252 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.808232 2577 scope.go:117] "RemoveContainer" containerID="9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9" Apr 17 17:30:36.808555 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:30:36.808535 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9\": container with ID starting with 9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9 not found: ID does not exist" containerID="9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9" Apr 17 17:30:36.808628 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.808569 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9"} err="failed to get container status \"9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9\": rpc error: code = NotFound desc = could not find container \"9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9\": container with ID starting with 9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9 not found: ID does not exist" Apr 17 17:30:36.808628 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.808619 2577 scope.go:117] "RemoveContainer" containerID="9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8" Apr 17 17:30:36.808891 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:30:36.808874 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8\": container with ID starting with 9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8 not found: ID does not exist" containerID="9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8" Apr 17 17:30:36.808933 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.808900 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8"} err="failed to get container status \"9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8\": rpc error: code = NotFound desc = could not find container \"9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8\": container with ID starting with 9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8 not found: ID does not exist" Apr 17 17:30:36.808933 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.808918 2577 scope.go:117] "RemoveContainer" containerID="fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39" Apr 17 17:30:36.809171 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:30:36.809142 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39\": container with ID starting with fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39 not found: ID does not exist" containerID="fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39" Apr 17 17:30:36.809220 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.809166 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39"} err="failed to get container status \"fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39\": rpc error: code = NotFound desc = could not find container \"fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39\": container with ID starting with fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39 not found: ID does not exist" Apr 17 17:30:36.809220 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.809181 2577 scope.go:117] "RemoveContainer" containerID="efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c" Apr 17 17:30:36.809367 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:30:36.809352 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c\": container with ID starting with efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c not found: ID does not exist" containerID="efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c" Apr 17 17:30:36.809424 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.809370 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c"} err="failed to get container status \"efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c\": rpc error: code = NotFound desc = could not find container \"efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c\": container with ID starting with efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c not found: ID does not exist" Apr 17 17:30:36.809424 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.809382 2577 scope.go:117] "RemoveContainer" containerID="26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8" Apr 17 17:30:36.809633 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:30:36.809616 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8\": container with ID starting with 
26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8 not found: ID does not exist" containerID="26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8" Apr 17 17:30:36.809678 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.809636 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8"} err="failed to get container status \"26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8\": rpc error: code = NotFound desc = could not find container \"26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8\": container with ID starting with 26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8 not found: ID does not exist" Apr 17 17:30:36.809678 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.809649 2577 scope.go:117] "RemoveContainer" containerID="9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de" Apr 17 17:30:36.809872 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:30:36.809858 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de\": container with ID starting with 9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de not found: ID does not exist" containerID="9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de" Apr 17 17:30:36.809910 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.809876 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de"} err="failed to get container status \"9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de\": rpc error: code = NotFound desc = could not find container \"9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de\": container with ID starting with 9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de not found: ID does not exist" Apr 17 17:30:36.809910 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.809893 2577 scope.go:117] "RemoveContainer" containerID="9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187" Apr 17 17:30:36.810141 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:30:36.810120 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187\": container with ID starting with 9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187 not found: ID does not exist" containerID="9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187" Apr 17 17:30:36.810197 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.810146 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187"} err="failed to get container status \"9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187\": rpc error: code = NotFound desc = could not find container \"9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187\": container with ID starting with 9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187 not found: ID does not exist" Apr 17 17:30:36.810197 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.810170 2577 scope.go:117] "RemoveContainer" 
containerID="9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9" Apr 17 17:30:36.810351 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.810334 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9"} err="failed to get container status \"9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9\": rpc error: code = NotFound desc = could not find container \"9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9\": container with ID starting with 9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9 not found: ID does not exist" Apr 17 17:30:36.810391 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.810351 2577 scope.go:117] "RemoveContainer" containerID="9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8" Apr 17 17:30:36.810592 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.810560 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8"} err="failed to get container status \"9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8\": rpc error: code = NotFound desc = could not find container \"9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8\": container with ID starting with 9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8 not found: ID does not exist" Apr 17 17:30:36.810592 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.810592 2577 scope.go:117] "RemoveContainer" containerID="fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39" Apr 17 17:30:36.810740 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.810724 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39"} err="failed to get container status \"fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39\": rpc error: code = NotFound desc = could not find container \"fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39\": container with ID starting with fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39 not found: ID does not exist" Apr 17 17:30:36.810788 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.810740 2577 scope.go:117] "RemoveContainer" containerID="efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c" Apr 17 17:30:36.810969 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.810940 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c"} err="failed to get container status \"efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c\": rpc error: code = NotFound desc = could not find container \"efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c\": container with ID starting with efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c not found: ID does not exist" Apr 17 17:30:36.811027 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.810969 2577 scope.go:117] "RemoveContainer" containerID="26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8" Apr 17 17:30:36.811169 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.811153 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8"} err="failed to get container status \"26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8\": rpc error: code = NotFound desc = could not find container \"26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8\": container with ID starting with 26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8 not found: ID does not exist" Apr 17 17:30:36.811218 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.811170 2577 scope.go:117] "RemoveContainer" containerID="9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de" Apr 17 17:30:36.811355 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.811340 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de"} err="failed to get container status \"9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de\": rpc error: code = NotFound desc = could not find container \"9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de\": container with ID starting with 9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de not found: ID does not exist" Apr 17 17:30:36.811402 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.811355 2577 scope.go:117] "RemoveContainer" containerID="9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187" Apr 17 17:30:36.811523 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.811508 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187"} err="failed to get container status \"9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187\": rpc error: code = NotFound desc = could not find container \"9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187\": container with ID starting with 9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187 not found: ID does not exist" Apr 17 17:30:36.811576 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.811524 2577 scope.go:117] "RemoveContainer" containerID="9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9" Apr 17 17:30:36.811707 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.811691 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9"} err="failed to get container status \"9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9\": rpc error: code = NotFound desc = could not find container \"9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9\": container with ID starting with 9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9 not found: ID does not exist" Apr 17 17:30:36.811746 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.811707 2577 scope.go:117] "RemoveContainer" containerID="9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8" Apr 17 17:30:36.811891 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.811875 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8"} err="failed to get container status \"9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8\": rpc error: code = NotFound desc = could not find container 
\"9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8\": container with ID starting with 9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8 not found: ID does not exist" Apr 17 17:30:36.811934 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.811891 2577 scope.go:117] "RemoveContainer" containerID="fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39" Apr 17 17:30:36.812067 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.812053 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39"} err="failed to get container status \"fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39\": rpc error: code = NotFound desc = could not find container \"fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39\": container with ID starting with fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39 not found: ID does not exist" Apr 17 17:30:36.812111 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.812068 2577 scope.go:117] "RemoveContainer" containerID="efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c" Apr 17 17:30:36.812260 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.812245 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c"} err="failed to get container status \"efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c\": rpc error: code = NotFound desc = could not find container \"efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c\": container with ID starting with efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c not found: ID does not exist" Apr 17 17:30:36.812260 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.812259 2577 scope.go:117] "RemoveContainer" containerID="26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8" Apr 17 17:30:36.812444 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.812428 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8"} err="failed to get container status \"26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8\": rpc error: code = NotFound desc = could not find container \"26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8\": container with ID starting with 26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8 not found: ID does not exist" Apr 17 17:30:36.812502 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.812445 2577 scope.go:117] "RemoveContainer" containerID="9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de" Apr 17 17:30:36.812634 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.812612 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de"} err="failed to get container status \"9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de\": rpc error: code = NotFound desc = could not find container \"9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de\": container with ID starting with 9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de not found: ID does not exist" Apr 17 17:30:36.812701 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.812636 2577 scope.go:117] "RemoveContainer" 
containerID="9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187" Apr 17 17:30:36.812854 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.812839 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187"} err="failed to get container status \"9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187\": rpc error: code = NotFound desc = could not find container \"9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187\": container with ID starting with 9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187 not found: ID does not exist" Apr 17 17:30:36.812907 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.812854 2577 scope.go:117] "RemoveContainer" containerID="9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9" Apr 17 17:30:36.813051 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.813034 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9"} err="failed to get container status \"9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9\": rpc error: code = NotFound desc = could not find container \"9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9\": container with ID starting with 9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9 not found: ID does not exist" Apr 17 17:30:36.813092 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.813051 2577 scope.go:117] "RemoveContainer" containerID="9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8" Apr 17 17:30:36.813244 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.813229 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8"} err="failed to get container status \"9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8\": rpc error: code = NotFound desc = could not find container \"9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8\": container with ID starting with 9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8 not found: ID does not exist" Apr 17 17:30:36.813244 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.813244 2577 scope.go:117] "RemoveContainer" containerID="fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39" Apr 17 17:30:36.813450 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.813435 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39"} err="failed to get container status \"fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39\": rpc error: code = NotFound desc = could not find container \"fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39\": container with ID starting with fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39 not found: ID does not exist" Apr 17 17:30:36.813512 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.813450 2577 scope.go:117] "RemoveContainer" containerID="efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c" Apr 17 17:30:36.813692 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.813676 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c"} err="failed to get container status \"efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c\": rpc error: code = NotFound desc = could not find container \"efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c\": container with ID starting with efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c not found: ID does not exist" Apr 17 17:30:36.813747 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.813693 2577 scope.go:117] "RemoveContainer" containerID="26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8" Apr 17 17:30:36.813858 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.813843 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8"} err="failed to get container status \"26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8\": rpc error: code = NotFound desc = could not find container \"26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8\": container with ID starting with 26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8 not found: ID does not exist" Apr 17 17:30:36.813901 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.813858 2577 scope.go:117] "RemoveContainer" containerID="9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de" Apr 17 17:30:36.814050 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.814033 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de"} err="failed to get container status \"9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de\": rpc error: code = NotFound desc = could not find container \"9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de\": container with ID starting with 9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de not found: ID does not exist" Apr 17 17:30:36.814050 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.814049 2577 scope.go:117] "RemoveContainer" containerID="9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187" Apr 17 17:30:36.814217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.814202 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187"} err="failed to get container status \"9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187\": rpc error: code = NotFound desc = could not find container \"9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187\": container with ID starting with 9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187 not found: ID does not exist" Apr 17 17:30:36.814258 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.814217 2577 scope.go:117] "RemoveContainer" containerID="9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9" Apr 17 17:30:36.814381 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.814360 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9"} err="failed to get container status \"9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9\": rpc error: code = NotFound desc = could not find container 
\"9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9\": container with ID starting with 9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9 not found: ID does not exist" Apr 17 17:30:36.814426 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.814380 2577 scope.go:117] "RemoveContainer" containerID="9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8" Apr 17 17:30:36.814689 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.814663 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8"} err="failed to get container status \"9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8\": rpc error: code = NotFound desc = could not find container \"9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8\": container with ID starting with 9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8 not found: ID does not exist" Apr 17 17:30:36.814758 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.814690 2577 scope.go:117] "RemoveContainer" containerID="fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39" Apr 17 17:30:36.815019 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.814993 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39"} err="failed to get container status \"fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39\": rpc error: code = NotFound desc = could not find container \"fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39\": container with ID starting with fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39 not found: ID does not exist" Apr 17 17:30:36.815019 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.815020 2577 scope.go:117] "RemoveContainer" containerID="efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c" Apr 17 17:30:36.815264 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.815240 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c"} err="failed to get container status \"efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c\": rpc error: code = NotFound desc = could not find container \"efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c\": container with ID starting with efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c not found: ID does not exist" Apr 17 17:30:36.815343 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.815265 2577 scope.go:117] "RemoveContainer" containerID="26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8" Apr 17 17:30:36.815754 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.815728 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8"} err="failed to get container status \"26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8\": rpc error: code = NotFound desc = could not find container \"26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8\": container with ID starting with 26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8 not found: ID does not exist" Apr 17 17:30:36.815754 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.815755 2577 scope.go:117] "RemoveContainer" 
containerID="9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de" Apr 17 17:30:36.816021 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.815989 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de"} err="failed to get container status \"9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de\": rpc error: code = NotFound desc = could not find container \"9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de\": container with ID starting with 9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de not found: ID does not exist" Apr 17 17:30:36.816021 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.816011 2577 scope.go:117] "RemoveContainer" containerID="9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187" Apr 17 17:30:36.816312 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.816292 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187"} err="failed to get container status \"9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187\": rpc error: code = NotFound desc = could not find container \"9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187\": container with ID starting with 9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187 not found: ID does not exist" Apr 17 17:30:36.816312 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.816312 2577 scope.go:117] "RemoveContainer" containerID="9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9" Apr 17 17:30:36.816601 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.816584 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:30:36.816601 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.816581 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9"} err="failed to get container status \"9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9\": rpc error: code = NotFound desc = could not find container \"9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9\": container with ID starting with 9f0d34756856a983edb79c078eb3f4273a92aa7fc89f1fee26a6dd3f136d55b9 not found: ID does not exist" Apr 17 17:30:36.816700 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.816608 2577 scope.go:117] "RemoveContainer" containerID="9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8" Apr 17 17:30:36.816859 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.816840 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8"} err="failed to get container status \"9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8\": rpc error: code = NotFound desc = could not find container \"9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8\": container with ID starting with 9fac43eec2ee42ceea92704561311a2a34a962be9236d41cd615acfd81c6bac8 not found: ID does not exist" Apr 17 17:30:36.816906 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.816869 2577 scope.go:117] "RemoveContainer" containerID="fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39" Apr 17 17:30:36.816945 ip-10-0-139-96 kubenswrapper[2577]: I0417 
17:30:36.816922 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerName="prometheus" Apr 17 17:30:36.816945 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.816935 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerName="prometheus" Apr 17 17:30:36.817015 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.816945 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerName="thanos-sidecar" Apr 17 17:30:36.817015 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.816954 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerName="thanos-sidecar" Apr 17 17:30:36.817015 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.816962 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerName="kube-rbac-proxy" Apr 17 17:30:36.817015 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.816971 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerName="kube-rbac-proxy" Apr 17 17:30:36.817015 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.816982 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerName="init-config-reloader" Apr 17 17:30:36.817015 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.816991 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerName="init-config-reloader" Apr 17 17:30:36.817015 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.816999 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerName="kube-rbac-proxy-web" Apr 17 17:30:36.817015 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.817007 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerName="kube-rbac-proxy-web" Apr 17 17:30:36.817307 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.817023 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerName="kube-rbac-proxy-thanos" Apr 17 17:30:36.817307 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.817031 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerName="kube-rbac-proxy-thanos" Apr 17 17:30:36.817307 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.817040 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerName="config-reloader" Apr 17 17:30:36.817307 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.817047 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerName="config-reloader" Apr 17 17:30:36.817307 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.817055 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39"} err="failed to get container status \"fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39\": rpc error: code = NotFound desc = could not find container 
\"fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39\": container with ID starting with fa9a1c5575844c8144b588d8357ea9669bceffa45165925dda1cc58d6871af39 not found: ID does not exist" Apr 17 17:30:36.817307 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.817072 2577 scope.go:117] "RemoveContainer" containerID="efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c" Apr 17 17:30:36.817307 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.817108 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerName="kube-rbac-proxy-web" Apr 17 17:30:36.817307 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.817122 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerName="config-reloader" Apr 17 17:30:36.817307 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.817128 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerName="thanos-sidecar" Apr 17 17:30:36.817307 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.817134 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerName="prometheus" Apr 17 17:30:36.817307 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.817140 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerName="kube-rbac-proxy-thanos" Apr 17 17:30:36.817307 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.817148 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="e8749fd2-aa4c-421a-a967-fbec8fced636" containerName="kube-rbac-proxy" Apr 17 17:30:36.817307 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.817271 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c"} err="failed to get container status \"efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c\": rpc error: code = NotFound desc = could not find container \"efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c\": container with ID starting with efd082c4ecbb35495455d9059005313ce35685a6462a8bc2e675ac5022f0d13c not found: ID does not exist" Apr 17 17:30:36.817307 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.817292 2577 scope.go:117] "RemoveContainer" containerID="26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8" Apr 17 17:30:36.817793 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.817542 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8"} err="failed to get container status \"26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8\": rpc error: code = NotFound desc = could not find container \"26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8\": container with ID starting with 26096736897e4de524d68470a8b3bf28796450c5af19743e6f1da170b693c8d8 not found: ID does not exist" Apr 17 17:30:36.817793 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.817573 2577 scope.go:117] "RemoveContainer" containerID="9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de" Apr 17 17:30:36.817793 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.817758 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de"} err="failed to get container status \"9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de\": rpc error: code = NotFound desc = could not find container \"9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de\": container with ID starting with 9d70611ad82f22e9a9606f76c0baf2de9d43a99e4a594103d422df8a986396de not found: ID does not exist" Apr 17 17:30:36.817793 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.817776 2577 scope.go:117] "RemoveContainer" containerID="9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187" Apr 17 17:30:36.818025 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.818005 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187"} err="failed to get container status \"9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187\": rpc error: code = NotFound desc = could not find container \"9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187\": container with ID starting with 9605628a61324dee99e962e9c09ad3fcaf9d025114826f92ce964dde2a076187 not found: ID does not exist" Apr 17 17:30:36.821014 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.820998 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.823570 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.823548 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 17:30:36.823672 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.823553 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 17:30:36.823791 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.823749 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 17:30:36.823791 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.823774 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 17:30:36.823947 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.823855 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 17:30:36.824008 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.823986 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-deb0qjhkiqmj1\"" Apr 17 17:30:36.824066 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.824025 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 17:30:36.824066 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.824051 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 17:30:36.824181 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.823987 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 17:30:36.824181 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.824148 2577 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 17:30:36.824296 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.824235 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-lv48b\"" Apr 17 17:30:36.824396 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.824378 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 17:30:36.824457 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.824395 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 17:30:36.826371 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.826350 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 17:30:36.829248 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.829230 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 17:30:36.829593 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.829577 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/12c31a5c-562a-4116-a291-b8c4a68e7208-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.829644 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.829601 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/12c31a5c-562a-4116-a291-b8c4a68e7208-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.829644 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.829620 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7hbl\" (UniqueName: \"kubernetes.io/projected/12c31a5c-562a-4116-a291-b8c4a68e7208-kube-api-access-v7hbl\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.829736 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.829656 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/12c31a5c-562a-4116-a291-b8c4a68e7208-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.829736 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.829679 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/12c31a5c-562a-4116-a291-b8c4a68e7208-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.829839 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.829732 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/12c31a5c-562a-4116-a291-b8c4a68e7208-config\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.829839 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.829816 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/12c31a5c-562a-4116-a291-b8c4a68e7208-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.829926 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.829849 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/12c31a5c-562a-4116-a291-b8c4a68e7208-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.829926 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.829876 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12c31a5c-562a-4116-a291-b8c4a68e7208-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.830037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.829931 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/12c31a5c-562a-4116-a291-b8c4a68e7208-web-config\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.830037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.829958 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12c31a5c-562a-4116-a291-b8c4a68e7208-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.830037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.829989 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/12c31a5c-562a-4116-a291-b8c4a68e7208-config-out\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.830037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.830007 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/12c31a5c-562a-4116-a291-b8c4a68e7208-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.830037 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.830034 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12c31a5c-562a-4116-a291-b8c4a68e7208-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.830217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.830110 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/12c31a5c-562a-4116-a291-b8c4a68e7208-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.830217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.830140 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12c31a5c-562a-4116-a291-b8c4a68e7208-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.830217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.830167 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/12c31a5c-562a-4116-a291-b8c4a68e7208-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.830217 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.830184 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/12c31a5c-562a-4116-a291-b8c4a68e7208-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.833665 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.833641 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:30:36.855796 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.855762 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8749fd2-aa4c-421a-a967-fbec8fced636" path="/var/lib/kubelet/pods/e8749fd2-aa4c-421a-a967-fbec8fced636/volumes" Apr 17 17:30:36.931177 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.931066 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/12c31a5c-562a-4116-a291-b8c4a68e7208-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.931177 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.931123 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/12c31a5c-562a-4116-a291-b8c4a68e7208-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.931177 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.931145 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/12c31a5c-562a-4116-a291-b8c4a68e7208-config\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.931177 
ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.931179 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/12c31a5c-562a-4116-a291-b8c4a68e7208-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.931575 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.931206 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/12c31a5c-562a-4116-a291-b8c4a68e7208-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.931575 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.931256 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12c31a5c-562a-4116-a291-b8c4a68e7208-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.931575 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.931302 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/12c31a5c-562a-4116-a291-b8c4a68e7208-web-config\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.931575 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.931337 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12c31a5c-562a-4116-a291-b8c4a68e7208-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.931575 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.931361 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/12c31a5c-562a-4116-a291-b8c4a68e7208-config-out\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.931575 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.931384 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/12c31a5c-562a-4116-a291-b8c4a68e7208-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.931575 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.931443 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12c31a5c-562a-4116-a291-b8c4a68e7208-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.931575 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.931516 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/12c31a5c-562a-4116-a291-b8c4a68e7208-secret-prometheus-k8s-tls\") pod 
\"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.931575 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.931546 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12c31a5c-562a-4116-a291-b8c4a68e7208-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.932001 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.931603 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/12c31a5c-562a-4116-a291-b8c4a68e7208-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.932001 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.931630 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/12c31a5c-562a-4116-a291-b8c4a68e7208-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.932001 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.931708 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/12c31a5c-562a-4116-a291-b8c4a68e7208-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.932001 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.931736 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/12c31a5c-562a-4116-a291-b8c4a68e7208-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.932001 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.931760 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v7hbl\" (UniqueName: \"kubernetes.io/projected/12c31a5c-562a-4116-a291-b8c4a68e7208-kube-api-access-v7hbl\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.932001 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.931853 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/12c31a5c-562a-4116-a291-b8c4a68e7208-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.932288 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.932110 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12c31a5c-562a-4116-a291-b8c4a68e7208-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.934507 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.934177 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/12c31a5c-562a-4116-a291-b8c4a68e7208-config-out\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.934507 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.934329 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/12c31a5c-562a-4116-a291-b8c4a68e7208-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.934507 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.934337 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/12c31a5c-562a-4116-a291-b8c4a68e7208-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.934507 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.934427 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/12c31a5c-562a-4116-a291-b8c4a68e7208-config\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.934796 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.934640 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/12c31a5c-562a-4116-a291-b8c4a68e7208-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.935488 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.934885 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/12c31a5c-562a-4116-a291-b8c4a68e7208-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.935488 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.934957 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/12c31a5c-562a-4116-a291-b8c4a68e7208-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.935488 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.935051 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12c31a5c-562a-4116-a291-b8c4a68e7208-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.935710 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.935656 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12c31a5c-562a-4116-a291-b8c4a68e7208-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.935775 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.935746 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12c31a5c-562a-4116-a291-b8c4a68e7208-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.936699 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.936670 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/12c31a5c-562a-4116-a291-b8c4a68e7208-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.936939 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.936914 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/12c31a5c-562a-4116-a291-b8c4a68e7208-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.936998 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.936938 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/12c31a5c-562a-4116-a291-b8c4a68e7208-web-config\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.936998 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.936926 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/12c31a5c-562a-4116-a291-b8c4a68e7208-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.937174 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.937157 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/12c31a5c-562a-4116-a291-b8c4a68e7208-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:36.940200 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:36.940178 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7hbl\" (UniqueName: \"kubernetes.io/projected/12c31a5c-562a-4116-a291-b8c4a68e7208-kube-api-access-v7hbl\") pod \"prometheus-k8s-0\" (UID: \"12c31a5c-562a-4116-a291-b8c4a68e7208\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:37.131408 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:37.131361 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:30:37.262798 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:37.262771 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:30:37.264994 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:30:37.264957 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12c31a5c_562a_4116_a291_b8c4a68e7208.slice/crio-0d1bc0c73b1f99db4298358f515753ee43ddb0fabdfe698df0e717c0dd7beb19 WatchSource:0}: Error finding container 0d1bc0c73b1f99db4298358f515753ee43ddb0fabdfe698df0e717c0dd7beb19: Status 404 returned error can't find the container with id 0d1bc0c73b1f99db4298358f515753ee43ddb0fabdfe698df0e717c0dd7beb19 Apr 17 17:30:37.761851 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:37.761817 2577 generic.go:358] "Generic (PLEG): container finished" podID="12c31a5c-562a-4116-a291-b8c4a68e7208" containerID="3efa34d3c7a92f153ed868af3fedba5b9b79106b35eeff7b7db13a89ce87c9bb" exitCode=0 Apr 17 17:30:37.762221 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:37.761911 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12c31a5c-562a-4116-a291-b8c4a68e7208","Type":"ContainerDied","Data":"3efa34d3c7a92f153ed868af3fedba5b9b79106b35eeff7b7db13a89ce87c9bb"} Apr 17 17:30:37.762221 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:37.761953 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12c31a5c-562a-4116-a291-b8c4a68e7208","Type":"ContainerStarted","Data":"0d1bc0c73b1f99db4298358f515753ee43ddb0fabdfe698df0e717c0dd7beb19"} Apr 17 17:30:38.768128 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:38.768086 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12c31a5c-562a-4116-a291-b8c4a68e7208","Type":"ContainerStarted","Data":"daf1955f65a9794929259cdbaf86ef8d83c38484d6a46e686e29b810509281e8"} Apr 17 17:30:38.768128 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:38.768123 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12c31a5c-562a-4116-a291-b8c4a68e7208","Type":"ContainerStarted","Data":"885eb415511e643be91dc07b9d56a38fc164488ac816b618ce88c73234c2cbe9"} Apr 17 17:30:38.768128 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:38.768137 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12c31a5c-562a-4116-a291-b8c4a68e7208","Type":"ContainerStarted","Data":"e8d7ccb7342d14429969b1f2704a3aa48bd177cc48cacb61aee3d4f79ea2a576"} Apr 17 17:30:38.768615 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:38.768145 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12c31a5c-562a-4116-a291-b8c4a68e7208","Type":"ContainerStarted","Data":"3ebc6fa2d6f4ba8bbc758ed84ec104ac938998792518421b248aff789cecc7b1"} Apr 17 17:30:38.768615 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:38.768153 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12c31a5c-562a-4116-a291-b8c4a68e7208","Type":"ContainerStarted","Data":"5aae22fff5fa23d01a5cf0bf4c2531fc291889a4f846d163cb8be83b3caf1039"} Apr 17 17:30:38.768615 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:38.768161 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12c31a5c-562a-4116-a291-b8c4a68e7208","Type":"ContainerStarted","Data":"9bfead368296b4bfbeee1c174ec5b5e5e534fb6569bd6a83bf49098b061c9c9d"} Apr 17 17:30:38.795525 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:38.795455 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.795439686 podStartE2EDuration="2.795439686s" podCreationTimestamp="2026-04-17 17:30:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:30:38.794748646 +0000 UTC m=+256.516691645" watchObservedRunningTime="2026-04-17 17:30:38.795439686 +0000 UTC m=+256.517382683" Apr 17 17:30:42.131990 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:30:42.131934 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:31:22.737575 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:31:22.737546 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rjgx_49a176d5-a780-4a38-b16f-90dc62742d5d/ovn-acl-logging/0.log" Apr 17 17:31:22.738141 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:31:22.737807 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rjgx_49a176d5-a780-4a38-b16f-90dc62742d5d/ovn-acl-logging/0.log" Apr 17 17:31:22.744811 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:31:22.744790 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 17:31:37.131897 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:31:37.131850 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:31:37.148040 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:31:37.148011 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:31:37.953012 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:31:37.952982 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:34:27.912269 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:27.912214 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-77fb85d776-fqdsg"] Apr 17 17:34:27.915620 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:27.915587 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-fqdsg" Apr 17 17:34:27.918415 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:27.918386 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 17:34:27.918563 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:27.918386 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 17 17:34:27.918563 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:27.918446 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 17 17:34:27.918951 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:27.918934 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 17:34:27.919072 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:27.919056 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-pdn5v\"" Apr 17 17:34:27.926058 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:27.926032 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-77fb85d776-fqdsg"] Apr 17 17:34:28.026889 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:28.026820 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxr2f\" (UniqueName: \"kubernetes.io/projected/9d5cd8cf-986d-4573-bd01-25c6a1932fd2-kube-api-access-wxr2f\") pod \"opendatahub-operator-controller-manager-77fb85d776-fqdsg\" (UID: \"9d5cd8cf-986d-4573-bd01-25c6a1932fd2\") " pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-fqdsg" Apr 17 17:34:28.026889 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:28.026897 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9d5cd8cf-986d-4573-bd01-25c6a1932fd2-webhook-cert\") pod \"opendatahub-operator-controller-manager-77fb85d776-fqdsg\" (UID: \"9d5cd8cf-986d-4573-bd01-25c6a1932fd2\") " pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-fqdsg" Apr 17 17:34:28.027145 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:28.026988 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9d5cd8cf-986d-4573-bd01-25c6a1932fd2-apiservice-cert\") pod \"opendatahub-operator-controller-manager-77fb85d776-fqdsg\" (UID: \"9d5cd8cf-986d-4573-bd01-25c6a1932fd2\") " pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-fqdsg" Apr 17 17:34:28.127998 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:28.127957 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxr2f\" (UniqueName: \"kubernetes.io/projected/9d5cd8cf-986d-4573-bd01-25c6a1932fd2-kube-api-access-wxr2f\") pod \"opendatahub-operator-controller-manager-77fb85d776-fqdsg\" (UID: \"9d5cd8cf-986d-4573-bd01-25c6a1932fd2\") " pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-fqdsg" Apr 17 17:34:28.128207 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:28.128008 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/9d5cd8cf-986d-4573-bd01-25c6a1932fd2-webhook-cert\") pod \"opendatahub-operator-controller-manager-77fb85d776-fqdsg\" (UID: \"9d5cd8cf-986d-4573-bd01-25c6a1932fd2\") " pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-fqdsg" Apr 17 17:34:28.128207 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:28.128059 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9d5cd8cf-986d-4573-bd01-25c6a1932fd2-apiservice-cert\") pod \"opendatahub-operator-controller-manager-77fb85d776-fqdsg\" (UID: \"9d5cd8cf-986d-4573-bd01-25c6a1932fd2\") " pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-fqdsg" Apr 17 17:34:28.130670 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:28.130639 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9d5cd8cf-986d-4573-bd01-25c6a1932fd2-apiservice-cert\") pod \"opendatahub-operator-controller-manager-77fb85d776-fqdsg\" (UID: \"9d5cd8cf-986d-4573-bd01-25c6a1932fd2\") " pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-fqdsg" Apr 17 17:34:28.130788 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:28.130766 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9d5cd8cf-986d-4573-bd01-25c6a1932fd2-webhook-cert\") pod \"opendatahub-operator-controller-manager-77fb85d776-fqdsg\" (UID: \"9d5cd8cf-986d-4573-bd01-25c6a1932fd2\") " pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-fqdsg" Apr 17 17:34:28.144832 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:28.144800 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxr2f\" (UniqueName: \"kubernetes.io/projected/9d5cd8cf-986d-4573-bd01-25c6a1932fd2-kube-api-access-wxr2f\") pod \"opendatahub-operator-controller-manager-77fb85d776-fqdsg\" (UID: \"9d5cd8cf-986d-4573-bd01-25c6a1932fd2\") " pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-fqdsg" Apr 17 17:34:28.227995 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:28.227944 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-fqdsg" Apr 17 17:34:28.363216 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:28.363169 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-77fb85d776-fqdsg"] Apr 17 17:34:28.367035 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:34:28.366997 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d5cd8cf_986d_4573_bd01_25c6a1932fd2.slice/crio-3280dd8eb09d44a30775e32b8ce6a0e4146b8d6aa78ed7a4eab9672dcc32bcce WatchSource:0}: Error finding container 3280dd8eb09d44a30775e32b8ce6a0e4146b8d6aa78ed7a4eab9672dcc32bcce: Status 404 returned error can't find the container with id 3280dd8eb09d44a30775e32b8ce6a0e4146b8d6aa78ed7a4eab9672dcc32bcce Apr 17 17:34:28.368559 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:28.368543 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:34:28.399026 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:28.398956 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-fqdsg" event={"ID":"9d5cd8cf-986d-4573-bd01-25c6a1932fd2","Type":"ContainerStarted","Data":"3280dd8eb09d44a30775e32b8ce6a0e4146b8d6aa78ed7a4eab9672dcc32bcce"} Apr 17 17:34:31.409684 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:31.409642 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-fqdsg" event={"ID":"9d5cd8cf-986d-4573-bd01-25c6a1932fd2","Type":"ContainerStarted","Data":"78f14668c724e2513907923ba4656aa015a9912a9434dfd42fa8480e5393b6d1"} Apr 17 17:34:31.410087 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:31.409745 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-fqdsg" Apr 17 17:34:31.441500 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:31.441434 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-fqdsg" podStartSLOduration=2.025252643 podStartE2EDuration="4.441418725s" podCreationTimestamp="2026-04-17 17:34:27 +0000 UTC" firstStartedPulling="2026-04-17 17:34:28.368668494 +0000 UTC m=+486.090611470" lastFinishedPulling="2026-04-17 17:34:30.784834574 +0000 UTC m=+488.506777552" observedRunningTime="2026-04-17 17:34:31.440285037 +0000 UTC m=+489.162228035" watchObservedRunningTime="2026-04-17 17:34:31.441418725 +0000 UTC m=+489.163361722" Apr 17 17:34:42.415458 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:42.415427 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-fqdsg" Apr 17 17:34:46.177981 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:46.177942 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-598f578945-tdhf2"] Apr 17 17:34:46.181053 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:46.181033 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-598f578945-tdhf2" Apr 17 17:34:46.183387 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:46.183363 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 17 17:34:46.184118 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:46.184097 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 17 17:34:46.184182 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:46.184130 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-k5jkm\"" Apr 17 17:34:46.191760 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:46.191733 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-598f578945-tdhf2"] Apr 17 17:34:46.281712 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:46.281663 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b61e46df-1ee4-4a82-bc7d-f1596bd447d5-tls-certs\") pod \"kube-auth-proxy-598f578945-tdhf2\" (UID: \"b61e46df-1ee4-4a82-bc7d-f1596bd447d5\") " pod="openshift-ingress/kube-auth-proxy-598f578945-tdhf2" Apr 17 17:34:46.281902 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:46.281745 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b61e46df-1ee4-4a82-bc7d-f1596bd447d5-tmp\") pod \"kube-auth-proxy-598f578945-tdhf2\" (UID: \"b61e46df-1ee4-4a82-bc7d-f1596bd447d5\") " pod="openshift-ingress/kube-auth-proxy-598f578945-tdhf2" Apr 17 17:34:46.281902 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:46.281774 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfgz5\" (UniqueName: \"kubernetes.io/projected/b61e46df-1ee4-4a82-bc7d-f1596bd447d5-kube-api-access-mfgz5\") pod \"kube-auth-proxy-598f578945-tdhf2\" (UID: \"b61e46df-1ee4-4a82-bc7d-f1596bd447d5\") " pod="openshift-ingress/kube-auth-proxy-598f578945-tdhf2" Apr 17 17:34:46.382322 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:46.382287 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b61e46df-1ee4-4a82-bc7d-f1596bd447d5-tls-certs\") pod \"kube-auth-proxy-598f578945-tdhf2\" (UID: \"b61e46df-1ee4-4a82-bc7d-f1596bd447d5\") " pod="openshift-ingress/kube-auth-proxy-598f578945-tdhf2" Apr 17 17:34:46.382549 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:46.382352 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b61e46df-1ee4-4a82-bc7d-f1596bd447d5-tmp\") pod \"kube-auth-proxy-598f578945-tdhf2\" (UID: \"b61e46df-1ee4-4a82-bc7d-f1596bd447d5\") " pod="openshift-ingress/kube-auth-proxy-598f578945-tdhf2" Apr 17 17:34:46.382549 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:46.382382 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfgz5\" (UniqueName: \"kubernetes.io/projected/b61e46df-1ee4-4a82-bc7d-f1596bd447d5-kube-api-access-mfgz5\") pod \"kube-auth-proxy-598f578945-tdhf2\" (UID: \"b61e46df-1ee4-4a82-bc7d-f1596bd447d5\") " pod="openshift-ingress/kube-auth-proxy-598f578945-tdhf2" Apr 17 17:34:46.384705 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:46.384677 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b61e46df-1ee4-4a82-bc7d-f1596bd447d5-tmp\") pod \"kube-auth-proxy-598f578945-tdhf2\" (UID: \"b61e46df-1ee4-4a82-bc7d-f1596bd447d5\") " pod="openshift-ingress/kube-auth-proxy-598f578945-tdhf2" Apr 17 17:34:46.384927 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:46.384909 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b61e46df-1ee4-4a82-bc7d-f1596bd447d5-tls-certs\") pod \"kube-auth-proxy-598f578945-tdhf2\" (UID: \"b61e46df-1ee4-4a82-bc7d-f1596bd447d5\") " pod="openshift-ingress/kube-auth-proxy-598f578945-tdhf2" Apr 17 17:34:46.390458 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:46.390432 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfgz5\" (UniqueName: \"kubernetes.io/projected/b61e46df-1ee4-4a82-bc7d-f1596bd447d5-kube-api-access-mfgz5\") pod \"kube-auth-proxy-598f578945-tdhf2\" (UID: \"b61e46df-1ee4-4a82-bc7d-f1596bd447d5\") " pod="openshift-ingress/kube-auth-proxy-598f578945-tdhf2" Apr 17 17:34:46.490678 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:46.490631 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-598f578945-tdhf2" Apr 17 17:34:46.613595 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:46.613559 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-598f578945-tdhf2"] Apr 17 17:34:46.616961 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:34:46.616924 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb61e46df_1ee4_4a82_bc7d_f1596bd447d5.slice/crio-c9cdcf634ea385f188a6a384113743017b9d5ab566cc8b7d5f68da0740f36f2b WatchSource:0}: Error finding container c9cdcf634ea385f188a6a384113743017b9d5ab566cc8b7d5f68da0740f36f2b: Status 404 returned error can't find the container with id c9cdcf634ea385f188a6a384113743017b9d5ab566cc8b7d5f68da0740f36f2b Apr 17 17:34:47.454149 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:47.454104 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-598f578945-tdhf2" event={"ID":"b61e46df-1ee4-4a82-bc7d-f1596bd447d5","Type":"ContainerStarted","Data":"c9cdcf634ea385f188a6a384113743017b9d5ab566cc8b7d5f68da0740f36f2b"} Apr 17 17:34:48.347796 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:48.347757 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-dx9jc"] Apr 17 17:34:48.358410 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:48.358378 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-dx9jc" Apr 17 17:34:48.358860 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:48.358818 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-dx9jc"] Apr 17 17:34:48.360868 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:48.360811 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-45vkv\"" Apr 17 17:34:48.361499 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:48.361299 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 17 17:34:48.398005 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:48.397956 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nf89\" (UniqueName: \"kubernetes.io/projected/2fc59a50-5d8d-473f-ac53-ca09910df3f7-kube-api-access-4nf89\") pod \"odh-model-controller-858dbf95b8-dx9jc\" (UID: \"2fc59a50-5d8d-473f-ac53-ca09910df3f7\") " pod="opendatahub/odh-model-controller-858dbf95b8-dx9jc" Apr 17 17:34:48.398185 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:48.398060 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fc59a50-5d8d-473f-ac53-ca09910df3f7-cert\") pod \"odh-model-controller-858dbf95b8-dx9jc\" (UID: \"2fc59a50-5d8d-473f-ac53-ca09910df3f7\") " pod="opendatahub/odh-model-controller-858dbf95b8-dx9jc" Apr 17 17:34:48.498837 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:48.498795 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4nf89\" (UniqueName: \"kubernetes.io/projected/2fc59a50-5d8d-473f-ac53-ca09910df3f7-kube-api-access-4nf89\") pod \"odh-model-controller-858dbf95b8-dx9jc\" (UID: \"2fc59a50-5d8d-473f-ac53-ca09910df3f7\") " pod="opendatahub/odh-model-controller-858dbf95b8-dx9jc" Apr 17 17:34:48.499324 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:48.498924 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fc59a50-5d8d-473f-ac53-ca09910df3f7-cert\") pod \"odh-model-controller-858dbf95b8-dx9jc\" (UID: \"2fc59a50-5d8d-473f-ac53-ca09910df3f7\") " pod="opendatahub/odh-model-controller-858dbf95b8-dx9jc" Apr 17 17:34:48.499324 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:34:48.499060 2577 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 17 17:34:48.499324 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:34:48.499136 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fc59a50-5d8d-473f-ac53-ca09910df3f7-cert podName:2fc59a50-5d8d-473f-ac53-ca09910df3f7 nodeName:}" failed. No retries permitted until 2026-04-17 17:34:48.999111201 +0000 UTC m=+506.721054192 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2fc59a50-5d8d-473f-ac53-ca09910df3f7-cert") pod "odh-model-controller-858dbf95b8-dx9jc" (UID: "2fc59a50-5d8d-473f-ac53-ca09910df3f7") : secret "odh-model-controller-webhook-cert" not found Apr 17 17:34:48.510801 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:48.510755 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nf89\" (UniqueName: \"kubernetes.io/projected/2fc59a50-5d8d-473f-ac53-ca09910df3f7-kube-api-access-4nf89\") pod \"odh-model-controller-858dbf95b8-dx9jc\" (UID: \"2fc59a50-5d8d-473f-ac53-ca09910df3f7\") " pod="opendatahub/odh-model-controller-858dbf95b8-dx9jc" Apr 17 17:34:49.002213 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:49.002161 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fc59a50-5d8d-473f-ac53-ca09910df3f7-cert\") pod \"odh-model-controller-858dbf95b8-dx9jc\" (UID: \"2fc59a50-5d8d-473f-ac53-ca09910df3f7\") " pod="opendatahub/odh-model-controller-858dbf95b8-dx9jc" Apr 17 17:34:49.004700 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:49.004671 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fc59a50-5d8d-473f-ac53-ca09910df3f7-cert\") pod \"odh-model-controller-858dbf95b8-dx9jc\" (UID: \"2fc59a50-5d8d-473f-ac53-ca09910df3f7\") " pod="opendatahub/odh-model-controller-858dbf95b8-dx9jc" Apr 17 17:34:49.274867 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:49.274760 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-dx9jc" Apr 17 17:34:49.987708 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:49.987681 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-dx9jc"] Apr 17 17:34:49.990655 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:34:49.990623 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fc59a50_5d8d_473f_ac53_ca09910df3f7.slice/crio-262b82f548e88cda1eab4e36b759ed6a9c4dc81415892357cb58a09767bb25aa WatchSource:0}: Error finding container 262b82f548e88cda1eab4e36b759ed6a9c4dc81415892357cb58a09767bb25aa: Status 404 returned error can't find the container with id 262b82f548e88cda1eab4e36b759ed6a9c4dc81415892357cb58a09767bb25aa Apr 17 17:34:50.465385 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:50.465350 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-598f578945-tdhf2" event={"ID":"b61e46df-1ee4-4a82-bc7d-f1596bd447d5","Type":"ContainerStarted","Data":"e2d2b6fca36fbaca7568d219cfbf487d04f0d83b94f776a8a10ce27acad41c5f"} Apr 17 17:34:50.466732 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:50.466696 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-dx9jc" event={"ID":"2fc59a50-5d8d-473f-ac53-ca09910df3f7","Type":"ContainerStarted","Data":"262b82f548e88cda1eab4e36b759ed6a9c4dc81415892357cb58a09767bb25aa"} Apr 17 17:34:50.482395 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:50.482325 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-598f578945-tdhf2" podStartSLOduration=1.18247239 podStartE2EDuration="4.482304039s" podCreationTimestamp="2026-04-17 17:34:46 +0000 UTC" firstStartedPulling="2026-04-17 17:34:46.619098988 +0000 UTC 
m=+504.341041964" lastFinishedPulling="2026-04-17 17:34:49.918930635 +0000 UTC m=+507.640873613" observedRunningTime="2026-04-17 17:34:50.481676504 +0000 UTC m=+508.203619513" watchObservedRunningTime="2026-04-17 17:34:50.482304039 +0000 UTC m=+508.204247035" Apr 17 17:34:53.476714 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:53.476673 2577 generic.go:358] "Generic (PLEG): container finished" podID="2fc59a50-5d8d-473f-ac53-ca09910df3f7" containerID="8dc230a125ac6bb74403a5bc992dbae41ee5c54c9628e561e764366745ec8317" exitCode=1 Apr 17 17:34:53.477101 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:53.476734 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-dx9jc" event={"ID":"2fc59a50-5d8d-473f-ac53-ca09910df3f7","Type":"ContainerDied","Data":"8dc230a125ac6bb74403a5bc992dbae41ee5c54c9628e561e764366745ec8317"} Apr 17 17:34:53.477101 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:53.476952 2577 scope.go:117] "RemoveContainer" containerID="8dc230a125ac6bb74403a5bc992dbae41ee5c54c9628e561e764366745ec8317" Apr 17 17:34:54.447770 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:54.447735 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-dgnmg"] Apr 17 17:34:54.450933 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:54.450905 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-dgnmg" Apr 17 17:34:54.453325 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:54.453299 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 17 17:34:54.453808 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:54.453785 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-g9ccs\"" Apr 17 17:34:54.461193 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:54.461166 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-dgnmg"] Apr 17 17:34:54.481522 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:54.481492 2577 generic.go:358] "Generic (PLEG): container finished" podID="2fc59a50-5d8d-473f-ac53-ca09910df3f7" containerID="2c41db7882faa9cc2ef26393c65252e12561aa7a1a6704b3227304b76c93b276" exitCode=1 Apr 17 17:34:54.481952 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:54.481559 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-dx9jc" event={"ID":"2fc59a50-5d8d-473f-ac53-ca09910df3f7","Type":"ContainerDied","Data":"2c41db7882faa9cc2ef26393c65252e12561aa7a1a6704b3227304b76c93b276"} Apr 17 17:34:54.481952 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:54.481599 2577 scope.go:117] "RemoveContainer" containerID="8dc230a125ac6bb74403a5bc992dbae41ee5c54c9628e561e764366745ec8317" Apr 17 17:34:54.481952 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:54.481822 2577 scope.go:117] "RemoveContainer" containerID="2c41db7882faa9cc2ef26393c65252e12561aa7a1a6704b3227304b76c93b276" Apr 17 17:34:54.482072 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:34:54.482052 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-dx9jc_opendatahub(2fc59a50-5d8d-473f-ac53-ca09910df3f7)\"" pod="opendatahub/odh-model-controller-858dbf95b8-dx9jc" 
podUID="2fc59a50-5d8d-473f-ac53-ca09910df3f7" Apr 17 17:34:54.546641 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:54.546578 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b76aad08-c177-4c20-a889-5f4a1a011654-cert\") pod \"kserve-controller-manager-856948b99f-dgnmg\" (UID: \"b76aad08-c177-4c20-a889-5f4a1a011654\") " pod="opendatahub/kserve-controller-manager-856948b99f-dgnmg" Apr 17 17:34:54.546864 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:54.546678 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x89dd\" (UniqueName: \"kubernetes.io/projected/b76aad08-c177-4c20-a889-5f4a1a011654-kube-api-access-x89dd\") pod \"kserve-controller-manager-856948b99f-dgnmg\" (UID: \"b76aad08-c177-4c20-a889-5f4a1a011654\") " pod="opendatahub/kserve-controller-manager-856948b99f-dgnmg" Apr 17 17:34:54.647793 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:54.647700 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b76aad08-c177-4c20-a889-5f4a1a011654-cert\") pod \"kserve-controller-manager-856948b99f-dgnmg\" (UID: \"b76aad08-c177-4c20-a889-5f4a1a011654\") " pod="opendatahub/kserve-controller-manager-856948b99f-dgnmg" Apr 17 17:34:54.648011 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:54.647824 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x89dd\" (UniqueName: \"kubernetes.io/projected/b76aad08-c177-4c20-a889-5f4a1a011654-kube-api-access-x89dd\") pod \"kserve-controller-manager-856948b99f-dgnmg\" (UID: \"b76aad08-c177-4c20-a889-5f4a1a011654\") " pod="opendatahub/kserve-controller-manager-856948b99f-dgnmg" Apr 17 17:34:54.648011 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:34:54.647857 2577 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 17 17:34:54.648011 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:34:54.647939 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b76aad08-c177-4c20-a889-5f4a1a011654-cert podName:b76aad08-c177-4c20-a889-5f4a1a011654 nodeName:}" failed. No retries permitted until 2026-04-17 17:34:55.147916537 +0000 UTC m=+512.869859523 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b76aad08-c177-4c20-a889-5f4a1a011654-cert") pod "kserve-controller-manager-856948b99f-dgnmg" (UID: "b76aad08-c177-4c20-a889-5f4a1a011654") : secret "kserve-webhook-server-cert" not found Apr 17 17:34:54.664120 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:54.664085 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x89dd\" (UniqueName: \"kubernetes.io/projected/b76aad08-c177-4c20-a889-5f4a1a011654-kube-api-access-x89dd\") pod \"kserve-controller-manager-856948b99f-dgnmg\" (UID: \"b76aad08-c177-4c20-a889-5f4a1a011654\") " pod="opendatahub/kserve-controller-manager-856948b99f-dgnmg" Apr 17 17:34:55.151538 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:55.151496 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b76aad08-c177-4c20-a889-5f4a1a011654-cert\") pod \"kserve-controller-manager-856948b99f-dgnmg\" (UID: \"b76aad08-c177-4c20-a889-5f4a1a011654\") " pod="opendatahub/kserve-controller-manager-856948b99f-dgnmg" Apr 17 17:34:55.154012 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:55.153981 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b76aad08-c177-4c20-a889-5f4a1a011654-cert\") pod \"kserve-controller-manager-856948b99f-dgnmg\" (UID: \"b76aad08-c177-4c20-a889-5f4a1a011654\") " pod="opendatahub/kserve-controller-manager-856948b99f-dgnmg" Apr 17 17:34:55.362648 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:55.362610 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-dgnmg" Apr 17 17:34:55.483079 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:55.483054 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-dgnmg"] Apr 17 17:34:55.485409 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:34:55.485367 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb76aad08_c177_4c20_a889_5f4a1a011654.slice/crio-7fc7ddcd3e5eed1e13cef5b723c841260f9fa3ed7319962af58a73645097176d WatchSource:0}: Error finding container 7fc7ddcd3e5eed1e13cef5b723c841260f9fa3ed7319962af58a73645097176d: Status 404 returned error can't find the container with id 7fc7ddcd3e5eed1e13cef5b723c841260f9fa3ed7319962af58a73645097176d Apr 17 17:34:55.487107 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:55.487090 2577 scope.go:117] "RemoveContainer" containerID="2c41db7882faa9cc2ef26393c65252e12561aa7a1a6704b3227304b76c93b276" Apr 17 17:34:55.487289 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:34:55.487271 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-dx9jc_opendatahub(2fc59a50-5d8d-473f-ac53-ca09910df3f7)\"" pod="opendatahub/odh-model-controller-858dbf95b8-dx9jc" podUID="2fc59a50-5d8d-473f-ac53-ca09910df3f7" Apr 17 17:34:56.492091 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:56.492053 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-dgnmg" event={"ID":"b76aad08-c177-4c20-a889-5f4a1a011654","Type":"ContainerStarted","Data":"7fc7ddcd3e5eed1e13cef5b723c841260f9fa3ed7319962af58a73645097176d"} Apr 17 17:34:58.413795 ip-10-0-139-96 
kubenswrapper[2577]: I0417 17:34:58.413692 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-sk2wx"] Apr 17 17:34:58.417100 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:58.417083 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-sk2wx" Apr 17 17:34:58.422772 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:58.422743 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 17 17:34:58.422914 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:58.422778 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-q4bcs\"" Apr 17 17:34:58.422914 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:58.422887 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 17 17:34:58.447528 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:58.447491 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-sk2wx"] Apr 17 17:34:58.481659 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:58.481615 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/afac0c52-16b5-4a02-8a08-2128a88bd69c-operator-config\") pod \"servicemesh-operator3-55f49c5f94-sk2wx\" (UID: \"afac0c52-16b5-4a02-8a08-2128a88bd69c\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-sk2wx" Apr 17 17:34:58.481840 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:58.481692 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmnnn\" (UniqueName: \"kubernetes.io/projected/afac0c52-16b5-4a02-8a08-2128a88bd69c-kube-api-access-fmnnn\") pod \"servicemesh-operator3-55f49c5f94-sk2wx\" (UID: \"afac0c52-16b5-4a02-8a08-2128a88bd69c\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-sk2wx" Apr 17 17:34:58.500246 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:58.500207 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-dgnmg" event={"ID":"b76aad08-c177-4c20-a889-5f4a1a011654","Type":"ContainerStarted","Data":"7e9088833c93f7cfebe01524cd8629bdef0290381638543f32b57ae009295c02"} Apr 17 17:34:58.500428 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:58.500336 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-dgnmg" Apr 17 17:34:58.531851 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:58.531794 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-dgnmg" podStartSLOduration=2.33791325 podStartE2EDuration="4.531777452s" podCreationTimestamp="2026-04-17 17:34:54 +0000 UTC" firstStartedPulling="2026-04-17 17:34:55.486755094 +0000 UTC m=+513.208698071" lastFinishedPulling="2026-04-17 17:34:57.680619293 +0000 UTC m=+515.402562273" observedRunningTime="2026-04-17 17:34:58.530543248 +0000 UTC m=+516.252486247" watchObservedRunningTime="2026-04-17 17:34:58.531777452 +0000 UTC m=+516.253720454" Apr 17 17:34:58.582354 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:58.582311 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"operator-config\" (UniqueName: \"kubernetes.io/downward-api/afac0c52-16b5-4a02-8a08-2128a88bd69c-operator-config\") pod \"servicemesh-operator3-55f49c5f94-sk2wx\" (UID: \"afac0c52-16b5-4a02-8a08-2128a88bd69c\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-sk2wx" Apr 17 17:34:58.582547 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:58.582399 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmnnn\" (UniqueName: \"kubernetes.io/projected/afac0c52-16b5-4a02-8a08-2128a88bd69c-kube-api-access-fmnnn\") pod \"servicemesh-operator3-55f49c5f94-sk2wx\" (UID: \"afac0c52-16b5-4a02-8a08-2128a88bd69c\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-sk2wx" Apr 17 17:34:58.584961 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:58.584935 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/afac0c52-16b5-4a02-8a08-2128a88bd69c-operator-config\") pod \"servicemesh-operator3-55f49c5f94-sk2wx\" (UID: \"afac0c52-16b5-4a02-8a08-2128a88bd69c\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-sk2wx" Apr 17 17:34:58.593770 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:58.593742 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmnnn\" (UniqueName: \"kubernetes.io/projected/afac0c52-16b5-4a02-8a08-2128a88bd69c-kube-api-access-fmnnn\") pod \"servicemesh-operator3-55f49c5f94-sk2wx\" (UID: \"afac0c52-16b5-4a02-8a08-2128a88bd69c\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-sk2wx" Apr 17 17:34:58.726728 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:58.726671 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-sk2wx" Apr 17 17:34:58.855348 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:58.855327 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-sk2wx"] Apr 17 17:34:58.858239 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:34:58.858201 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafac0c52_16b5_4a02_8a08_2128a88bd69c.slice/crio-848458826b168f9e53d6cafb62f09f0dadb95de80bf4c9b4674f64b34b6b592c WatchSource:0}: Error finding container 848458826b168f9e53d6cafb62f09f0dadb95de80bf4c9b4674f64b34b6b592c: Status 404 returned error can't find the container with id 848458826b168f9e53d6cafb62f09f0dadb95de80bf4c9b4674f64b34b6b592c Apr 17 17:34:59.275348 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:59.275312 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-dx9jc" Apr 17 17:34:59.275771 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:59.275756 2577 scope.go:117] "RemoveContainer" containerID="2c41db7882faa9cc2ef26393c65252e12561aa7a1a6704b3227304b76c93b276" Apr 17 17:34:59.276001 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:34:59.275982 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-dx9jc_opendatahub(2fc59a50-5d8d-473f-ac53-ca09910df3f7)\"" pod="opendatahub/odh-model-controller-858dbf95b8-dx9jc" podUID="2fc59a50-5d8d-473f-ac53-ca09910df3f7" Apr 17 17:34:59.505197 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:34:59.505163 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-sk2wx" event={"ID":"afac0c52-16b5-4a02-8a08-2128a88bd69c","Type":"ContainerStarted","Data":"848458826b168f9e53d6cafb62f09f0dadb95de80bf4c9b4674f64b34b6b592c"} Apr 17 17:35:01.514621 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:01.514575 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-sk2wx" event={"ID":"afac0c52-16b5-4a02-8a08-2128a88bd69c","Type":"ContainerStarted","Data":"5760ccebced36b82221cca609e72f3b07954f80c0e4ad94fd675e003069a5f71"} Apr 17 17:35:01.515074 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:01.514718 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-sk2wx" Apr 17 17:35:01.540858 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:01.540792 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-sk2wx" podStartSLOduration=1.033175019 podStartE2EDuration="3.540773202s" podCreationTimestamp="2026-04-17 17:34:58 +0000 UTC" firstStartedPulling="2026-04-17 17:34:58.860564008 +0000 UTC m=+516.582506984" lastFinishedPulling="2026-04-17 17:35:01.368162187 +0000 UTC m=+519.090105167" observedRunningTime="2026-04-17 17:35:01.539839187 +0000 UTC m=+519.261782188" watchObservedRunningTime="2026-04-17 17:35:01.540773202 +0000 UTC m=+519.262716201" Apr 17 17:35:05.129297 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.129252 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl"] Apr 17 17:35:05.139071 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.139043 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl" Apr 17 17:35:05.141288 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.141260 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 17 17:35:05.141431 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.141268 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 17 17:35:05.141431 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.141270 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 17 17:35:05.141431 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.141367 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-v2487\"" Apr 17 17:35:05.141603 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.141516 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 17 17:35:05.144722 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.144696 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl"] Apr 17 17:35:05.235299 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.235265 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/123dd202-8527-42e2-84da-fd93872dfcb8-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-26jpl\" (UID: \"123dd202-8527-42e2-84da-fd93872dfcb8\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl" Apr 17 17:35:05.235299 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.235306 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/123dd202-8527-42e2-84da-fd93872dfcb8-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-26jpl\" (UID: \"123dd202-8527-42e2-84da-fd93872dfcb8\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl" Apr 17 17:35:05.235548 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.235325 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/123dd202-8527-42e2-84da-fd93872dfcb8-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-26jpl\" (UID: \"123dd202-8527-42e2-84da-fd93872dfcb8\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl" Apr 17 17:35:05.235548 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.235395 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/123dd202-8527-42e2-84da-fd93872dfcb8-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-26jpl\" (UID: \"123dd202-8527-42e2-84da-fd93872dfcb8\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl" Apr 17 17:35:05.235548 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.235503 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bksr8\" (UniqueName: \"kubernetes.io/projected/123dd202-8527-42e2-84da-fd93872dfcb8-kube-api-access-bksr8\") pod \"istiod-openshift-gateway-55ff986f96-26jpl\" (UID: 
\"123dd202-8527-42e2-84da-fd93872dfcb8\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl" Apr 17 17:35:05.235548 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.235538 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/123dd202-8527-42e2-84da-fd93872dfcb8-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-26jpl\" (UID: \"123dd202-8527-42e2-84da-fd93872dfcb8\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl" Apr 17 17:35:05.235673 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.235569 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/123dd202-8527-42e2-84da-fd93872dfcb8-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-26jpl\" (UID: \"123dd202-8527-42e2-84da-fd93872dfcb8\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl" Apr 17 17:35:05.336333 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.336247 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/123dd202-8527-42e2-84da-fd93872dfcb8-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-26jpl\" (UID: \"123dd202-8527-42e2-84da-fd93872dfcb8\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl" Apr 17 17:35:05.336333 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.336307 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/123dd202-8527-42e2-84da-fd93872dfcb8-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-26jpl\" (UID: \"123dd202-8527-42e2-84da-fd93872dfcb8\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl" Apr 17 17:35:05.336604 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.336362 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/123dd202-8527-42e2-84da-fd93872dfcb8-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-26jpl\" (UID: \"123dd202-8527-42e2-84da-fd93872dfcb8\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl" Apr 17 17:35:05.336604 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.336385 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/123dd202-8527-42e2-84da-fd93872dfcb8-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-26jpl\" (UID: \"123dd202-8527-42e2-84da-fd93872dfcb8\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl" Apr 17 17:35:05.336604 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.336457 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/123dd202-8527-42e2-84da-fd93872dfcb8-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-26jpl\" (UID: \"123dd202-8527-42e2-84da-fd93872dfcb8\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl" Apr 17 17:35:05.336604 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.336516 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bksr8\" (UniqueName: \"kubernetes.io/projected/123dd202-8527-42e2-84da-fd93872dfcb8-kube-api-access-bksr8\") pod 
\"istiod-openshift-gateway-55ff986f96-26jpl\" (UID: \"123dd202-8527-42e2-84da-fd93872dfcb8\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl" Apr 17 17:35:05.336604 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.336547 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/123dd202-8527-42e2-84da-fd93872dfcb8-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-26jpl\" (UID: \"123dd202-8527-42e2-84da-fd93872dfcb8\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl" Apr 17 17:35:05.337284 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.337254 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/123dd202-8527-42e2-84da-fd93872dfcb8-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-26jpl\" (UID: \"123dd202-8527-42e2-84da-fd93872dfcb8\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl" Apr 17 17:35:05.339014 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.338990 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/123dd202-8527-42e2-84da-fd93872dfcb8-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-26jpl\" (UID: \"123dd202-8527-42e2-84da-fd93872dfcb8\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl" Apr 17 17:35:05.339014 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.339002 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/123dd202-8527-42e2-84da-fd93872dfcb8-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-26jpl\" (UID: \"123dd202-8527-42e2-84da-fd93872dfcb8\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl" Apr 17 17:35:05.339215 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.339115 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/123dd202-8527-42e2-84da-fd93872dfcb8-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-26jpl\" (UID: \"123dd202-8527-42e2-84da-fd93872dfcb8\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl" Apr 17 17:35:05.339277 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.339212 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/123dd202-8527-42e2-84da-fd93872dfcb8-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-26jpl\" (UID: \"123dd202-8527-42e2-84da-fd93872dfcb8\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl" Apr 17 17:35:05.345268 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.345241 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/123dd202-8527-42e2-84da-fd93872dfcb8-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-26jpl\" (UID: \"123dd202-8527-42e2-84da-fd93872dfcb8\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl" Apr 17 17:35:05.345716 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.345672 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bksr8\" (UniqueName: \"kubernetes.io/projected/123dd202-8527-42e2-84da-fd93872dfcb8-kube-api-access-bksr8\") pod \"istiod-openshift-gateway-55ff986f96-26jpl\" (UID: 
\"123dd202-8527-42e2-84da-fd93872dfcb8\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl" Apr 17 17:35:05.449772 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.449725 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl" Apr 17 17:35:05.586775 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:05.586741 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl"] Apr 17 17:35:05.589166 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:35:05.589126 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod123dd202_8527_42e2_84da_fd93872dfcb8.slice/crio-7de0a663921e3ea64686139a17aa793867b88a31bf4e4d91c424307de3d64b32 WatchSource:0}: Error finding container 7de0a663921e3ea64686139a17aa793867b88a31bf4e4d91c424307de3d64b32: Status 404 returned error can't find the container with id 7de0a663921e3ea64686139a17aa793867b88a31bf4e4d91c424307de3d64b32 Apr 17 17:35:06.537499 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:06.537428 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl" event={"ID":"123dd202-8527-42e2-84da-fd93872dfcb8","Type":"ContainerStarted","Data":"7de0a663921e3ea64686139a17aa793867b88a31bf4e4d91c424307de3d64b32"} Apr 17 17:35:08.247788 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:08.247742 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 17:35:08.248176 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:08.247826 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 17:35:08.546599 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:08.546500 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl" event={"ID":"123dd202-8527-42e2-84da-fd93872dfcb8","Type":"ContainerStarted","Data":"21cb2c81c8e6b9d24b93d3203a9dcb37bff3a5d4e953c33698e90cfaea62f733"} Apr 17 17:35:08.546599 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:08.546558 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl" Apr 17 17:35:08.568023 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:08.567969 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl" podStartSLOduration=0.911711384 podStartE2EDuration="3.56795422s" podCreationTimestamp="2026-04-17 17:35:05 +0000 UTC" firstStartedPulling="2026-04-17 17:35:05.591268265 +0000 UTC m=+523.313211241" lastFinishedPulling="2026-04-17 17:35:08.247511102 +0000 UTC m=+525.969454077" observedRunningTime="2026-04-17 17:35:08.566379847 +0000 UTC m=+526.288322846" watchObservedRunningTime="2026-04-17 17:35:08.56795422 +0000 UTC m=+526.289897218" Apr 17 17:35:09.275631 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:09.275594 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-dx9jc" Apr 17 17:35:09.276084 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:09.276067 2577 scope.go:117] "RemoveContainer" 
containerID="2c41db7882faa9cc2ef26393c65252e12561aa7a1a6704b3227304b76c93b276" Apr 17 17:35:09.553834 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:09.553801 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-26jpl" Apr 17 17:35:10.557001 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:10.556960 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-dx9jc" event={"ID":"2fc59a50-5d8d-473f-ac53-ca09910df3f7","Type":"ContainerStarted","Data":"28bee3a2fd9f6cc59fd4528de493b59b157ba496ef8953770fa5cabdc900d5f3"} Apr 17 17:35:10.557382 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:10.557306 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-dx9jc" Apr 17 17:35:10.575110 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:10.575039 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-dx9jc" podStartSLOduration=3.025033017 podStartE2EDuration="22.575019669s" podCreationTimestamp="2026-04-17 17:34:48 +0000 UTC" firstStartedPulling="2026-04-17 17:34:49.991941929 +0000 UTC m=+507.713884905" lastFinishedPulling="2026-04-17 17:35:09.541928578 +0000 UTC m=+527.263871557" observedRunningTime="2026-04-17 17:35:10.573996455 +0000 UTC m=+528.295939453" watchObservedRunningTime="2026-04-17 17:35:10.575019669 +0000 UTC m=+528.296962668" Apr 17 17:35:12.519858 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:12.519831 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-sk2wx" Apr 17 17:35:21.563574 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:21.563537 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-dx9jc" Apr 17 17:35:29.510385 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:35:29.510348 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-dgnmg" Apr 17 17:36:21.557988 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:21.557947 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-bwhzb"] Apr 17 17:36:21.560374 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:21.560349 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-bwhzb" Apr 17 17:36:21.562960 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:21.562936 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 17:36:21.563676 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:21.563656 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-7zjvq\"" Apr 17 17:36:21.563791 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:21.563692 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 17:36:21.571335 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:21.571300 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-bwhzb"] Apr 17 17:36:21.697972 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:21.697916 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78bct\" (UniqueName: \"kubernetes.io/projected/e3cecd69-f99b-4b43-98d1-d0c4e9373192-kube-api-access-78bct\") pod \"authorino-operator-657f44b778-bwhzb\" (UID: \"e3cecd69-f99b-4b43-98d1-d0c4e9373192\") " pod="kuadrant-system/authorino-operator-657f44b778-bwhzb" Apr 17 17:36:21.799342 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:21.799290 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78bct\" (UniqueName: \"kubernetes.io/projected/e3cecd69-f99b-4b43-98d1-d0c4e9373192-kube-api-access-78bct\") pod \"authorino-operator-657f44b778-bwhzb\" (UID: \"e3cecd69-f99b-4b43-98d1-d0c4e9373192\") " pod="kuadrant-system/authorino-operator-657f44b778-bwhzb" Apr 17 17:36:21.807718 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:21.807689 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78bct\" (UniqueName: \"kubernetes.io/projected/e3cecd69-f99b-4b43-98d1-d0c4e9373192-kube-api-access-78bct\") pod \"authorino-operator-657f44b778-bwhzb\" (UID: \"e3cecd69-f99b-4b43-98d1-d0c4e9373192\") " pod="kuadrant-system/authorino-operator-657f44b778-bwhzb" Apr 17 17:36:21.871589 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:21.871497 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-bwhzb" Apr 17 17:36:22.006344 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:22.006315 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-bwhzb"] Apr 17 17:36:22.009692 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:36:22.009658 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3cecd69_f99b_4b43_98d1_d0c4e9373192.slice/crio-5b9389aba4e6db0538d53593ce57cb78fbd98ee8166a6ab5fb7aa08d94f1cb60 WatchSource:0}: Error finding container 5b9389aba4e6db0538d53593ce57cb78fbd98ee8166a6ab5fb7aa08d94f1cb60: Status 404 returned error can't find the container with id 5b9389aba4e6db0538d53593ce57cb78fbd98ee8166a6ab5fb7aa08d94f1cb60 Apr 17 17:36:22.768050 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:22.768018 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rjgx_49a176d5-a780-4a38-b16f-90dc62742d5d/ovn-acl-logging/0.log" Apr 17 17:36:22.768641 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:22.768619 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rjgx_49a176d5-a780-4a38-b16f-90dc62742d5d/ovn-acl-logging/0.log" Apr 17 17:36:22.794277 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:22.794240 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-bwhzb" event={"ID":"e3cecd69-f99b-4b43-98d1-d0c4e9373192","Type":"ContainerStarted","Data":"5b9389aba4e6db0538d53593ce57cb78fbd98ee8166a6ab5fb7aa08d94f1cb60"} Apr 17 17:36:23.798574 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:23.798457 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-bwhzb" event={"ID":"e3cecd69-f99b-4b43-98d1-d0c4e9373192","Type":"ContainerStarted","Data":"4119c5a44d1bdc3e39e05a5c53b08d42aeb156f9ffbb5a4b3831f9e9f78ff967"} Apr 17 17:36:23.798927 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:23.798644 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-bwhzb" Apr 17 17:36:23.819237 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:23.819182 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-bwhzb" podStartSLOduration=1.379616682 podStartE2EDuration="2.819161595s" podCreationTimestamp="2026-04-17 17:36:21 +0000 UTC" firstStartedPulling="2026-04-17 17:36:22.012218258 +0000 UTC m=+599.734161233" lastFinishedPulling="2026-04-17 17:36:23.451763166 +0000 UTC m=+601.173706146" observedRunningTime="2026-04-17 17:36:23.818288247 +0000 UTC m=+601.540231245" watchObservedRunningTime="2026-04-17 17:36:23.819161595 +0000 UTC m=+601.541104594" Apr 17 17:36:34.803743 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:34.803703 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-bwhzb" Apr 17 17:36:48.570422 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:48.570380 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-zf8c6"] Apr 17 17:36:48.572335 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:48.572318 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-zf8c6" Apr 17 17:36:48.575026 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:48.575002 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-tx4w5\"" Apr 17 17:36:48.585752 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:48.585723 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-zf8c6"] Apr 17 17:36:48.630941 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:48.630899 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/810ca993-fb2d-427d-9e0a-dcb6508ff042-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-zf8c6\" (UID: \"810ca993-fb2d-427d-9e0a-dcb6508ff042\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-zf8c6" Apr 17 17:36:48.631117 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:48.630947 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbdtx\" (UniqueName: \"kubernetes.io/projected/810ca993-fb2d-427d-9e0a-dcb6508ff042-kube-api-access-hbdtx\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-zf8c6\" (UID: \"810ca993-fb2d-427d-9e0a-dcb6508ff042\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-zf8c6" Apr 17 17:36:48.732339 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:48.732280 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/810ca993-fb2d-427d-9e0a-dcb6508ff042-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-zf8c6\" (UID: \"810ca993-fb2d-427d-9e0a-dcb6508ff042\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-zf8c6" Apr 17 17:36:48.732339 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:48.732339 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbdtx\" (UniqueName: \"kubernetes.io/projected/810ca993-fb2d-427d-9e0a-dcb6508ff042-kube-api-access-hbdtx\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-zf8c6\" (UID: \"810ca993-fb2d-427d-9e0a-dcb6508ff042\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-zf8c6" Apr 17 17:36:48.732776 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:48.732752 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/810ca993-fb2d-427d-9e0a-dcb6508ff042-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-zf8c6\" (UID: \"810ca993-fb2d-427d-9e0a-dcb6508ff042\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-zf8c6" Apr 17 17:36:48.740855 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:48.740822 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbdtx\" (UniqueName: \"kubernetes.io/projected/810ca993-fb2d-427d-9e0a-dcb6508ff042-kube-api-access-hbdtx\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-zf8c6\" (UID: \"810ca993-fb2d-427d-9e0a-dcb6508ff042\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-zf8c6" Apr 17 17:36:48.881977 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:48.881884 2577 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-zf8c6" Apr 17 17:36:49.013671 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:49.013630 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-zf8c6"] Apr 17 17:36:49.017396 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:36:49.017366 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod810ca993_fb2d_427d_9e0a_dcb6508ff042.slice/crio-eca9262fecaba146565cd20927dab7dce793fe68f590f5e93c1dc3afc2b90944 WatchSource:0}: Error finding container eca9262fecaba146565cd20927dab7dce793fe68f590f5e93c1dc3afc2b90944: Status 404 returned error can't find the container with id eca9262fecaba146565cd20927dab7dce793fe68f590f5e93c1dc3afc2b90944 Apr 17 17:36:49.889920 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:49.889873 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-zf8c6" event={"ID":"810ca993-fb2d-427d-9e0a-dcb6508ff042","Type":"ContainerStarted","Data":"eca9262fecaba146565cd20927dab7dce793fe68f590f5e93c1dc3afc2b90944"} Apr 17 17:36:53.903813 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:53.903775 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-zf8c6" event={"ID":"810ca993-fb2d-427d-9e0a-dcb6508ff042","Type":"ContainerStarted","Data":"2b5e07523ab4136cc3e6d071ded3769abb8a8f0809b1035e79dea2a60ab0e709"} Apr 17 17:36:53.904258 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:53.903850 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-zf8c6" Apr 17 17:36:53.926906 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:36:53.926847 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-zf8c6" podStartSLOduration=1.750824577 podStartE2EDuration="5.926817289s" podCreationTimestamp="2026-04-17 17:36:48 +0000 UTC" firstStartedPulling="2026-04-17 17:36:49.019765783 +0000 UTC m=+626.741708758" lastFinishedPulling="2026-04-17 17:36:53.195758488 +0000 UTC m=+630.917701470" observedRunningTime="2026-04-17 17:36:53.925010142 +0000 UTC m=+631.646953154" watchObservedRunningTime="2026-04-17 17:36:53.926817289 +0000 UTC m=+631.648760332" Apr 17 17:37:04.909911 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:37:04.909875 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-zf8c6" Apr 17 17:38:55.062645 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:55.062603 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-595fbdcc8c-6f6ln"] Apr 17 17:38:55.065957 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:55.065934 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-595fbdcc8c-6f6ln" Apr 17 17:38:55.068119 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:55.068094 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-7fd6d6f45d-mb7sk"] Apr 17 17:38:55.068436 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:55.068416 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-gtt7d\"" Apr 17 17:38:55.068656 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:55.068637 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 17 17:38:55.071236 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:55.071218 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7fd6d6f45d-mb7sk" Apr 17 17:38:55.073949 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:55.073929 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 17 17:38:55.076667 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:55.076649 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-gl78k\"" Apr 17 17:38:55.086833 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:55.086802 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-595fbdcc8c-6f6ln"] Apr 17 17:38:55.094542 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:55.094509 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7fd6d6f45d-mb7sk"] Apr 17 17:38:55.101501 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:55.101446 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3582fe2a-7a39-4f73-89ea-4fc94d053765-maas-api-tls\") pod \"maas-api-7fd6d6f45d-mb7sk\" (UID: \"3582fe2a-7a39-4f73-89ea-4fc94d053765\") " pod="opendatahub/maas-api-7fd6d6f45d-mb7sk" Apr 17 17:38:55.101772 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:55.101509 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb6qg\" (UniqueName: \"kubernetes.io/projected/3582fe2a-7a39-4f73-89ea-4fc94d053765-kube-api-access-kb6qg\") pod \"maas-api-7fd6d6f45d-mb7sk\" (UID: \"3582fe2a-7a39-4f73-89ea-4fc94d053765\") " pod="opendatahub/maas-api-7fd6d6f45d-mb7sk" Apr 17 17:38:55.101772 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:55.101599 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8gpp\" (UniqueName: \"kubernetes.io/projected/30d13474-31ec-4592-a7c5-38cfafc3dcb1-kube-api-access-t8gpp\") pod \"maas-controller-595fbdcc8c-6f6ln\" (UID: \"30d13474-31ec-4592-a7c5-38cfafc3dcb1\") " pod="opendatahub/maas-controller-595fbdcc8c-6f6ln" Apr 17 17:38:55.202988 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:55.202950 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3582fe2a-7a39-4f73-89ea-4fc94d053765-maas-api-tls\") pod \"maas-api-7fd6d6f45d-mb7sk\" (UID: \"3582fe2a-7a39-4f73-89ea-4fc94d053765\") " pod="opendatahub/maas-api-7fd6d6f45d-mb7sk" Apr 17 17:38:55.203186 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:55.203006 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kb6qg\" (UniqueName: 
\"kubernetes.io/projected/3582fe2a-7a39-4f73-89ea-4fc94d053765-kube-api-access-kb6qg\") pod \"maas-api-7fd6d6f45d-mb7sk\" (UID: \"3582fe2a-7a39-4f73-89ea-4fc94d053765\") " pod="opendatahub/maas-api-7fd6d6f45d-mb7sk" Apr 17 17:38:55.203186 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:55.203048 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t8gpp\" (UniqueName: \"kubernetes.io/projected/30d13474-31ec-4592-a7c5-38cfafc3dcb1-kube-api-access-t8gpp\") pod \"maas-controller-595fbdcc8c-6f6ln\" (UID: \"30d13474-31ec-4592-a7c5-38cfafc3dcb1\") " pod="opendatahub/maas-controller-595fbdcc8c-6f6ln" Apr 17 17:38:55.203186 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:38:55.203110 2577 secret.go:189] Couldn't get secret opendatahub/maas-api-serving-cert: secret "maas-api-serving-cert" not found Apr 17 17:38:55.203186 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:38:55.203173 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3582fe2a-7a39-4f73-89ea-4fc94d053765-maas-api-tls podName:3582fe2a-7a39-4f73-89ea-4fc94d053765 nodeName:}" failed. No retries permitted until 2026-04-17 17:38:55.703155877 +0000 UTC m=+753.425098858 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "maas-api-tls" (UniqueName: "kubernetes.io/secret/3582fe2a-7a39-4f73-89ea-4fc94d053765-maas-api-tls") pod "maas-api-7fd6d6f45d-mb7sk" (UID: "3582fe2a-7a39-4f73-89ea-4fc94d053765") : secret "maas-api-serving-cert" not found Apr 17 17:38:55.213575 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:55.213549 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb6qg\" (UniqueName: \"kubernetes.io/projected/3582fe2a-7a39-4f73-89ea-4fc94d053765-kube-api-access-kb6qg\") pod \"maas-api-7fd6d6f45d-mb7sk\" (UID: \"3582fe2a-7a39-4f73-89ea-4fc94d053765\") " pod="opendatahub/maas-api-7fd6d6f45d-mb7sk" Apr 17 17:38:55.213722 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:55.213573 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8gpp\" (UniqueName: \"kubernetes.io/projected/30d13474-31ec-4592-a7c5-38cfafc3dcb1-kube-api-access-t8gpp\") pod \"maas-controller-595fbdcc8c-6f6ln\" (UID: \"30d13474-31ec-4592-a7c5-38cfafc3dcb1\") " pod="opendatahub/maas-controller-595fbdcc8c-6f6ln" Apr 17 17:38:55.377968 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:55.377874 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-595fbdcc8c-6f6ln" Apr 17 17:38:55.505338 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:55.505306 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-595fbdcc8c-6f6ln"] Apr 17 17:38:55.507449 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:38:55.507422 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30d13474_31ec_4592_a7c5_38cfafc3dcb1.slice/crio-89b96f4995616880d0b00b2089b31b1e8d57b0a32e226144e18b998369a07a53 WatchSource:0}: Error finding container 89b96f4995616880d0b00b2089b31b1e8d57b0a32e226144e18b998369a07a53: Status 404 returned error can't find the container with id 89b96f4995616880d0b00b2089b31b1e8d57b0a32e226144e18b998369a07a53 Apr 17 17:38:55.707965 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:55.707916 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3582fe2a-7a39-4f73-89ea-4fc94d053765-maas-api-tls\") pod \"maas-api-7fd6d6f45d-mb7sk\" (UID: \"3582fe2a-7a39-4f73-89ea-4fc94d053765\") " pod="opendatahub/maas-api-7fd6d6f45d-mb7sk" Apr 17 17:38:55.710654 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:55.710627 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3582fe2a-7a39-4f73-89ea-4fc94d053765-maas-api-tls\") pod \"maas-api-7fd6d6f45d-mb7sk\" (UID: \"3582fe2a-7a39-4f73-89ea-4fc94d053765\") " pod="opendatahub/maas-api-7fd6d6f45d-mb7sk" Apr 17 17:38:55.972149 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:55.972050 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-5987f9d78-75gk6"] Apr 17 17:38:55.976880 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:55.976849 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-5987f9d78-75gk6" Apr 17 17:38:55.983905 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:55.983870 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-5987f9d78-75gk6"] Apr 17 17:38:55.984109 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:55.984085 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-7fd6d6f45d-mb7sk" Apr 17 17:38:56.010952 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:56.010918 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxf6q\" (UniqueName: \"kubernetes.io/projected/a39964b6-f1a0-4f19-ba7e-102c6db18d0d-kube-api-access-cxf6q\") pod \"maas-api-5987f9d78-75gk6\" (UID: \"a39964b6-f1a0-4f19-ba7e-102c6db18d0d\") " pod="opendatahub/maas-api-5987f9d78-75gk6" Apr 17 17:38:56.011136 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:56.010973 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/a39964b6-f1a0-4f19-ba7e-102c6db18d0d-maas-api-tls\") pod \"maas-api-5987f9d78-75gk6\" (UID: \"a39964b6-f1a0-4f19-ba7e-102c6db18d0d\") " pod="opendatahub/maas-api-5987f9d78-75gk6" Apr 17 17:38:56.112502 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:56.112444 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxf6q\" (UniqueName: \"kubernetes.io/projected/a39964b6-f1a0-4f19-ba7e-102c6db18d0d-kube-api-access-cxf6q\") pod \"maas-api-5987f9d78-75gk6\" (UID: \"a39964b6-f1a0-4f19-ba7e-102c6db18d0d\") " pod="opendatahub/maas-api-5987f9d78-75gk6" Apr 17 17:38:56.113129 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:56.112540 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/a39964b6-f1a0-4f19-ba7e-102c6db18d0d-maas-api-tls\") pod \"maas-api-5987f9d78-75gk6\" (UID: \"a39964b6-f1a0-4f19-ba7e-102c6db18d0d\") " pod="opendatahub/maas-api-5987f9d78-75gk6" Apr 17 17:38:56.115616 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:56.115588 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/a39964b6-f1a0-4f19-ba7e-102c6db18d0d-maas-api-tls\") pod \"maas-api-5987f9d78-75gk6\" (UID: \"a39964b6-f1a0-4f19-ba7e-102c6db18d0d\") " pod="opendatahub/maas-api-5987f9d78-75gk6" Apr 17 17:38:56.128784 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:56.128744 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxf6q\" (UniqueName: \"kubernetes.io/projected/a39964b6-f1a0-4f19-ba7e-102c6db18d0d-kube-api-access-cxf6q\") pod \"maas-api-5987f9d78-75gk6\" (UID: \"a39964b6-f1a0-4f19-ba7e-102c6db18d0d\") " pod="opendatahub/maas-api-5987f9d78-75gk6" Apr 17 17:38:56.141448 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:56.140972 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7fd6d6f45d-mb7sk"] Apr 17 17:38:56.144914 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:38:56.144878 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3582fe2a_7a39_4f73_89ea_4fc94d053765.slice/crio-3e2faf74cd8b86bda5662ba8fb62e434ba464bffddaaf44d23c1f9dad230602a WatchSource:0}: Error finding container 3e2faf74cd8b86bda5662ba8fb62e434ba464bffddaaf44d23c1f9dad230602a: Status 404 returned error can't find the container with id 3e2faf74cd8b86bda5662ba8fb62e434ba464bffddaaf44d23c1f9dad230602a Apr 17 17:38:56.289752 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:56.289643 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-5987f9d78-75gk6" Apr 17 17:38:56.317944 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:56.317901 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-595fbdcc8c-6f6ln" event={"ID":"30d13474-31ec-4592-a7c5-38cfafc3dcb1","Type":"ContainerStarted","Data":"89b96f4995616880d0b00b2089b31b1e8d57b0a32e226144e18b998369a07a53"} Apr 17 17:38:56.319493 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:56.319440 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7fd6d6f45d-mb7sk" event={"ID":"3582fe2a-7a39-4f73-89ea-4fc94d053765","Type":"ContainerStarted","Data":"3e2faf74cd8b86bda5662ba8fb62e434ba464bffddaaf44d23c1f9dad230602a"} Apr 17 17:38:56.456154 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:56.456112 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-5987f9d78-75gk6"] Apr 17 17:38:56.460647 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:38:56.460608 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda39964b6_f1a0_4f19_ba7e_102c6db18d0d.slice/crio-4eb69375586b39a6e5a570032b106eafea889d809b27843e2925beb249a5c83c WatchSource:0}: Error finding container 4eb69375586b39a6e5a570032b106eafea889d809b27843e2925beb249a5c83c: Status 404 returned error can't find the container with id 4eb69375586b39a6e5a570032b106eafea889d809b27843e2925beb249a5c83c Apr 17 17:38:57.329447 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:57.329304 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-5987f9d78-75gk6" event={"ID":"a39964b6-f1a0-4f19-ba7e-102c6db18d0d","Type":"ContainerStarted","Data":"4eb69375586b39a6e5a570032b106eafea889d809b27843e2925beb249a5c83c"} Apr 17 17:38:59.339914 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:59.339875 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7fd6d6f45d-mb7sk" event={"ID":"3582fe2a-7a39-4f73-89ea-4fc94d053765","Type":"ContainerStarted","Data":"1b0858edfabcbdda30f8c4ee2d3db4ffa5247de05be3798b2147e8c81a4e62c4"} Apr 17 17:38:59.340384 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:59.340017 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-7fd6d6f45d-mb7sk" Apr 17 17:38:59.341397 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:59.341376 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-5987f9d78-75gk6" event={"ID":"a39964b6-f1a0-4f19-ba7e-102c6db18d0d","Type":"ContainerStarted","Data":"66a752087c1ed7a1401c7b6afa7a804c44c2fa19bd5e775dc6c287955ac0edc3"} Apr 17 17:38:59.341539 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:59.341523 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-5987f9d78-75gk6" Apr 17 17:38:59.342743 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:59.342717 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-595fbdcc8c-6f6ln" event={"ID":"30d13474-31ec-4592-a7c5-38cfafc3dcb1","Type":"ContainerStarted","Data":"46b32c2ec7661b9b977905f984fbb96efb4564d1823015c02d419e8b5350352f"} Apr 17 17:38:59.342838 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:59.342752 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-595fbdcc8c-6f6ln" Apr 17 17:38:59.360983 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:59.360930 2577 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="opendatahub/maas-api-7fd6d6f45d-mb7sk" podStartSLOduration=1.8521105260000001 podStartE2EDuration="4.360914559s" podCreationTimestamp="2026-04-17 17:38:55 +0000 UTC" firstStartedPulling="2026-04-17 17:38:56.147444981 +0000 UTC m=+753.869387961" lastFinishedPulling="2026-04-17 17:38:58.656249009 +0000 UTC m=+756.378191994" observedRunningTime="2026-04-17 17:38:59.358045512 +0000 UTC m=+757.079988509" watchObservedRunningTime="2026-04-17 17:38:59.360914559 +0000 UTC m=+757.082857548" Apr 17 17:38:59.378552 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:59.378492 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-595fbdcc8c-6f6ln" podStartSLOduration=1.237053592 podStartE2EDuration="4.378456411s" podCreationTimestamp="2026-04-17 17:38:55 +0000 UTC" firstStartedPulling="2026-04-17 17:38:55.508715399 +0000 UTC m=+753.230658379" lastFinishedPulling="2026-04-17 17:38:58.650118207 +0000 UTC m=+756.372061198" observedRunningTime="2026-04-17 17:38:59.377934732 +0000 UTC m=+757.099877744" watchObservedRunningTime="2026-04-17 17:38:59.378456411 +0000 UTC m=+757.100399410" Apr 17 17:38:59.396910 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:38:59.396838 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-5987f9d78-75gk6" podStartSLOduration=2.209614788 podStartE2EDuration="4.396821678s" podCreationTimestamp="2026-04-17 17:38:55 +0000 UTC" firstStartedPulling="2026-04-17 17:38:56.462911924 +0000 UTC m=+754.184854902" lastFinishedPulling="2026-04-17 17:38:58.650118803 +0000 UTC m=+756.372061792" observedRunningTime="2026-04-17 17:38:59.396417928 +0000 UTC m=+757.118360926" watchObservedRunningTime="2026-04-17 17:38:59.396821678 +0000 UTC m=+757.118764678" Apr 17 17:39:05.351290 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:05.351259 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-7fd6d6f45d-mb7sk" Apr 17 17:39:05.351742 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:05.351310 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-5987f9d78-75gk6" Apr 17 17:39:05.411838 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:05.411805 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-7fd6d6f45d-mb7sk"] Apr 17 17:39:05.412097 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:05.412039 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-7fd6d6f45d-mb7sk" podUID="3582fe2a-7a39-4f73-89ea-4fc94d053765" containerName="maas-api" containerID="cri-o://1b0858edfabcbdda30f8c4ee2d3db4ffa5247de05be3798b2147e8c81a4e62c4" gracePeriod=30 Apr 17 17:39:05.665392 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:05.665364 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-7fd6d6f45d-mb7sk" Apr 17 17:39:05.699253 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:05.699219 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3582fe2a-7a39-4f73-89ea-4fc94d053765-maas-api-tls\") pod \"3582fe2a-7a39-4f73-89ea-4fc94d053765\" (UID: \"3582fe2a-7a39-4f73-89ea-4fc94d053765\") " Apr 17 17:39:05.699449 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:05.699262 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb6qg\" (UniqueName: \"kubernetes.io/projected/3582fe2a-7a39-4f73-89ea-4fc94d053765-kube-api-access-kb6qg\") pod \"3582fe2a-7a39-4f73-89ea-4fc94d053765\" (UID: \"3582fe2a-7a39-4f73-89ea-4fc94d053765\") " Apr 17 17:39:05.701572 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:05.701538 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3582fe2a-7a39-4f73-89ea-4fc94d053765-kube-api-access-kb6qg" (OuterVolumeSpecName: "kube-api-access-kb6qg") pod "3582fe2a-7a39-4f73-89ea-4fc94d053765" (UID: "3582fe2a-7a39-4f73-89ea-4fc94d053765"). InnerVolumeSpecName "kube-api-access-kb6qg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:39:05.701689 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:05.701629 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3582fe2a-7a39-4f73-89ea-4fc94d053765-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "3582fe2a-7a39-4f73-89ea-4fc94d053765" (UID: "3582fe2a-7a39-4f73-89ea-4fc94d053765"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:39:05.800414 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:05.800367 2577 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3582fe2a-7a39-4f73-89ea-4fc94d053765-maas-api-tls\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 17 17:39:05.800414 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:05.800410 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kb6qg\" (UniqueName: \"kubernetes.io/projected/3582fe2a-7a39-4f73-89ea-4fc94d053765-kube-api-access-kb6qg\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 17 17:39:06.365998 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:06.365962 2577 generic.go:358] "Generic (PLEG): container finished" podID="3582fe2a-7a39-4f73-89ea-4fc94d053765" containerID="1b0858edfabcbdda30f8c4ee2d3db4ffa5247de05be3798b2147e8c81a4e62c4" exitCode=0 Apr 17 17:39:06.366427 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:06.366009 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7fd6d6f45d-mb7sk" event={"ID":"3582fe2a-7a39-4f73-89ea-4fc94d053765","Type":"ContainerDied","Data":"1b0858edfabcbdda30f8c4ee2d3db4ffa5247de05be3798b2147e8c81a4e62c4"} Apr 17 17:39:06.366427 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:06.366023 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-7fd6d6f45d-mb7sk" Apr 17 17:39:06.366427 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:06.366039 2577 scope.go:117] "RemoveContainer" containerID="1b0858edfabcbdda30f8c4ee2d3db4ffa5247de05be3798b2147e8c81a4e62c4" Apr 17 17:39:06.366427 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:06.366030 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7fd6d6f45d-mb7sk" event={"ID":"3582fe2a-7a39-4f73-89ea-4fc94d053765","Type":"ContainerDied","Data":"3e2faf74cd8b86bda5662ba8fb62e434ba464bffddaaf44d23c1f9dad230602a"} Apr 17 17:39:06.374424 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:06.374406 2577 scope.go:117] "RemoveContainer" containerID="1b0858edfabcbdda30f8c4ee2d3db4ffa5247de05be3798b2147e8c81a4e62c4" Apr 17 17:39:06.374701 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:39:06.374684 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b0858edfabcbdda30f8c4ee2d3db4ffa5247de05be3798b2147e8c81a4e62c4\": container with ID starting with 1b0858edfabcbdda30f8c4ee2d3db4ffa5247de05be3798b2147e8c81a4e62c4 not found: ID does not exist" containerID="1b0858edfabcbdda30f8c4ee2d3db4ffa5247de05be3798b2147e8c81a4e62c4" Apr 17 17:39:06.374752 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:06.374710 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b0858edfabcbdda30f8c4ee2d3db4ffa5247de05be3798b2147e8c81a4e62c4"} err="failed to get container status \"1b0858edfabcbdda30f8c4ee2d3db4ffa5247de05be3798b2147e8c81a4e62c4\": rpc error: code = NotFound desc = could not find container \"1b0858edfabcbdda30f8c4ee2d3db4ffa5247de05be3798b2147e8c81a4e62c4\": container with ID starting with 1b0858edfabcbdda30f8c4ee2d3db4ffa5247de05be3798b2147e8c81a4e62c4 not found: ID does not exist" Apr 17 17:39:06.386363 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:06.386336 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-7fd6d6f45d-mb7sk"] Apr 17 17:39:06.390439 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:06.390414 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-7fd6d6f45d-mb7sk"] Apr 17 17:39:06.855155 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:06.855120 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3582fe2a-7a39-4f73-89ea-4fc94d053765" path="/var/lib/kubelet/pods/3582fe2a-7a39-4f73-89ea-4fc94d053765/volumes" Apr 17 17:39:10.352110 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:10.352080 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-595fbdcc8c-6f6ln" Apr 17 17:39:10.650613 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:10.650525 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-d6bc59649-92ck8"] Apr 17 17:39:10.650893 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:10.650879 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3582fe2a-7a39-4f73-89ea-4fc94d053765" containerName="maas-api" Apr 17 17:39:10.650944 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:10.650894 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="3582fe2a-7a39-4f73-89ea-4fc94d053765" containerName="maas-api" Apr 17 17:39:10.650993 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:10.650984 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="3582fe2a-7a39-4f73-89ea-4fc94d053765" containerName="maas-api" 
Apr 17 17:39:10.654034 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:10.654009 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-d6bc59649-92ck8" Apr 17 17:39:10.662521 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:10.662487 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-d6bc59649-92ck8"] Apr 17 17:39:10.744264 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:10.744227 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bcnh\" (UniqueName: \"kubernetes.io/projected/2182b691-66b0-45db-9286-8fad7c871a7a-kube-api-access-8bcnh\") pod \"maas-controller-d6bc59649-92ck8\" (UID: \"2182b691-66b0-45db-9286-8fad7c871a7a\") " pod="opendatahub/maas-controller-d6bc59649-92ck8" Apr 17 17:39:10.845123 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:10.845080 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bcnh\" (UniqueName: \"kubernetes.io/projected/2182b691-66b0-45db-9286-8fad7c871a7a-kube-api-access-8bcnh\") pod \"maas-controller-d6bc59649-92ck8\" (UID: \"2182b691-66b0-45db-9286-8fad7c871a7a\") " pod="opendatahub/maas-controller-d6bc59649-92ck8" Apr 17 17:39:10.853437 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:10.853407 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bcnh\" (UniqueName: \"kubernetes.io/projected/2182b691-66b0-45db-9286-8fad7c871a7a-kube-api-access-8bcnh\") pod \"maas-controller-d6bc59649-92ck8\" (UID: \"2182b691-66b0-45db-9286-8fad7c871a7a\") " pod="opendatahub/maas-controller-d6bc59649-92ck8" Apr 17 17:39:10.965454 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:10.965412 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-d6bc59649-92ck8" Apr 17 17:39:11.093581 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:11.093555 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-d6bc59649-92ck8"] Apr 17 17:39:11.095820 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:39:11.095788 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2182b691_66b0_45db_9286_8fad7c871a7a.slice/crio-addee232b22c791b3fdb04c17657377d245a3c307bc75dbaef133664f4b8e1bc WatchSource:0}: Error finding container addee232b22c791b3fdb04c17657377d245a3c307bc75dbaef133664f4b8e1bc: Status 404 returned error can't find the container with id addee232b22c791b3fdb04c17657377d245a3c307bc75dbaef133664f4b8e1bc Apr 17 17:39:11.386078 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:11.385992 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-d6bc59649-92ck8" event={"ID":"2182b691-66b0-45db-9286-8fad7c871a7a","Type":"ContainerStarted","Data":"addee232b22c791b3fdb04c17657377d245a3c307bc75dbaef133664f4b8e1bc"} Apr 17 17:39:12.390402 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:12.390361 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-d6bc59649-92ck8" event={"ID":"2182b691-66b0-45db-9286-8fad7c871a7a","Type":"ContainerStarted","Data":"8f61234b39cfcb4d66062e70042cdeba7ef1b8b55566d7a4462f52f28c3da0ab"} Apr 17 17:39:12.390789 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:12.390527 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-d6bc59649-92ck8" Apr 17 17:39:12.409981 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:12.409928 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-d6bc59649-92ck8" podStartSLOduration=2.087767386 podStartE2EDuration="2.409912887s" podCreationTimestamp="2026-04-17 17:39:10 +0000 UTC" firstStartedPulling="2026-04-17 17:39:11.097091461 +0000 UTC m=+768.819034441" lastFinishedPulling="2026-04-17 17:39:11.419236952 +0000 UTC m=+769.141179942" observedRunningTime="2026-04-17 17:39:12.407718728 +0000 UTC m=+770.129661738" watchObservedRunningTime="2026-04-17 17:39:12.409912887 +0000 UTC m=+770.131855884" Apr 17 17:39:23.401920 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:23.401880 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-d6bc59649-92ck8" Apr 17 17:39:23.447932 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:23.447893 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-595fbdcc8c-6f6ln"] Apr 17 17:39:23.448167 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:23.448130 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-595fbdcc8c-6f6ln" podUID="30d13474-31ec-4592-a7c5-38cfafc3dcb1" containerName="manager" containerID="cri-o://46b32c2ec7661b9b977905f984fbb96efb4564d1823015c02d419e8b5350352f" gracePeriod=10 Apr 17 17:39:23.690970 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:23.690944 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-595fbdcc8c-6f6ln" Apr 17 17:39:23.761138 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:23.761092 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8gpp\" (UniqueName: \"kubernetes.io/projected/30d13474-31ec-4592-a7c5-38cfafc3dcb1-kube-api-access-t8gpp\") pod \"30d13474-31ec-4592-a7c5-38cfafc3dcb1\" (UID: \"30d13474-31ec-4592-a7c5-38cfafc3dcb1\") " Apr 17 17:39:23.763197 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:23.763167 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30d13474-31ec-4592-a7c5-38cfafc3dcb1-kube-api-access-t8gpp" (OuterVolumeSpecName: "kube-api-access-t8gpp") pod "30d13474-31ec-4592-a7c5-38cfafc3dcb1" (UID: "30d13474-31ec-4592-a7c5-38cfafc3dcb1"). InnerVolumeSpecName "kube-api-access-t8gpp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:39:23.861965 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:23.861920 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t8gpp\" (UniqueName: \"kubernetes.io/projected/30d13474-31ec-4592-a7c5-38cfafc3dcb1-kube-api-access-t8gpp\") on node \"ip-10-0-139-96.ec2.internal\" DevicePath \"\"" Apr 17 17:39:24.431440 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:24.431400 2577 generic.go:358] "Generic (PLEG): container finished" podID="30d13474-31ec-4592-a7c5-38cfafc3dcb1" containerID="46b32c2ec7661b9b977905f984fbb96efb4564d1823015c02d419e8b5350352f" exitCode=0 Apr 17 17:39:24.431877 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:24.431460 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-595fbdcc8c-6f6ln" Apr 17 17:39:24.431877 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:24.431491 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-595fbdcc8c-6f6ln" event={"ID":"30d13474-31ec-4592-a7c5-38cfafc3dcb1","Type":"ContainerDied","Data":"46b32c2ec7661b9b977905f984fbb96efb4564d1823015c02d419e8b5350352f"} Apr 17 17:39:24.431877 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:24.431528 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-595fbdcc8c-6f6ln" event={"ID":"30d13474-31ec-4592-a7c5-38cfafc3dcb1","Type":"ContainerDied","Data":"89b96f4995616880d0b00b2089b31b1e8d57b0a32e226144e18b998369a07a53"} Apr 17 17:39:24.431877 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:24.431546 2577 scope.go:117] "RemoveContainer" containerID="46b32c2ec7661b9b977905f984fbb96efb4564d1823015c02d419e8b5350352f" Apr 17 17:39:24.440072 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:24.440051 2577 scope.go:117] "RemoveContainer" containerID="46b32c2ec7661b9b977905f984fbb96efb4564d1823015c02d419e8b5350352f" Apr 17 17:39:24.440348 ip-10-0-139-96 kubenswrapper[2577]: E0417 17:39:24.440325 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46b32c2ec7661b9b977905f984fbb96efb4564d1823015c02d419e8b5350352f\": container with ID starting with 46b32c2ec7661b9b977905f984fbb96efb4564d1823015c02d419e8b5350352f not found: ID does not exist" containerID="46b32c2ec7661b9b977905f984fbb96efb4564d1823015c02d419e8b5350352f" Apr 17 17:39:24.440417 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:24.440356 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46b32c2ec7661b9b977905f984fbb96efb4564d1823015c02d419e8b5350352f"} 
err="failed to get container status \"46b32c2ec7661b9b977905f984fbb96efb4564d1823015c02d419e8b5350352f\": rpc error: code = NotFound desc = could not find container \"46b32c2ec7661b9b977905f984fbb96efb4564d1823015c02d419e8b5350352f\": container with ID starting with 46b32c2ec7661b9b977905f984fbb96efb4564d1823015c02d419e8b5350352f not found: ID does not exist" Apr 17 17:39:24.454365 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:24.454329 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-595fbdcc8c-6f6ln"] Apr 17 17:39:24.459301 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:24.459267 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-595fbdcc8c-6f6ln"] Apr 17 17:39:24.856116 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:39:24.856084 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30d13474-31ec-4592-a7c5-38cfafc3dcb1" path="/var/lib/kubelet/pods/30d13474-31ec-4592-a7c5-38cfafc3dcb1/volumes" Apr 17 17:40:18.887484 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:18.887430 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz"] Apr 17 17:40:18.888071 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:18.887796 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30d13474-31ec-4592-a7c5-38cfafc3dcb1" containerName="manager" Apr 17 17:40:18.888071 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:18.887808 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="30d13474-31ec-4592-a7c5-38cfafc3dcb1" containerName="manager" Apr 17 17:40:18.888071 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:18.887887 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="30d13474-31ec-4592-a7c5-38cfafc3dcb1" containerName="manager" Apr 17 17:40:18.891190 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:18.891170 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz" Apr 17 17:40:18.894273 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:18.894253 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 17 17:40:18.894273 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:18.894262 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-f496h\"" Apr 17 17:40:18.894425 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:18.894262 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 17 17:40:18.894425 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:18.894262 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 17 17:40:18.900719 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:18.900686 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz"] Apr 17 17:40:19.024876 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:19.024823 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvhg6\" (UniqueName: \"kubernetes.io/projected/3afd1417-92d5-4c5d-bcbe-14f74b58a85b-kube-api-access-cvhg6\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9krmz\" (UID: \"3afd1417-92d5-4c5d-bcbe-14f74b58a85b\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz" Apr 17 17:40:19.024876 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:19.024874 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3afd1417-92d5-4c5d-bcbe-14f74b58a85b-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9krmz\" (UID: \"3afd1417-92d5-4c5d-bcbe-14f74b58a85b\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz" Apr 17 17:40:19.025106 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:19.024901 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3afd1417-92d5-4c5d-bcbe-14f74b58a85b-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9krmz\" (UID: \"3afd1417-92d5-4c5d-bcbe-14f74b58a85b\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz" Apr 17 17:40:19.025106 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:19.024947 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3afd1417-92d5-4c5d-bcbe-14f74b58a85b-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9krmz\" (UID: \"3afd1417-92d5-4c5d-bcbe-14f74b58a85b\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz" Apr 17 17:40:19.025106 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:19.024992 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3afd1417-92d5-4c5d-bcbe-14f74b58a85b-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9krmz\" (UID: \"3afd1417-92d5-4c5d-bcbe-14f74b58a85b\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz" Apr 17 17:40:19.025106 
ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:19.025018 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3afd1417-92d5-4c5d-bcbe-14f74b58a85b-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9krmz\" (UID: \"3afd1417-92d5-4c5d-bcbe-14f74b58a85b\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz" Apr 17 17:40:19.126195 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:19.126155 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3afd1417-92d5-4c5d-bcbe-14f74b58a85b-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9krmz\" (UID: \"3afd1417-92d5-4c5d-bcbe-14f74b58a85b\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz" Apr 17 17:40:19.126195 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:19.126202 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3afd1417-92d5-4c5d-bcbe-14f74b58a85b-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9krmz\" (UID: \"3afd1417-92d5-4c5d-bcbe-14f74b58a85b\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz" Apr 17 17:40:19.126488 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:19.126237 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3afd1417-92d5-4c5d-bcbe-14f74b58a85b-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9krmz\" (UID: \"3afd1417-92d5-4c5d-bcbe-14f74b58a85b\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz" Apr 17 17:40:19.126488 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:19.126269 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3afd1417-92d5-4c5d-bcbe-14f74b58a85b-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9krmz\" (UID: \"3afd1417-92d5-4c5d-bcbe-14f74b58a85b\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz" Apr 17 17:40:19.126488 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:19.126349 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvhg6\" (UniqueName: \"kubernetes.io/projected/3afd1417-92d5-4c5d-bcbe-14f74b58a85b-kube-api-access-cvhg6\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9krmz\" (UID: \"3afd1417-92d5-4c5d-bcbe-14f74b58a85b\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz" Apr 17 17:40:19.126488 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:19.126371 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3afd1417-92d5-4c5d-bcbe-14f74b58a85b-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9krmz\" (UID: \"3afd1417-92d5-4c5d-bcbe-14f74b58a85b\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz" Apr 17 17:40:19.126708 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:19.126687 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3afd1417-92d5-4c5d-bcbe-14f74b58a85b-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9krmz\" (UID: 
\"3afd1417-92d5-4c5d-bcbe-14f74b58a85b\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz" Apr 17 17:40:19.126759 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:19.126713 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3afd1417-92d5-4c5d-bcbe-14f74b58a85b-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9krmz\" (UID: \"3afd1417-92d5-4c5d-bcbe-14f74b58a85b\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz" Apr 17 17:40:19.127078 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:19.127049 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3afd1417-92d5-4c5d-bcbe-14f74b58a85b-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9krmz\" (UID: \"3afd1417-92d5-4c5d-bcbe-14f74b58a85b\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz" Apr 17 17:40:19.129093 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:19.129063 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3afd1417-92d5-4c5d-bcbe-14f74b58a85b-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9krmz\" (UID: \"3afd1417-92d5-4c5d-bcbe-14f74b58a85b\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz" Apr 17 17:40:19.129335 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:19.129316 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3afd1417-92d5-4c5d-bcbe-14f74b58a85b-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9krmz\" (UID: \"3afd1417-92d5-4c5d-bcbe-14f74b58a85b\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz" Apr 17 17:40:19.135054 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:19.135020 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvhg6\" (UniqueName: \"kubernetes.io/projected/3afd1417-92d5-4c5d-bcbe-14f74b58a85b-kube-api-access-cvhg6\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-9krmz\" (UID: \"3afd1417-92d5-4c5d-bcbe-14f74b58a85b\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz" Apr 17 17:40:19.201780 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:19.201740 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz" Apr 17 17:40:19.335259 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:19.335184 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz"] Apr 17 17:40:19.338864 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:40:19.338826 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3afd1417_92d5_4c5d_bcbe_14f74b58a85b.slice/crio-6872f060d131e3b15bcaecd0c56ac89f147ae315aad022c0ae6d2a3dd6bdb4e1 WatchSource:0}: Error finding container 6872f060d131e3b15bcaecd0c56ac89f147ae315aad022c0ae6d2a3dd6bdb4e1: Status 404 returned error can't find the container with id 6872f060d131e3b15bcaecd0c56ac89f147ae315aad022c0ae6d2a3dd6bdb4e1 Apr 17 17:40:19.340906 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:19.340889 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:40:19.622022 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:19.621925 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz" event={"ID":"3afd1417-92d5-4c5d-bcbe-14f74b58a85b","Type":"ContainerStarted","Data":"6872f060d131e3b15bcaecd0c56ac89f147ae315aad022c0ae6d2a3dd6bdb4e1"} Apr 17 17:40:26.652993 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:26.652955 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz" event={"ID":"3afd1417-92d5-4c5d-bcbe-14f74b58a85b","Type":"ContainerStarted","Data":"f3d94e7cc003737a535634f8c50d292e08fac21558a337898b2bf03457f60c43"} Apr 17 17:40:34.683939 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:34.683903 2577 generic.go:358] "Generic (PLEG): container finished" podID="3afd1417-92d5-4c5d-bcbe-14f74b58a85b" containerID="f3d94e7cc003737a535634f8c50d292e08fac21558a337898b2bf03457f60c43" exitCode=0 Apr 17 17:40:34.684337 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:34.683976 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz" event={"ID":"3afd1417-92d5-4c5d-bcbe-14f74b58a85b","Type":"ContainerDied","Data":"f3d94e7cc003737a535634f8c50d292e08fac21558a337898b2bf03457f60c43"} Apr 17 17:40:40.710978 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:40.710939 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz" event={"ID":"3afd1417-92d5-4c5d-bcbe-14f74b58a85b","Type":"ContainerStarted","Data":"9bccf27e5a539bd559bfbe8746b15254b6330f3c2a0d70523f5e959cefe84867"} Apr 17 17:40:40.711403 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:40.711135 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz" Apr 17 17:40:51.728696 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:51.728659 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz" Apr 17 17:40:51.750091 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:40:51.750033 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-9krmz" podStartSLOduration=13.410985235 podStartE2EDuration="33.750013343s" podCreationTimestamp="2026-04-17 17:40:18 +0000 UTC" 
firstStartedPulling="2026-04-17 17:40:19.341015991 +0000 UTC m=+837.062958967" lastFinishedPulling="2026-04-17 17:40:39.680044097 +0000 UTC m=+857.401987075" observedRunningTime="2026-04-17 17:40:40.73124894 +0000 UTC m=+858.453191939" watchObservedRunningTime="2026-04-17 17:40:51.750013343 +0000 UTC m=+869.471956342" Apr 17 17:41:03.379622 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:03.379581 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v"] Apr 17 17:41:03.436446 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:03.436397 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v"] Apr 17 17:41:03.436638 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:03.436601 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v" Apr 17 17:41:03.438958 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:03.438933 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 17 17:41:03.520626 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:03.520600 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d9e033e0-f02f-4990-80ec-808fd1334d64-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jtv7v\" (UID: \"d9e033e0-f02f-4990-80ec-808fd1334d64\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v" Apr 17 17:41:03.520749 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:03.520638 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d9e033e0-f02f-4990-80ec-808fd1334d64-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jtv7v\" (UID: \"d9e033e0-f02f-4990-80ec-808fd1334d64\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v" Apr 17 17:41:03.520749 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:03.520659 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d9e033e0-f02f-4990-80ec-808fd1334d64-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jtv7v\" (UID: \"d9e033e0-f02f-4990-80ec-808fd1334d64\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v" Apr 17 17:41:03.520856 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:03.520759 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brxqc\" (UniqueName: \"kubernetes.io/projected/d9e033e0-f02f-4990-80ec-808fd1334d64-kube-api-access-brxqc\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jtv7v\" (UID: \"d9e033e0-f02f-4990-80ec-808fd1334d64\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v" Apr 17 17:41:03.520902 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:03.520856 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d9e033e0-f02f-4990-80ec-808fd1334d64-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jtv7v\" (UID: \"d9e033e0-f02f-4990-80ec-808fd1334d64\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v" Apr 17 17:41:03.520902 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:03.520888 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9e033e0-f02f-4990-80ec-808fd1334d64-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jtv7v\" (UID: \"d9e033e0-f02f-4990-80ec-808fd1334d64\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v" Apr 17 17:41:03.622354 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:03.622313 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d9e033e0-f02f-4990-80ec-808fd1334d64-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jtv7v\" (UID: \"d9e033e0-f02f-4990-80ec-808fd1334d64\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v" Apr 17 17:41:03.622354 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:03.622355 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d9e033e0-f02f-4990-80ec-808fd1334d64-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jtv7v\" (UID: \"d9e033e0-f02f-4990-80ec-808fd1334d64\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v" Apr 17 17:41:03.622654 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:03.622373 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d9e033e0-f02f-4990-80ec-808fd1334d64-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jtv7v\" (UID: \"d9e033e0-f02f-4990-80ec-808fd1334d64\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v" Apr 17 17:41:03.622654 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:03.622398 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brxqc\" (UniqueName: \"kubernetes.io/projected/d9e033e0-f02f-4990-80ec-808fd1334d64-kube-api-access-brxqc\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jtv7v\" (UID: \"d9e033e0-f02f-4990-80ec-808fd1334d64\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v" Apr 17 17:41:03.622654 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:03.622453 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d9e033e0-f02f-4990-80ec-808fd1334d64-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jtv7v\" (UID: \"d9e033e0-f02f-4990-80ec-808fd1334d64\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v" Apr 17 17:41:03.622654 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:03.622503 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9e033e0-f02f-4990-80ec-808fd1334d64-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jtv7v\" (UID: \"d9e033e0-f02f-4990-80ec-808fd1334d64\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v" Apr 17 17:41:03.622881 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:03.622822 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d9e033e0-f02f-4990-80ec-808fd1334d64-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jtv7v\" (UID: \"d9e033e0-f02f-4990-80ec-808fd1334d64\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v" Apr 17 17:41:03.622933 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:03.622893 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/d9e033e0-f02f-4990-80ec-808fd1334d64-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jtv7v\" (UID: \"d9e033e0-f02f-4990-80ec-808fd1334d64\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v" Apr 17 17:41:03.622933 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:03.622917 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9e033e0-f02f-4990-80ec-808fd1334d64-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jtv7v\" (UID: \"d9e033e0-f02f-4990-80ec-808fd1334d64\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v" Apr 17 17:41:03.624729 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:03.624708 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d9e033e0-f02f-4990-80ec-808fd1334d64-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jtv7v\" (UID: \"d9e033e0-f02f-4990-80ec-808fd1334d64\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v" Apr 17 17:41:03.625022 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:03.625004 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d9e033e0-f02f-4990-80ec-808fd1334d64-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jtv7v\" (UID: \"d9e033e0-f02f-4990-80ec-808fd1334d64\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v" Apr 17 17:41:03.633577 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:03.632091 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brxqc\" (UniqueName: \"kubernetes.io/projected/d9e033e0-f02f-4990-80ec-808fd1334d64-kube-api-access-brxqc\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-jtv7v\" (UID: \"d9e033e0-f02f-4990-80ec-808fd1334d64\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v" Apr 17 17:41:03.775155 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:03.775107 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v" Apr 17 17:41:03.917684 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:03.915860 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v"] Apr 17 17:41:03.921023 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:41:03.920991 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9e033e0_f02f_4990_80ec_808fd1334d64.slice/crio-e49652d87724ca9b113b5c0d7f26afb100e2463f5001193721acefeeefc96d2a WatchSource:0}: Error finding container e49652d87724ca9b113b5c0d7f26afb100e2463f5001193721acefeeefc96d2a: Status 404 returned error can't find the container with id e49652d87724ca9b113b5c0d7f26afb100e2463f5001193721acefeeefc96d2a Apr 17 17:41:04.798079 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:04.798034 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v" event={"ID":"d9e033e0-f02f-4990-80ec-808fd1334d64","Type":"ContainerStarted","Data":"93d12a8f714ef97f51548d79aaaa75286d5d2466714a5c251138a1a249bb2b85"} Apr 17 17:41:04.798079 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:04.798079 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v" event={"ID":"d9e033e0-f02f-4990-80ec-808fd1334d64","Type":"ContainerStarted","Data":"e49652d87724ca9b113b5c0d7f26afb100e2463f5001193721acefeeefc96d2a"} Apr 17 17:41:09.816333 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:09.816297 2577 generic.go:358] "Generic (PLEG): container finished" podID="d9e033e0-f02f-4990-80ec-808fd1334d64" containerID="93d12a8f714ef97f51548d79aaaa75286d5d2466714a5c251138a1a249bb2b85" exitCode=0 Apr 17 17:41:09.816734 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:09.816340 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v" event={"ID":"d9e033e0-f02f-4990-80ec-808fd1334d64","Type":"ContainerDied","Data":"93d12a8f714ef97f51548d79aaaa75286d5d2466714a5c251138a1a249bb2b85"} Apr 17 17:41:10.821189 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:10.821149 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v" event={"ID":"d9e033e0-f02f-4990-80ec-808fd1334d64","Type":"ContainerStarted","Data":"c2a74912d88d2126a8e5dd653d6527708fbab2fa68d2fb613eab1edf1e65965a"} Apr 17 17:41:10.821633 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:10.821375 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v" Apr 17 17:41:10.839909 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:10.839852 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v" podStartSLOduration=7.6386056 podStartE2EDuration="7.839835086s" podCreationTimestamp="2026-04-17 17:41:03 +0000 UTC" firstStartedPulling="2026-04-17 17:41:09.816968347 +0000 UTC m=+887.538911322" lastFinishedPulling="2026-04-17 17:41:10.018197832 +0000 UTC m=+887.740140808" observedRunningTime="2026-04-17 17:41:10.838645561 +0000 UTC m=+888.560588561" watchObservedRunningTime="2026-04-17 17:41:10.839835086 +0000 UTC m=+888.561778083" Apr 17 17:41:16.689268 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:16.689226 2577 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd"] Apr 17 17:41:16.696504 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:16.696432 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd"] Apr 17 17:41:16.696661 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:16.696589 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd" Apr 17 17:41:16.698990 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:16.698964 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 17 17:41:16.743443 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:16.743401 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dd0302b7-a80c-445c-ab56-c6911dbbd482-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd\" (UID: \"dd0302b7-a80c-445c-ab56-c6911dbbd482\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd" Apr 17 17:41:16.743688 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:16.743461 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dd0302b7-a80c-445c-ab56-c6911dbbd482-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd\" (UID: \"dd0302b7-a80c-445c-ab56-c6911dbbd482\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd" Apr 17 17:41:16.743688 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:16.743556 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dd0302b7-a80c-445c-ab56-c6911dbbd482-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd\" (UID: \"dd0302b7-a80c-445c-ab56-c6911dbbd482\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd" Apr 17 17:41:16.743688 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:16.743588 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dd0302b7-a80c-445c-ab56-c6911dbbd482-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd\" (UID: \"dd0302b7-a80c-445c-ab56-c6911dbbd482\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd" Apr 17 17:41:16.743688 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:16.743611 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd0302b7-a80c-445c-ab56-c6911dbbd482-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd\" (UID: \"dd0302b7-a80c-445c-ab56-c6911dbbd482\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd" Apr 17 17:41:16.743852 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:16.743723 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqwwq\" (UniqueName: \"kubernetes.io/projected/dd0302b7-a80c-445c-ab56-c6911dbbd482-kube-api-access-tqwwq\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd\" (UID: 
\"dd0302b7-a80c-445c-ab56-c6911dbbd482\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd" Apr 17 17:41:16.844353 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:16.844316 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dd0302b7-a80c-445c-ab56-c6911dbbd482-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd\" (UID: \"dd0302b7-a80c-445c-ab56-c6911dbbd482\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd" Apr 17 17:41:16.844560 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:16.844370 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dd0302b7-a80c-445c-ab56-c6911dbbd482-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd\" (UID: \"dd0302b7-a80c-445c-ab56-c6911dbbd482\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd" Apr 17 17:41:16.844560 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:16.844433 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dd0302b7-a80c-445c-ab56-c6911dbbd482-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd\" (UID: \"dd0302b7-a80c-445c-ab56-c6911dbbd482\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd" Apr 17 17:41:16.844560 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:16.844460 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dd0302b7-a80c-445c-ab56-c6911dbbd482-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd\" (UID: \"dd0302b7-a80c-445c-ab56-c6911dbbd482\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd" Apr 17 17:41:16.844560 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:16.844501 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd0302b7-a80c-445c-ab56-c6911dbbd482-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd\" (UID: \"dd0302b7-a80c-445c-ab56-c6911dbbd482\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd" Apr 17 17:41:16.844764 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:16.844580 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqwwq\" (UniqueName: \"kubernetes.io/projected/dd0302b7-a80c-445c-ab56-c6911dbbd482-kube-api-access-tqwwq\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd\" (UID: \"dd0302b7-a80c-445c-ab56-c6911dbbd482\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd" Apr 17 17:41:16.844963 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:16.844929 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dd0302b7-a80c-445c-ab56-c6911dbbd482-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd\" (UID: \"dd0302b7-a80c-445c-ab56-c6911dbbd482\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd" Apr 17 17:41:16.845036 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:16.844956 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/dd0302b7-a80c-445c-ab56-c6911dbbd482-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd\" (UID: \"dd0302b7-a80c-445c-ab56-c6911dbbd482\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd" Apr 17 17:41:16.845036 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:16.844998 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dd0302b7-a80c-445c-ab56-c6911dbbd482-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd\" (UID: \"dd0302b7-a80c-445c-ab56-c6911dbbd482\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd" Apr 17 17:41:16.846794 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:16.846771 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dd0302b7-a80c-445c-ab56-c6911dbbd482-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd\" (UID: \"dd0302b7-a80c-445c-ab56-c6911dbbd482\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd" Apr 17 17:41:16.847008 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:16.846988 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dd0302b7-a80c-445c-ab56-c6911dbbd482-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd\" (UID: \"dd0302b7-a80c-445c-ab56-c6911dbbd482\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd" Apr 17 17:41:16.859405 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:16.859371 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqwwq\" (UniqueName: \"kubernetes.io/projected/dd0302b7-a80c-445c-ab56-c6911dbbd482-kube-api-access-tqwwq\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd\" (UID: \"dd0302b7-a80c-445c-ab56-c6911dbbd482\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd" Apr 17 17:41:17.011188 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:17.011102 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd" Apr 17 17:41:17.161708 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:17.161648 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd"] Apr 17 17:41:17.167366 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:41:17.167338 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd0302b7_a80c_445c_ab56_c6911dbbd482.slice/crio-b30d32e4b90a95e6ba945a958c4b4c3a240780f149e134fe97cfd30fde1de299 WatchSource:0}: Error finding container b30d32e4b90a95e6ba945a958c4b4c3a240780f149e134fe97cfd30fde1de299: Status 404 returned error can't find the container with id b30d32e4b90a95e6ba945a958c4b4c3a240780f149e134fe97cfd30fde1de299 Apr 17 17:41:17.849270 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:17.849232 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd" event={"ID":"dd0302b7-a80c-445c-ab56-c6911dbbd482","Type":"ContainerStarted","Data":"1154e312331b9181bfa6b59e739bfc3e820b2fdd52ec702487b13bace6d1df62"} Apr 17 17:41:17.849270 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:17.849273 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd" event={"ID":"dd0302b7-a80c-445c-ab56-c6911dbbd482","Type":"ContainerStarted","Data":"b30d32e4b90a95e6ba945a958c4b4c3a240780f149e134fe97cfd30fde1de299"} Apr 17 17:41:18.083251 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:18.083211 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs"] Apr 17 17:41:18.087341 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:18.087314 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs" Apr 17 17:41:18.089845 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:18.089821 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 17 17:41:18.097533 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:18.097508 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs"] Apr 17 17:41:18.156778 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:18.156687 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b7152386-31c0-473b-bbe4-df1a44e96efc-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs\" (UID: \"b7152386-31c0-473b-bbe4-df1a44e96efc\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs" Apr 17 17:41:18.157101 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:18.157078 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b7152386-31c0-473b-bbe4-df1a44e96efc-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs\" (UID: \"b7152386-31c0-473b-bbe4-df1a44e96efc\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs" Apr 17 17:41:18.157265 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:18.157236 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mxxw\" (UniqueName: \"kubernetes.io/projected/b7152386-31c0-473b-bbe4-df1a44e96efc-kube-api-access-9mxxw\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs\" (UID: \"b7152386-31c0-473b-bbe4-df1a44e96efc\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs" Apr 17 17:41:18.157387 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:18.157338 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b7152386-31c0-473b-bbe4-df1a44e96efc-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs\" (UID: \"b7152386-31c0-473b-bbe4-df1a44e96efc\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs" Apr 17 17:41:18.157451 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:18.157398 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b7152386-31c0-473b-bbe4-df1a44e96efc-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs\" (UID: \"b7152386-31c0-473b-bbe4-df1a44e96efc\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs" Apr 17 17:41:18.157543 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:18.157519 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b7152386-31c0-473b-bbe4-df1a44e96efc-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs\" (UID: \"b7152386-31c0-473b-bbe4-df1a44e96efc\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs" Apr 17 17:41:18.258885 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:18.258831 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mxxw\" (UniqueName: 
\"kubernetes.io/projected/b7152386-31c0-473b-bbe4-df1a44e96efc-kube-api-access-9mxxw\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs\" (UID: \"b7152386-31c0-473b-bbe4-df1a44e96efc\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs" Apr 17 17:41:18.259177 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:18.259159 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b7152386-31c0-473b-bbe4-df1a44e96efc-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs\" (UID: \"b7152386-31c0-473b-bbe4-df1a44e96efc\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs" Apr 17 17:41:18.259825 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:18.259800 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b7152386-31c0-473b-bbe4-df1a44e96efc-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs\" (UID: \"b7152386-31c0-473b-bbe4-df1a44e96efc\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs" Apr 17 17:41:18.260030 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:18.259872 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b7152386-31c0-473b-bbe4-df1a44e96efc-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs\" (UID: \"b7152386-31c0-473b-bbe4-df1a44e96efc\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs" Apr 17 17:41:18.260148 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:18.260073 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b7152386-31c0-473b-bbe4-df1a44e96efc-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs\" (UID: \"b7152386-31c0-473b-bbe4-df1a44e96efc\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs" Apr 17 17:41:18.260216 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:18.260188 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b7152386-31c0-473b-bbe4-df1a44e96efc-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs\" (UID: \"b7152386-31c0-473b-bbe4-df1a44e96efc\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs" Apr 17 17:41:18.260268 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:18.260214 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b7152386-31c0-473b-bbe4-df1a44e96efc-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs\" (UID: \"b7152386-31c0-473b-bbe4-df1a44e96efc\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs" Apr 17 17:41:18.260268 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:18.260226 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b7152386-31c0-473b-bbe4-df1a44e96efc-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs\" (UID: \"b7152386-31c0-473b-bbe4-df1a44e96efc\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs" Apr 17 17:41:18.260603 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:18.260577 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/b7152386-31c0-473b-bbe4-df1a44e96efc-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs\" (UID: \"b7152386-31c0-473b-bbe4-df1a44e96efc\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs" Apr 17 17:41:18.263116 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:18.263091 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b7152386-31c0-473b-bbe4-df1a44e96efc-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs\" (UID: \"b7152386-31c0-473b-bbe4-df1a44e96efc\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs" Apr 17 17:41:18.263495 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:18.263451 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b7152386-31c0-473b-bbe4-df1a44e96efc-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs\" (UID: \"b7152386-31c0-473b-bbe4-df1a44e96efc\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs" Apr 17 17:41:18.266520 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:18.266493 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mxxw\" (UniqueName: \"kubernetes.io/projected/b7152386-31c0-473b-bbe4-df1a44e96efc-kube-api-access-9mxxw\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs\" (UID: \"b7152386-31c0-473b-bbe4-df1a44e96efc\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs" Apr 17 17:41:18.401631 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:18.401581 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs" Apr 17 17:41:18.574849 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:18.574810 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs"] Apr 17 17:41:18.578892 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:41:18.578847 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7152386_31c0_473b_bbe4_df1a44e96efc.slice/crio-44512990f72f1c860289c96ba1bae63f81caf70d49ae4cc3c10240318b848289 WatchSource:0}: Error finding container 44512990f72f1c860289c96ba1bae63f81caf70d49ae4cc3c10240318b848289: Status 404 returned error can't find the container with id 44512990f72f1c860289c96ba1bae63f81caf70d49ae4cc3c10240318b848289 Apr 17 17:41:18.857272 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:18.857226 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs" event={"ID":"b7152386-31c0-473b-bbe4-df1a44e96efc","Type":"ContainerStarted","Data":"24303ef5e383b612146f306c9f26cc698a1b0477b07277d592062d6c2849efff"} Apr 17 17:41:18.857272 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:18.857271 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs" event={"ID":"b7152386-31c0-473b-bbe4-df1a44e96efc","Type":"ContainerStarted","Data":"44512990f72f1c860289c96ba1bae63f81caf70d49ae4cc3c10240318b848289"} Apr 17 17:41:21.843955 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:21.843915 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-jtv7v" Apr 17 17:41:22.796165 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:22.796136 2577 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rjgx_49a176d5-a780-4a38-b16f-90dc62742d5d/ovn-acl-logging/0.log" Apr 17 17:41:22.796834 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:22.796810 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rjgx_49a176d5-a780-4a38-b16f-90dc62742d5d/ovn-acl-logging/0.log" Apr 17 17:41:23.881350 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:23.881320 2577 generic.go:358] "Generic (PLEG): container finished" podID="dd0302b7-a80c-445c-ab56-c6911dbbd482" containerID="1154e312331b9181bfa6b59e739bfc3e820b2fdd52ec702487b13bace6d1df62" exitCode=0 Apr 17 17:41:23.881769 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:23.881373 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd" event={"ID":"dd0302b7-a80c-445c-ab56-c6911dbbd482","Type":"ContainerDied","Data":"1154e312331b9181bfa6b59e739bfc3e820b2fdd52ec702487b13bace6d1df62"} Apr 17 17:41:24.886701 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:24.886608 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd" event={"ID":"dd0302b7-a80c-445c-ab56-c6911dbbd482","Type":"ContainerStarted","Data":"e206baf2ba2b6d12bc86524e690e172ab22eea49a0be9ce6c1064b98f0c4c520"} Apr 17 17:41:24.887135 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:24.886884 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd" Apr 17 17:41:24.888019 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:24.887995 2577 generic.go:358] "Generic (PLEG): container finished" podID="b7152386-31c0-473b-bbe4-df1a44e96efc" containerID="24303ef5e383b612146f306c9f26cc698a1b0477b07277d592062d6c2849efff" exitCode=0 Apr 17 17:41:24.888127 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:24.888066 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs" event={"ID":"b7152386-31c0-473b-bbe4-df1a44e96efc","Type":"ContainerDied","Data":"24303ef5e383b612146f306c9f26cc698a1b0477b07277d592062d6c2849efff"} Apr 17 17:41:24.906188 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:24.906106 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd" podStartSLOduration=8.689683319 podStartE2EDuration="8.906087165s" podCreationTimestamp="2026-04-17 17:41:16 +0000 UTC" firstStartedPulling="2026-04-17 17:41:23.882003934 +0000 UTC m=+901.603946911" lastFinishedPulling="2026-04-17 17:41:24.098407782 +0000 UTC m=+901.820350757" observedRunningTime="2026-04-17 17:41:24.905083931 +0000 UTC m=+902.627026928" watchObservedRunningTime="2026-04-17 17:41:24.906087165 +0000 UTC m=+902.628030164" Apr 17 17:41:25.892829 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:25.892794 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs" event={"ID":"b7152386-31c0-473b-bbe4-df1a44e96efc","Type":"ContainerStarted","Data":"953b74cfb3b6f0f39e0712083f8d84c7c7a00533c55c224e679f6c1d8c90df6d"} Apr 17 17:41:25.914283 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:25.914219 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs" podStartSLOduration=7.646223736 podStartE2EDuration="7.914202099s" 
podCreationTimestamp="2026-04-17 17:41:18 +0000 UTC" firstStartedPulling="2026-04-17 17:41:24.888646014 +0000 UTC m=+902.610588990" lastFinishedPulling="2026-04-17 17:41:25.156624374 +0000 UTC m=+902.878567353" observedRunningTime="2026-04-17 17:41:25.912312248 +0000 UTC m=+903.634255245" watchObservedRunningTime="2026-04-17 17:41:25.914202099 +0000 UTC m=+903.636145097" Apr 17 17:41:35.893940 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:35.893896 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs" Apr 17 17:41:35.906260 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:35.906231 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs" Apr 17 17:41:35.907164 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:35.907143 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd" Apr 17 17:41:45.124188 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:45.124155 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267"] Apr 17 17:41:45.127622 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:45.127599 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267" Apr 17 17:41:45.130838 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:45.130817 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 17 17:41:45.143113 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:45.142182 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267"] Apr 17 17:41:45.318072 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:45.318028 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c45076f0-8bb3-4912-b25d-89ecfb10e0ee-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sh267\" (UID: \"c45076f0-8bb3-4912-b25d-89ecfb10e0ee\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267" Apr 17 17:41:45.318263 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:45.318111 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c45076f0-8bb3-4912-b25d-89ecfb10e0ee-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sh267\" (UID: \"c45076f0-8bb3-4912-b25d-89ecfb10e0ee\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267" Apr 17 17:41:45.318263 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:45.318147 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c45076f0-8bb3-4912-b25d-89ecfb10e0ee-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sh267\" (UID: \"c45076f0-8bb3-4912-b25d-89ecfb10e0ee\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267" Apr 17 17:41:45.318263 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:45.318175 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c45076f0-8bb3-4912-b25d-89ecfb10e0ee-dshm\") pod 
\"e2e-distinct-2-simulated-kserve-8454f99c75-sh267\" (UID: \"c45076f0-8bb3-4912-b25d-89ecfb10e0ee\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267" Apr 17 17:41:45.318263 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:45.318192 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4sjd\" (UniqueName: \"kubernetes.io/projected/c45076f0-8bb3-4912-b25d-89ecfb10e0ee-kube-api-access-t4sjd\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sh267\" (UID: \"c45076f0-8bb3-4912-b25d-89ecfb10e0ee\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267" Apr 17 17:41:45.318263 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:45.318212 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c45076f0-8bb3-4912-b25d-89ecfb10e0ee-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sh267\" (UID: \"c45076f0-8bb3-4912-b25d-89ecfb10e0ee\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267" Apr 17 17:41:45.419687 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:45.419594 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c45076f0-8bb3-4912-b25d-89ecfb10e0ee-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sh267\" (UID: \"c45076f0-8bb3-4912-b25d-89ecfb10e0ee\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267" Apr 17 17:41:45.419687 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:45.419643 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c45076f0-8bb3-4912-b25d-89ecfb10e0ee-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sh267\" (UID: \"c45076f0-8bb3-4912-b25d-89ecfb10e0ee\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267" Apr 17 17:41:45.419687 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:45.419666 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4sjd\" (UniqueName: \"kubernetes.io/projected/c45076f0-8bb3-4912-b25d-89ecfb10e0ee-kube-api-access-t4sjd\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sh267\" (UID: \"c45076f0-8bb3-4912-b25d-89ecfb10e0ee\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267" Apr 17 17:41:45.419916 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:45.419698 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c45076f0-8bb3-4912-b25d-89ecfb10e0ee-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sh267\" (UID: \"c45076f0-8bb3-4912-b25d-89ecfb10e0ee\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267" Apr 17 17:41:45.419916 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:45.419746 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c45076f0-8bb3-4912-b25d-89ecfb10e0ee-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sh267\" (UID: \"c45076f0-8bb3-4912-b25d-89ecfb10e0ee\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267" Apr 17 17:41:45.420010 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:45.419900 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c45076f0-8bb3-4912-b25d-89ecfb10e0ee-model-cache\") pod 
\"e2e-distinct-2-simulated-kserve-8454f99c75-sh267\" (UID: \"c45076f0-8bb3-4912-b25d-89ecfb10e0ee\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267" Apr 17 17:41:45.420235 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:45.420213 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c45076f0-8bb3-4912-b25d-89ecfb10e0ee-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sh267\" (UID: \"c45076f0-8bb3-4912-b25d-89ecfb10e0ee\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267" Apr 17 17:41:45.420327 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:45.420268 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c45076f0-8bb3-4912-b25d-89ecfb10e0ee-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sh267\" (UID: \"c45076f0-8bb3-4912-b25d-89ecfb10e0ee\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267" Apr 17 17:41:45.420327 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:45.420280 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c45076f0-8bb3-4912-b25d-89ecfb10e0ee-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sh267\" (UID: \"c45076f0-8bb3-4912-b25d-89ecfb10e0ee\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267" Apr 17 17:41:45.421939 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:45.421919 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c45076f0-8bb3-4912-b25d-89ecfb10e0ee-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sh267\" (UID: \"c45076f0-8bb3-4912-b25d-89ecfb10e0ee\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267" Apr 17 17:41:45.422194 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:45.422174 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c45076f0-8bb3-4912-b25d-89ecfb10e0ee-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sh267\" (UID: \"c45076f0-8bb3-4912-b25d-89ecfb10e0ee\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267" Apr 17 17:41:45.430176 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:45.430144 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4sjd\" (UniqueName: \"kubernetes.io/projected/c45076f0-8bb3-4912-b25d-89ecfb10e0ee-kube-api-access-t4sjd\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-sh267\" (UID: \"c45076f0-8bb3-4912-b25d-89ecfb10e0ee\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267" Apr 17 17:41:45.446949 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:45.446912 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267" Apr 17 17:41:45.584931 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:45.584885 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267"] Apr 17 17:41:45.587417 ip-10-0-139-96 kubenswrapper[2577]: W0417 17:41:45.587382 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc45076f0_8bb3_4912_b25d_89ecfb10e0ee.slice/crio-38e64e62ce1947ba1b07e34d96b5a98dbfa43fb66dcabecea9772c7d18c82e53 WatchSource:0}: Error finding container 38e64e62ce1947ba1b07e34d96b5a98dbfa43fb66dcabecea9772c7d18c82e53: Status 404 returned error can't find the container with id 38e64e62ce1947ba1b07e34d96b5a98dbfa43fb66dcabecea9772c7d18c82e53 Apr 17 17:41:45.972941 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:45.972895 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267" event={"ID":"c45076f0-8bb3-4912-b25d-89ecfb10e0ee","Type":"ContainerStarted","Data":"1efa44632543f4ffd686e52549977c526bc233f0e31f28945f2640fdedab8c5b"} Apr 17 17:41:45.972941 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:45.972946 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267" event={"ID":"c45076f0-8bb3-4912-b25d-89ecfb10e0ee","Type":"ContainerStarted","Data":"38e64e62ce1947ba1b07e34d96b5a98dbfa43fb66dcabecea9772c7d18c82e53"} Apr 17 17:41:51.996310 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:51.996275 2577 generic.go:358] "Generic (PLEG): container finished" podID="c45076f0-8bb3-4912-b25d-89ecfb10e0ee" containerID="1efa44632543f4ffd686e52549977c526bc233f0e31f28945f2640fdedab8c5b" exitCode=0 Apr 17 17:41:51.996765 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:51.996355 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267" event={"ID":"c45076f0-8bb3-4912-b25d-89ecfb10e0ee","Type":"ContainerDied","Data":"1efa44632543f4ffd686e52549977c526bc233f0e31f28945f2640fdedab8c5b"} Apr 17 17:41:53.003942 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:53.003904 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267" event={"ID":"c45076f0-8bb3-4912-b25d-89ecfb10e0ee","Type":"ContainerStarted","Data":"97a0577c3eb5048d85401533bcae663f8d8ace3fb84b9fb85f35d5dbe01c1bf7"} Apr 17 17:41:53.004333 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:53.004123 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267" Apr 17 17:41:53.026869 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:41:53.026809 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267" podStartSLOduration=7.825816161 podStartE2EDuration="8.026787276s" podCreationTimestamp="2026-04-17 17:41:45 +0000 UTC" firstStartedPulling="2026-04-17 17:41:51.997096912 +0000 UTC m=+929.719039888" lastFinishedPulling="2026-04-17 17:41:52.198068024 +0000 UTC m=+929.920011003" observedRunningTime="2026-04-17 17:41:53.02421229 +0000 UTC m=+930.746155288" watchObservedRunningTime="2026-04-17 17:41:53.026787276 +0000 UTC m=+930.748730278" Apr 17 17:42:04.022854 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:42:04.022822 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-sh267" Apr 17 17:46:22.823840 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:46:22.823805 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rjgx_49a176d5-a780-4a38-b16f-90dc62742d5d/ovn-acl-logging/0.log" Apr 17 17:46:22.832686 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:46:22.832658 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rjgx_49a176d5-a780-4a38-b16f-90dc62742d5d/ovn-acl-logging/0.log" Apr 17 17:51:22.860123 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:51:22.860087 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rjgx_49a176d5-a780-4a38-b16f-90dc62742d5d/ovn-acl-logging/0.log" Apr 17 17:51:22.864311 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:51:22.864281 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rjgx_49a176d5-a780-4a38-b16f-90dc62742d5d/ovn-acl-logging/0.log" Apr 17 17:56:22.883703 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:56:22.883676 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rjgx_49a176d5-a780-4a38-b16f-90dc62742d5d/ovn-acl-logging/0.log" Apr 17 17:56:22.889382 ip-10-0-139-96 kubenswrapper[2577]: I0417 17:56:22.889358 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rjgx_49a176d5-a780-4a38-b16f-90dc62742d5d/ovn-acl-logging/0.log" Apr 17 18:01:22.908090 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:01:22.908051 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rjgx_49a176d5-a780-4a38-b16f-90dc62742d5d/ovn-acl-logging/0.log" Apr 17 18:01:22.914865 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:01:22.914834 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rjgx_49a176d5-a780-4a38-b16f-90dc62742d5d/ovn-acl-logging/0.log" Apr 17 18:02:20.832226 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:20.832133 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-dgnmg_b76aad08-c177-4c20-a889-5f4a1a011654/manager/0.log" Apr 17 18:02:20.959912 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:20.959884 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-5987f9d78-75gk6_a39964b6-f1a0-4f19-ba7e-102c6db18d0d/maas-api/0.log" Apr 17 18:02:21.078215 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:21.078178 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-d6bc59649-92ck8_2182b691-66b0-45db-9286-8fad7c871a7a/manager/0.log" Apr 17 18:02:21.198482 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:21.198433 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-dx9jc_2fc59a50-5d8d-473f-ac53-ca09910df3f7/manager/2.log" Apr 17 18:02:21.444339 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:21.444308 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-77fb85d776-fqdsg_9d5cd8cf-986d-4573-bd01-25c6a1932fd2/manager/0.log" Apr 17 18:02:23.140111 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:23.140080 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-bwhzb_e3cecd69-f99b-4b43-98d1-d0c4e9373192/manager/0.log" Apr 17 
18:02:23.609684 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:23.609648 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-zf8c6_810ca993-fb2d-427d-9e0a-dcb6508ff042/manager/0.log" Apr 17 18:02:24.347767 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:24.347731 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-26jpl_123dd202-8527-42e2-84da-fd93872dfcb8/discovery/0.log" Apr 17 18:02:24.575172 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:24.575143 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-598f578945-tdhf2_b61e46df-1ee4-4a82-bc7d-f1596bd447d5/kube-auth-proxy/0.log" Apr 17 18:02:24.835311 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:24.835272 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7dfb49b74-jlm7c_7ec34672-cf7b-48a2-a580-01d0d51e08b1/router/0.log" Apr 17 18:02:25.181635 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:25.181542 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-sh267_c45076f0-8bb3-4912-b25d-89ecfb10e0ee/storage-initializer/0.log" Apr 17 18:02:25.192088 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:25.192060 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-sh267_c45076f0-8bb3-4912-b25d-89ecfb10e0ee/main/0.log" Apr 17 18:02:25.317358 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:25.317323 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-jtv7v_d9e033e0-f02f-4990-80ec-808fd1334d64/storage-initializer/0.log" Apr 17 18:02:25.328691 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:25.328662 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-jtv7v_d9e033e0-f02f-4990-80ec-808fd1334d64/main/0.log" Apr 17 18:02:25.446218 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:25.446190 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd_dd0302b7-a80c-445c-ab56-c6911dbbd482/storage-initializer/0.log" Apr 17 18:02:25.454249 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:25.454206 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccskjsd_dd0302b7-a80c-445c-ab56-c6911dbbd482/main/0.log" Apr 17 18:02:25.571770 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:25.571741 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs_b7152386-31c0-473b-bbe4-df1a44e96efc/storage-initializer/0.log" Apr 17 18:02:25.580106 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:25.580084 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-8pxfs_b7152386-31c0-473b-bbe4-df1a44e96efc/main/0.log" Apr 17 18:02:25.695912 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:25.695887 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-9krmz_3afd1417-92d5-4c5d-bcbe-14f74b58a85b/storage-initializer/0.log" Apr 17 18:02:25.710586 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:25.710524 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-9krmz_3afd1417-92d5-4c5d-bcbe-14f74b58a85b/main/0.log" Apr 17 18:02:32.480133 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:32.480100 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-6mktk_74142d91-eb23-411d-8c68-16c329d30680/global-pull-secret-syncer/0.log" Apr 17 18:02:32.625909 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:32.625877 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-sn554_51db2efc-3b63-4a21-bb04-99caab75c450/konnectivity-agent/0.log" Apr 17 18:02:32.740040 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:32.739957 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-96.ec2.internal_581fa9c88cb33b3a66c7bdd6f4dd1862/haproxy/0.log" Apr 17 18:02:36.681697 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:36.681584 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-bwhzb_e3cecd69-f99b-4b43-98d1-d0c4e9373192/manager/0.log" Apr 17 18:02:36.897241 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:36.897203 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-zf8c6_810ca993-fb2d-427d-9e0a-dcb6508ff042/manager/0.log" Apr 17 18:02:38.527901 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:38.527868 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-kx9nb_c814f0cc-083d-42f2-87fb-6ac3ce3ab5bf/cluster-monitoring-operator/0.log" Apr 17 18:02:38.670216 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:38.670184 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-pdbfm_dc8d4458-96ea-4eb9-9628-355967102e97/monitoring-plugin/0.log" Apr 17 18:02:38.838561 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:38.838533 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k4jf6_44ce145d-5623-4047-927e-65d3af3448da/node-exporter/0.log" Apr 17 18:02:38.890837 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:38.890810 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k4jf6_44ce145d-5623-4047-927e-65d3af3448da/kube-rbac-proxy/0.log" Apr 17 18:02:38.931795 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:38.931764 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k4jf6_44ce145d-5623-4047-927e-65d3af3448da/init-textfile/0.log" Apr 17 18:02:39.105386 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:39.105302 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-5m7hn_7da85e0c-7193-4788-a9ae-9bc72db222ca/kube-rbac-proxy-main/0.log" Apr 17 18:02:39.172563 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:39.172533 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-5m7hn_7da85e0c-7193-4788-a9ae-9bc72db222ca/kube-rbac-proxy-self/0.log" Apr 17 18:02:39.237731 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:39.237701 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-5m7hn_7da85e0c-7193-4788-a9ae-9bc72db222ca/openshift-state-metrics/0.log" Apr 17 18:02:39.313734 ip-10-0-139-96 
kubenswrapper[2577]: I0417 18:02:39.313707 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_12c31a5c-562a-4116-a291-b8c4a68e7208/prometheus/0.log" Apr 17 18:02:39.342885 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:39.342860 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_12c31a5c-562a-4116-a291-b8c4a68e7208/config-reloader/0.log" Apr 17 18:02:39.383582 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:39.383492 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_12c31a5c-562a-4116-a291-b8c4a68e7208/thanos-sidecar/0.log" Apr 17 18:02:39.416690 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:39.416651 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_12c31a5c-562a-4116-a291-b8c4a68e7208/kube-rbac-proxy-web/0.log" Apr 17 18:02:39.452213 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:39.452173 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_12c31a5c-562a-4116-a291-b8c4a68e7208/kube-rbac-proxy/0.log" Apr 17 18:02:39.485674 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:39.485641 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_12c31a5c-562a-4116-a291-b8c4a68e7208/kube-rbac-proxy-thanos/0.log" Apr 17 18:02:39.507765 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:39.507737 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_12c31a5c-562a-4116-a291-b8c4a68e7208/init-config-reloader/0.log" Apr 17 18:02:41.239675 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:41.239639 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8rk54/perf-node-gather-daemonset-dtfcv"] Apr 17 18:02:41.243424 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:41.243398 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-dtfcv" Apr 17 18:02:41.245927 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:41.245898 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8rk54\"/\"openshift-service-ca.crt\"" Apr 17 18:02:41.246044 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:41.245973 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8rk54\"/\"kube-root-ca.crt\"" Apr 17 18:02:41.246562 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:41.246545 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-8rk54\"/\"default-dockercfg-tkwzj\"" Apr 17 18:02:41.251254 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:41.251234 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8rk54/perf-node-gather-daemonset-dtfcv"] Apr 17 18:02:41.327532 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:41.327463 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7eb80b67-9498-49e1-880a-bb09dffa2b9b-sys\") pod \"perf-node-gather-daemonset-dtfcv\" (UID: \"7eb80b67-9498-49e1-880a-bb09dffa2b9b\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-dtfcv" Apr 17 18:02:41.327532 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:41.327528 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7eb80b67-9498-49e1-880a-bb09dffa2b9b-proc\") pod \"perf-node-gather-daemonset-dtfcv\" (UID: \"7eb80b67-9498-49e1-880a-bb09dffa2b9b\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-dtfcv" Apr 17 18:02:41.327763 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:41.327557 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7eb80b67-9498-49e1-880a-bb09dffa2b9b-lib-modules\") pod \"perf-node-gather-daemonset-dtfcv\" (UID: \"7eb80b67-9498-49e1-880a-bb09dffa2b9b\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-dtfcv" Apr 17 18:02:41.327763 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:41.327575 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7eb80b67-9498-49e1-880a-bb09dffa2b9b-podres\") pod \"perf-node-gather-daemonset-dtfcv\" (UID: \"7eb80b67-9498-49e1-880a-bb09dffa2b9b\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-dtfcv" Apr 17 18:02:41.327763 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:41.327641 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szhc2\" (UniqueName: \"kubernetes.io/projected/7eb80b67-9498-49e1-880a-bb09dffa2b9b-kube-api-access-szhc2\") pod \"perf-node-gather-daemonset-dtfcv\" (UID: \"7eb80b67-9498-49e1-880a-bb09dffa2b9b\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-dtfcv" Apr 17 18:02:41.428791 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:41.428736 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7eb80b67-9498-49e1-880a-bb09dffa2b9b-sys\") pod \"perf-node-gather-daemonset-dtfcv\" (UID: \"7eb80b67-9498-49e1-880a-bb09dffa2b9b\") " 
pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-dtfcv" Apr 17 18:02:41.428791 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:41.428782 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7eb80b67-9498-49e1-880a-bb09dffa2b9b-proc\") pod \"perf-node-gather-daemonset-dtfcv\" (UID: \"7eb80b67-9498-49e1-880a-bb09dffa2b9b\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-dtfcv" Apr 17 18:02:41.429060 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:41.428820 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7eb80b67-9498-49e1-880a-bb09dffa2b9b-lib-modules\") pod \"perf-node-gather-daemonset-dtfcv\" (UID: \"7eb80b67-9498-49e1-880a-bb09dffa2b9b\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-dtfcv" Apr 17 18:02:41.429060 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:41.428837 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7eb80b67-9498-49e1-880a-bb09dffa2b9b-podres\") pod \"perf-node-gather-daemonset-dtfcv\" (UID: \"7eb80b67-9498-49e1-880a-bb09dffa2b9b\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-dtfcv" Apr 17 18:02:41.429060 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:41.428868 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7eb80b67-9498-49e1-880a-bb09dffa2b9b-sys\") pod \"perf-node-gather-daemonset-dtfcv\" (UID: \"7eb80b67-9498-49e1-880a-bb09dffa2b9b\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-dtfcv" Apr 17 18:02:41.429060 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:41.428884 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-szhc2\" (UniqueName: \"kubernetes.io/projected/7eb80b67-9498-49e1-880a-bb09dffa2b9b-kube-api-access-szhc2\") pod \"perf-node-gather-daemonset-dtfcv\" (UID: \"7eb80b67-9498-49e1-880a-bb09dffa2b9b\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-dtfcv" Apr 17 18:02:41.429060 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:41.428934 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7eb80b67-9498-49e1-880a-bb09dffa2b9b-proc\") pod \"perf-node-gather-daemonset-dtfcv\" (UID: \"7eb80b67-9498-49e1-880a-bb09dffa2b9b\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-dtfcv" Apr 17 18:02:41.429060 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:41.428999 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7eb80b67-9498-49e1-880a-bb09dffa2b9b-podres\") pod \"perf-node-gather-daemonset-dtfcv\" (UID: \"7eb80b67-9498-49e1-880a-bb09dffa2b9b\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-dtfcv" Apr 17 18:02:41.429060 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:41.429004 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7eb80b67-9498-49e1-880a-bb09dffa2b9b-lib-modules\") pod \"perf-node-gather-daemonset-dtfcv\" (UID: \"7eb80b67-9498-49e1-880a-bb09dffa2b9b\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-dtfcv" Apr 17 18:02:41.438186 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:41.438149 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-szhc2\" (UniqueName: \"kubernetes.io/projected/7eb80b67-9498-49e1-880a-bb09dffa2b9b-kube-api-access-szhc2\") pod \"perf-node-gather-daemonset-dtfcv\" (UID: \"7eb80b67-9498-49e1-880a-bb09dffa2b9b\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-dtfcv" Apr 17 18:02:41.554751 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:41.554646 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-dtfcv" Apr 17 18:02:41.691539 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:41.691504 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8rk54/perf-node-gather-daemonset-dtfcv"] Apr 17 18:02:41.695232 ip-10-0-139-96 kubenswrapper[2577]: W0417 18:02:41.695197 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7eb80b67_9498_49e1_880a_bb09dffa2b9b.slice/crio-141d9f9df6a2b07def18f7cbc74213bf487ea5b846785caea81f2b0cafeade72 WatchSource:0}: Error finding container 141d9f9df6a2b07def18f7cbc74213bf487ea5b846785caea81f2b0cafeade72: Status 404 returned error can't find the container with id 141d9f9df6a2b07def18f7cbc74213bf487ea5b846785caea81f2b0cafeade72 Apr 17 18:02:41.696898 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:41.696877 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 18:02:42.570405 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:42.570354 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-dtfcv" event={"ID":"7eb80b67-9498-49e1-880a-bb09dffa2b9b","Type":"ContainerStarted","Data":"913631d857f5b54c970e0a692f29a032bdd243319f1e18a63fed88e46cec3000"} Apr 17 18:02:42.570912 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:42.570423 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-dtfcv" event={"ID":"7eb80b67-9498-49e1-880a-bb09dffa2b9b","Type":"ContainerStarted","Data":"141d9f9df6a2b07def18f7cbc74213bf487ea5b846785caea81f2b0cafeade72"} Apr 17 18:02:42.571510 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:42.571460 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-dtfcv" Apr 17 18:02:42.589917 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:42.589852 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-dtfcv" podStartSLOduration=1.5898371519999999 podStartE2EDuration="1.589837152s" podCreationTimestamp="2026-04-17 18:02:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:02:42.587805808 +0000 UTC m=+2180.309748807" watchObservedRunningTime="2026-04-17 18:02:42.589837152 +0000 UTC m=+2180.311780196" Apr 17 18:02:43.692943 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:43.692916 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-55f58_9bd2cdcc-c4b5-446f-8f64-6c123730399d/dns/0.log" Apr 17 18:02:43.723508 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:43.723484 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-55f58_9bd2cdcc-c4b5-446f-8f64-6c123730399d/kube-rbac-proxy/0.log" Apr 17 18:02:43.869592 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:43.869562 2577 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-dns_node-resolver-bjxzb_ca904b14-b665-4107-bf21-c1783df952e4/dns-node-resolver/0.log" Apr 17 18:02:44.434234 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:44.434202 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-5b45f68c89-wvxkd_d1f1ce73-bdd1-4cc9-8de9-b45fbaf27ce9/registry/0.log" Apr 17 18:02:44.461117 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:44.461091 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-gb2kr_255274fc-6f71-45da-a2f9-c715044eee61/node-ca/0.log" Apr 17 18:02:45.582824 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:45.582774 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-26jpl_123dd202-8527-42e2-84da-fd93872dfcb8/discovery/0.log" Apr 17 18:02:45.632661 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:45.632624 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-598f578945-tdhf2_b61e46df-1ee4-4a82-bc7d-f1596bd447d5/kube-auth-proxy/0.log" Apr 17 18:02:45.758904 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:45.758861 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7dfb49b74-jlm7c_7ec34672-cf7b-48a2-a580-01d0d51e08b1/router/0.log" Apr 17 18:02:46.330145 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:46.330113 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9lmqr_802564e4-cdb1-4a5c-80f9-814bd584caa0/serve-healthcheck-canary/0.log" Apr 17 18:02:47.097486 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:47.097443 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dqmbb_021cf46a-9b84-480a-acfc-b41c0da1ca7a/kube-rbac-proxy/0.log" Apr 17 18:02:47.128631 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:47.128595 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dqmbb_021cf46a-9b84-480a-acfc-b41c0da1ca7a/exporter/0.log" Apr 17 18:02:47.159066 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:47.159030 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dqmbb_021cf46a-9b84-480a-acfc-b41c0da1ca7a/extractor/0.log" Apr 17 18:02:49.180371 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:49.180344 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-dgnmg_b76aad08-c177-4c20-a889-5f4a1a011654/manager/0.log" Apr 17 18:02:49.217528 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:49.217428 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-5987f9d78-75gk6_a39964b6-f1a0-4f19-ba7e-102c6db18d0d/maas-api/0.log" Apr 17 18:02:49.269564 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:49.269527 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-d6bc59649-92ck8_2182b691-66b0-45db-9286-8fad7c871a7a/manager/0.log" Apr 17 18:02:49.293067 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:49.293032 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-dx9jc_2fc59a50-5d8d-473f-ac53-ca09910df3f7/manager/1.log" Apr 17 18:02:49.305524 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:49.305492 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-dx9jc_2fc59a50-5d8d-473f-ac53-ca09910df3f7/manager/2.log" Apr 17 18:02:49.363983 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:49.363949 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-77fb85d776-fqdsg_9d5cd8cf-986d-4573-bd01-25c6a1932fd2/manager/0.log" Apr 17 18:02:49.587260 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:49.587229 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-dtfcv" Apr 17 18:02:57.311661 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:57.311625 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pdpfh_355b3a4d-4123-4e80-a76f-e42bcfb92020/kube-multus-additional-cni-plugins/0.log" Apr 17 18:02:57.336143 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:57.336110 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pdpfh_355b3a4d-4123-4e80-a76f-e42bcfb92020/egress-router-binary-copy/0.log" Apr 17 18:02:57.359096 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:57.359067 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pdpfh_355b3a4d-4123-4e80-a76f-e42bcfb92020/cni-plugins/0.log" Apr 17 18:02:57.382602 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:57.382573 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pdpfh_355b3a4d-4123-4e80-a76f-e42bcfb92020/bond-cni-plugin/0.log" Apr 17 18:02:57.407490 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:57.407445 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pdpfh_355b3a4d-4123-4e80-a76f-e42bcfb92020/routeoverride-cni/0.log" Apr 17 18:02:57.430743 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:57.430716 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pdpfh_355b3a4d-4123-4e80-a76f-e42bcfb92020/whereabouts-cni-bincopy/0.log" Apr 17 18:02:57.455060 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:57.455023 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pdpfh_355b3a4d-4123-4e80-a76f-e42bcfb92020/whereabouts-cni/0.log" Apr 17 18:02:57.947094 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:57.947058 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rks6d_c447c5b4-4c37-4d50-8c77-2633c36d977d/kube-multus/0.log" Apr 17 18:02:58.059499 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:58.059459 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-p9f9z_bcb4d874-10b6-4167-b452-800ed19b3f79/network-metrics-daemon/0.log" Apr 17 18:02:58.079596 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:58.079559 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-p9f9z_bcb4d874-10b6-4167-b452-800ed19b3f79/kube-rbac-proxy/0.log" Apr 17 18:02:58.954923 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:58.954889 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rjgx_49a176d5-a780-4a38-b16f-90dc62742d5d/ovn-controller/0.log" Apr 17 18:02:58.980317 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:58.980282 2577 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rjgx_49a176d5-a780-4a38-b16f-90dc62742d5d/ovn-acl-logging/0.log" Apr 17 18:02:58.991522 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:58.991492 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rjgx_49a176d5-a780-4a38-b16f-90dc62742d5d/ovn-acl-logging/1.log" Apr 17 18:02:59.012800 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:59.012754 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rjgx_49a176d5-a780-4a38-b16f-90dc62742d5d/kube-rbac-proxy-node/0.log" Apr 17 18:02:59.040313 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:59.040280 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rjgx_49a176d5-a780-4a38-b16f-90dc62742d5d/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 18:02:59.057864 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:59.057837 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rjgx_49a176d5-a780-4a38-b16f-90dc62742d5d/northd/0.log" Apr 17 18:02:59.078710 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:59.078685 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rjgx_49a176d5-a780-4a38-b16f-90dc62742d5d/nbdb/0.log" Apr 17 18:02:59.101271 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:59.101248 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rjgx_49a176d5-a780-4a38-b16f-90dc62742d5d/sbdb/0.log" Apr 17 18:02:59.220696 ip-10-0-139-96 kubenswrapper[2577]: I0417 18:02:59.220591 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8rjgx_49a176d5-a780-4a38-b16f-90dc62742d5d/ovnkube-controller/0.log"